Apr 16 00:23:12.924995 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Apr 16 00:23:12.925021 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Wed Apr 15 22:32:48 -00 2026
Apr 16 00:23:12.925031 kernel: KASLR enabled
Apr 16 00:23:12.925037 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Apr 16 00:23:12.925043 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x138595418 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Apr 16 00:23:12.925048 kernel: random: crng init done
Apr 16 00:23:12.925055 kernel: ACPI: Early table checksum verification disabled
Apr 16 00:23:12.925061 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Apr 16 00:23:12.925068 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Apr 16 00:23:12.925075 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925081 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925087 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925093 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925100 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925108 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925116 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925122 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925129 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:23:12.925136 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Apr 16 00:23:12.925142 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Apr 16 00:23:12.925149 kernel: NUMA: Failed to initialise from firmware
Apr 16 00:23:12.925155 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Apr 16 00:23:12.925162 kernel: NUMA: NODE_DATA [mem 0x139671800-0x139676fff]
Apr 16 00:23:12.925168 kernel: Zone ranges:
Apr 16 00:23:12.925174 kernel:   DMA      [mem 0x0000000040000000-0x00000000ffffffff]
Apr 16 00:23:12.925182 kernel:   DMA32    empty
Apr 16 00:23:12.925188 kernel:   Normal   [mem 0x0000000100000000-0x0000000139ffffff]
Apr 16 00:23:12.925195 kernel: Movable zone start for each node
Apr 16 00:23:12.925201 kernel: Early memory node ranges
Apr 16 00:23:12.925208 kernel:   node   0: [mem 0x0000000040000000-0x000000013676ffff]
Apr 16 00:23:12.925214 kernel:   node   0: [mem 0x0000000136770000-0x0000000136b3ffff]
Apr 16 00:23:12.925220 kernel:   node   0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Apr 16 00:23:12.925227 kernel:   node   0: [mem 0x0000000139e20000-0x0000000139eaffff]
Apr 16 00:23:12.925233 kernel:   node   0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Apr 16 00:23:12.925240 kernel:   node   0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Apr 16 00:23:12.925246 kernel:   node   0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Apr 16 00:23:12.925252 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Apr 16 00:23:12.925260 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Apr 16 00:23:12.925288 kernel: psci: probing for conduit method from ACPI.
Apr 16 00:23:12.925295 kernel: psci: PSCIv1.1 detected in firmware.
Apr 16 00:23:12.925306 kernel: psci: Using standard PSCI v0.2 function IDs
Apr 16 00:23:12.925312 kernel: psci: Trusted OS migration not required
Apr 16 00:23:12.925320 kernel: psci: SMC Calling Convention v1.1
Apr 16 00:23:12.925328 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Apr 16 00:23:12.925335 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Apr 16 00:23:12.925342 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Apr 16 00:23:12.925350 kernel: pcpu-alloc: [0] 0 [0] 1
Apr 16 00:23:12.925356 kernel: Detected PIPT I-cache on CPU0
Apr 16 00:23:12.925363 kernel: CPU features: detected: GIC system register CPU interface
Apr 16 00:23:12.925370 kernel: CPU features: detected: Hardware dirty bit management
Apr 16 00:23:12.925377 kernel: CPU features: detected: Spectre-v4
Apr 16 00:23:12.925383 kernel: CPU features: detected: Spectre-BHB
Apr 16 00:23:12.925390 kernel: CPU features: kernel page table isolation forced ON by KASLR
Apr 16 00:23:12.925398 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Apr 16 00:23:12.925405 kernel: CPU features: detected: ARM erratum 1418040
Apr 16 00:23:12.925412 kernel: CPU features: detected: SSBS not fully self-synchronizing
Apr 16 00:23:12.925419 kernel: alternatives: applying boot alternatives
Apr 16 00:23:12.925427 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=0adf63447ce845e6a0056fdc0e76e619192ad10bb115f878c5a0d78c1b8c220d
Apr 16 00:23:12.925434 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Apr 16 00:23:12.925441 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Apr 16 00:23:12.925447 kernel: Fallback order for Node 0: 0
Apr 16 00:23:12.925454 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Apr 16 00:23:12.925461 kernel: Policy zone: Normal
Apr 16 00:23:12.925467 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 16 00:23:12.925476 kernel: software IO TLB: area num 2.
Apr 16 00:23:12.925483 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Apr 16 00:23:12.925490 kernel: Memory: 3882824K/4096000K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 213176K reserved, 0K cma-reserved)
Apr 16 00:23:12.925497 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Apr 16 00:23:12.925504 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 16 00:23:12.925512 kernel: rcu: RCU event tracing is enabled.
Apr 16 00:23:12.925519 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Apr 16 00:23:12.925525 kernel: Trampoline variant of Tasks RCU enabled.
Apr 16 00:23:12.925532 kernel: Tracing variant of Tasks RCU enabled.
Apr 16 00:23:12.925539 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 16 00:23:12.925546 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Apr 16 00:23:12.925553 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Apr 16 00:23:12.925561 kernel: GICv3: 256 SPIs implemented
Apr 16 00:23:12.925606 kernel: GICv3: 0 Extended SPIs implemented
Apr 16 00:23:12.925613 kernel: Root IRQ handler: gic_handle_irq
Apr 16 00:23:12.925619 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Apr 16 00:23:12.925626 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Apr 16 00:23:12.925633 kernel: ITS [mem 0x08080000-0x0809ffff]
Apr 16 00:23:12.925640 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Apr 16 00:23:12.925647 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Apr 16 00:23:12.925654 kernel: GICv3: using LPI property table @0x00000001000e0000
Apr 16 00:23:12.925661 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Apr 16 00:23:12.925668 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 16 00:23:12.925677 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 00:23:12.925685 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Apr 16 00:23:12.925692 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Apr 16 00:23:12.925699 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Apr 16 00:23:12.925706 kernel: Console: colour dummy device 80x25
Apr 16 00:23:12.925713 kernel: ACPI: Core revision 20230628
Apr 16 00:23:12.925720 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Apr 16 00:23:12.925727 kernel: pid_max: default: 32768 minimum: 301
Apr 16 00:23:12.925734 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Apr 16 00:23:12.925741 kernel: landlock: Up and running.
Apr 16 00:23:12.925750 kernel: SELinux:  Initializing.
Apr 16 00:23:12.925757 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 16 00:23:12.925764 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 16 00:23:12.925771 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 16 00:23:12.925778 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 16 00:23:12.925785 kernel: rcu: Hierarchical SRCU implementation.
Apr 16 00:23:12.925792 kernel: rcu: 	Max phase no-delay instances is 400.
Apr 16 00:23:12.925799 kernel: Platform MSI: ITS@0x8080000 domain created
Apr 16 00:23:12.925807 kernel: PCI/MSI: ITS@0x8080000 domain created
Apr 16 00:23:12.925815 kernel: Remapping and enabling EFI services.
Apr 16 00:23:12.925823 kernel: smp: Bringing up secondary CPUs ...
Apr 16 00:23:12.925830 kernel: Detected PIPT I-cache on CPU1
Apr 16 00:23:12.925837 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Apr 16 00:23:12.925845 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Apr 16 00:23:12.925852 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 00:23:12.925859 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Apr 16 00:23:12.925866 kernel: smp: Brought up 1 node, 2 CPUs
Apr 16 00:23:12.925873 kernel: SMP: Total of 2 processors activated.
Apr 16 00:23:12.925880 kernel: CPU features: detected: 32-bit EL0 Support
Apr 16 00:23:12.925888 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Apr 16 00:23:12.925896 kernel: CPU features: detected: Common not Private translations
Apr 16 00:23:12.925908 kernel: CPU features: detected: CRC32 instructions
Apr 16 00:23:12.925917 kernel: CPU features: detected: Enhanced Virtualization Traps
Apr 16 00:23:12.925924 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Apr 16 00:23:12.925931 kernel: CPU features: detected: LSE atomic instructions
Apr 16 00:23:12.925939 kernel: CPU features: detected: Privileged Access Never
Apr 16 00:23:12.925946 kernel: CPU features: detected: RAS Extension Support
Apr 16 00:23:12.925955 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Apr 16 00:23:12.925963 kernel: CPU: All CPU(s) started at EL1
Apr 16 00:23:12.925970 kernel: alternatives: applying system-wide alternatives
Apr 16 00:23:12.925977 kernel: devtmpfs: initialized
Apr 16 00:23:12.925985 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 16 00:23:12.925993 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Apr 16 00:23:12.926000 kernel: pinctrl core: initialized pinctrl subsystem
Apr 16 00:23:12.926007 kernel: SMBIOS 3.0.0 present.
Apr 16 00:23:12.926017 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Apr 16 00:23:12.926025 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 16 00:23:12.926033 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Apr 16 00:23:12.926040 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Apr 16 00:23:12.926048 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Apr 16 00:23:12.926055 kernel: audit: initializing netlink subsys (disabled)
Apr 16 00:23:12.926063 kernel: audit: type=2000 audit(0.015:1): state=initialized audit_enabled=0 res=1
Apr 16 00:23:12.926071 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 16 00:23:12.926078 kernel: cpuidle: using governor menu
Apr 16 00:23:12.926087 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Apr 16 00:23:12.926095 kernel: ASID allocator initialised with 32768 entries
Apr 16 00:23:12.926103 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 16 00:23:12.926110 kernel: Serial: AMBA PL011 UART driver
Apr 16 00:23:12.926117 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Apr 16 00:23:12.926124 kernel: Modules: 0 pages in range for non-PLT usage
Apr 16 00:23:12.926132 kernel: Modules: 509008 pages in range for PLT usage
Apr 16 00:23:12.926139 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 16 00:23:12.926146 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Apr 16 00:23:12.926155 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Apr 16 00:23:12.926163 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Apr 16 00:23:12.926170 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 16 00:23:12.926178 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Apr 16 00:23:12.926185 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Apr 16 00:23:12.926193 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Apr 16 00:23:12.926200 kernel: ACPI: Added _OSI(Module Device)
Apr 16 00:23:12.926207 kernel: ACPI: Added _OSI(Processor Device)
Apr 16 00:23:12.926215 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 16 00:23:12.926224 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 16 00:23:12.926232 kernel: ACPI: Interpreter enabled
Apr 16 00:23:12.926239 kernel: ACPI: Using GIC for interrupt routing
Apr 16 00:23:12.926247 kernel: ACPI: MCFG table detected, 1 entries
Apr 16 00:23:12.926254 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Apr 16 00:23:12.926261 kernel: printk: console [ttyAMA0] enabled
Apr 16 00:23:12.926317 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 16 00:23:12.926489 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Apr 16 00:23:12.926587 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Apr 16 00:23:12.926661 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Apr 16 00:23:12.926728 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Apr 16 00:23:12.926791 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Apr 16 00:23:12.926801 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Apr 16 00:23:12.926808 kernel: PCI host bridge to bus 0000:00
Apr 16 00:23:12.926883 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Apr 16 00:23:12.926948 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Apr 16 00:23:12.927007 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Apr 16 00:23:12.927064 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 16 00:23:12.927155 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Apr 16 00:23:12.927235 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Apr 16 00:23:12.927324 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Apr 16 00:23:12.927392 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Apr 16 00:23:12.927473 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.927540 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Apr 16 00:23:12.927628 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.927699 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Apr 16 00:23:12.927773 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.927841 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Apr 16 00:23:12.927923 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.927990 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Apr 16 00:23:12.928064 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.928134 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Apr 16 00:23:12.928208 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.928298 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Apr 16 00:23:12.928382 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.928450 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Apr 16 00:23:12.928522 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.928637 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Apr 16 00:23:12.928716 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Apr 16 00:23:12.928783 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Apr 16 00:23:12.928864 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Apr 16 00:23:12.928931 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Apr 16 00:23:12.929009 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Apr 16 00:23:12.929110 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Apr 16 00:23:12.929179 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 16 00:23:12.929246 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 16 00:23:12.929343 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Apr 16 00:23:12.929418 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Apr 16 00:23:12.929496 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Apr 16 00:23:12.929581 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Apr 16 00:23:12.929663 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Apr 16 00:23:12.929742 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Apr 16 00:23:12.931389 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Apr 16 00:23:12.931525 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Apr 16 00:23:12.931626 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
Apr 16 00:23:12.931701 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Apr 16 00:23:12.931787 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Apr 16 00:23:12.931858 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Apr 16 00:23:12.931926 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 16 00:23:12.932008 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Apr 16 00:23:12.932078 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Apr 16 00:23:12.932147 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Apr 16 00:23:12.932215 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 16 00:23:12.932321 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Apr 16 00:23:12.932421 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Apr 16 00:23:12.932506 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Apr 16 00:23:12.932640 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Apr 16 00:23:12.932715 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Apr 16 00:23:12.932782 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Apr 16 00:23:12.932853 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Apr 16 00:23:12.932920 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Apr 16 00:23:12.932985 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Apr 16 00:23:12.933054 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Apr 16 00:23:12.933141 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Apr 16 00:23:12.933219 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Apr 16 00:23:12.933410 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Apr 16 00:23:12.933482 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Apr 16 00:23:12.933545 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Apr 16 00:23:12.933637 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Apr 16 00:23:12.933703 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Apr 16 00:23:12.933769 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Apr 16 00:23:12.933854 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Apr 16 00:23:12.933918 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Apr 16 00:23:12.933981 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Apr 16 00:23:12.934048 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Apr 16 00:23:12.934113 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Apr 16 00:23:12.934177 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Apr 16 00:23:12.934251 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Apr 16 00:23:12.936448 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Apr 16 00:23:12.936541 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Apr 16 00:23:12.936637 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Apr 16 00:23:12.936710 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 16 00:23:12.936780 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Apr 16 00:23:12.936848 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 16 00:23:12.936918 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Apr 16 00:23:12.936996 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 16 00:23:12.937065 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Apr 16 00:23:12.937133 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 16 00:23:12.937202 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Apr 16 00:23:12.937347 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 16 00:23:12.937430 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Apr 16 00:23:12.937497 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 16 00:23:12.937624 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Apr 16 00:23:12.937708 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 16 00:23:12.937778 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Apr 16 00:23:12.937843 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 16 00:23:12.937909 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Apr 16 00:23:12.937974 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 16 00:23:12.938042 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Apr 16 00:23:12.938115 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Apr 16 00:23:12.938180 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Apr 16 00:23:12.938244 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Apr 16 00:23:12.938326 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Apr 16 00:23:12.938395 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Apr 16 00:23:12.938462 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Apr 16 00:23:12.938530 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Apr 16 00:23:12.938630 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Apr 16 00:23:12.938708 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Apr 16 00:23:12.938776 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Apr 16 00:23:12.938843 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Apr 16 00:23:12.938910 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Apr 16 00:23:12.938974 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Apr 16 00:23:12.939040 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Apr 16 00:23:12.939107 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Apr 16 00:23:12.939174 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Apr 16 00:23:12.939244 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Apr 16 00:23:12.940794 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Apr 16 00:23:12.940883 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Apr 16 00:23:12.940958 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Apr 16 00:23:12.941036 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Apr 16 00:23:12.941105 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 16 00:23:12.941172 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Apr 16 00:23:12.941244 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Apr 16 00:23:12.941416 kernel: pci 0000:00:02.0:   bridge window [io 0x1000-0x1fff]
Apr 16 00:23:12.941486 kernel: pci 0000:00:02.0:   bridge window [mem 0x10000000-0x101fffff]
Apr 16 00:23:12.941550 kernel: pci 0000:00:02.0:   bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 16 00:23:12.941650 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Apr 16 00:23:12.941729 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Apr 16 00:23:12.941793 kernel: pci 0000:00:02.1:   bridge window [io 0x2000-0x2fff]
Apr 16 00:23:12.941858 kernel: pci 0000:00:02.1:   bridge window [mem 0x10200000-0x103fffff]
Apr 16 00:23:12.941921 kernel: pci 0000:00:02.1:   bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 16 00:23:12.941993 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 16 00:23:12.942063 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Apr 16 00:23:12.942132 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Apr 16 00:23:12.942197 kernel: pci 0000:00:02.2:   bridge window [io 0x3000-0x3fff]
Apr 16 00:23:12.942299 kernel: pci 0000:00:02.2:   bridge window [mem 0x10400000-0x105fffff]
Apr 16 00:23:12.942365 kernel: pci 0000:00:02.2:   bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 16 00:23:12.942438 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Apr 16 00:23:12.942506 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Apr 16 00:23:12.942617 kernel: pci 0000:00:02.3:   bridge window [io 0x4000-0x4fff]
Apr 16 00:23:12.942694 kernel: pci 0000:00:02.3:   bridge window [mem 0x10600000-0x107fffff]
Apr 16 00:23:12.942760 kernel: pci 0000:00:02.3:   bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 16 00:23:12.942834 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Apr 16 00:23:12.942909 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
Apr 16 00:23:12.942977 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Apr 16 00:23:12.943041 kernel: pci 0000:00:02.4:   bridge window [io 0x5000-0x5fff]
Apr 16 00:23:12.943105 kernel: pci 0000:00:02.4:   bridge window [mem 0x10800000-0x109fffff]
Apr 16 00:23:12.943169 kernel: pci 0000:00:02.4:   bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 16 00:23:12.943242 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Apr 16 00:23:12.943326 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Apr 16 00:23:12.943395 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Apr 16 00:23:12.943465 kernel: pci 0000:00:02.5:   bridge window [io 0x6000-0x6fff]
Apr 16 00:23:12.943530 kernel: pci 0000:00:02.5:   bridge window [mem 0x10a00000-0x10bfffff]
Apr 16 00:23:12.943611 kernel: pci 0000:00:02.5:   bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 16 00:23:12.943687 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Apr 16 00:23:12.943756 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Apr 16 00:23:12.943825 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Apr 16 00:23:12.943893 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Apr 16 00:23:12.943959 kernel: pci 0000:00:02.6:   bridge window [io 0x7000-0x7fff]
Apr 16 00:23:12.944028 kernel: pci 0000:00:02.6:   bridge window [mem 0x10c00000-0x10dfffff]
Apr 16 00:23:12.944093 kernel: pci 0000:00:02.6:   bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 16 00:23:12.944162 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Apr 16 00:23:12.944228 kernel: pci 0000:00:02.7:   bridge window [io 0x8000-0x8fff]
Apr 16 00:23:12.944776 kernel: pci 0000:00:02.7:   bridge window [mem 0x10e00000-0x10ffffff]
Apr 16 00:23:12.944862 kernel: pci 0000:00:02.7:   bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 16 00:23:12.944931 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Apr 16 00:23:12.944997 kernel: pci 0000:00:03.0:   bridge window [io 0x9000-0x9fff]
Apr 16 00:23:12.945070 kernel: pci 0000:00:03.0:   bridge window [mem 0x11000000-0x111fffff]
Apr 16 00:23:12.945134 kernel: pci 0000:00:03.0:   bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 16 00:23:12.945204 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Apr 16 00:23:12.945262 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Apr 16 00:23:12.945353 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Apr 16 00:23:12.945429 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Apr 16 00:23:12.945492 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Apr 16 00:23:12.945559 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 16 00:23:12.945668 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Apr 16 00:23:12.945730 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Apr 16 00:23:12.945790 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 16 00:23:12.945860 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Apr 16 00:23:12.945922 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Apr 16 00:23:12.945987 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 16 00:23:12.946058 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Apr 16 00:23:12.946119 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Apr 16 00:23:12.946196 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 16 00:23:12.946265 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Apr 16 00:23:12.947051 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Apr 16 00:23:12.947115 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 16 00:23:12.947192 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Apr 16 00:23:12.947253 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Apr 16 00:23:12.947406 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 16 00:23:12.947510 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Apr 16 00:23:12.947628 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Apr 16 00:23:12.947696 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 16 00:23:12.947765 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Apr 16 00:23:12.947826 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Apr 16 00:23:12.947888 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 16 00:23:12.947958 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Apr 16 00:23:12.948020 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Apr 16 00:23:12.948086 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 16 00:23:12.948096 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Apr 16 00:23:12.948104 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Apr 16 00:23:12.948113 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Apr 16 00:23:12.948120 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Apr 16 00:23:12.948128 kernel: iommu: Default domain type: Translated
Apr 16 00:23:12.948136 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Apr 16 00:23:12.948144 kernel: efivars: Registered efivars operations
Apr 16 00:23:12.948154 kernel: vgaarb: loaded
Apr 16 00:23:12.948162 kernel: clocksource: Switched to clocksource arch_sys_counter
Apr 16 00:23:12.948170 kernel: VFS: Disk quotas dquot_6.6.0
Apr 16 00:23:12.948178 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 16 00:23:12.948185 kernel: pnp: PnP ACPI init
Apr 16 00:23:12.948263 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Apr 16 00:23:12.948542 kernel: pnp: PnP ACPI: found 1 devices
Apr 16 00:23:12.948551 kernel: NET: Registered PF_INET protocol family
Apr 16 00:23:12.948559 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Apr 16 00:23:12.948590 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Apr 16 00:23:12.948598 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 16 00:23:12.948606 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 16 00:23:12.948614
kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Apr 16 00:23:12.948622 kernel: TCP: Hash tables configured (established 32768 bind 32768) Apr 16 00:23:12.948630 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Apr 16 00:23:12.948638 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Apr 16 00:23:12.948645 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Apr 16 00:23:12.948757 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Apr 16 00:23:12.948773 kernel: PCI: CLS 0 bytes, default 64 Apr 16 00:23:12.948781 kernel: kvm [1]: HYP mode not available Apr 16 00:23:12.948788 kernel: Initialise system trusted keyrings Apr 16 00:23:12.948796 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Apr 16 00:23:12.948804 kernel: Key type asymmetric registered Apr 16 00:23:12.948812 kernel: Asymmetric key parser 'x509' registered Apr 16 00:23:12.948820 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Apr 16 00:23:12.948827 kernel: io scheduler mq-deadline registered Apr 16 00:23:12.948835 kernel: io scheduler kyber registered Apr 16 00:23:12.948845 kernel: io scheduler bfq registered Apr 16 00:23:12.948853 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Apr 16 00:23:12.948924 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Apr 16 00:23:12.948990 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Apr 16 00:23:12.949055 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.949124 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Apr 16 00:23:12.949193 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Apr 16 00:23:12.949259 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.949366 kernel: pcieport 0000:00:02.2: 
PME: Signaling with IRQ 52 Apr 16 00:23:12.949435 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Apr 16 00:23:12.949500 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.949617 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Apr 16 00:23:12.949703 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Apr 16 00:23:12.949770 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.949842 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Apr 16 00:23:12.949909 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Apr 16 00:23:12.949974 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.950045 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Apr 16 00:23:12.950116 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Apr 16 00:23:12.950182 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.950251 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Apr 16 00:23:12.950382 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Apr 16 00:23:12.950450 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.950519 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Apr 16 00:23:12.950607 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Apr 16 00:23:12.950674 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:23:12.950686 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 
Apr 16 00:23:12.950754 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58
Apr 16 00:23:12.950820 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
Apr 16 00:23:12.950885 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Apr 16 00:23:12.950896 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Apr 16 00:23:12.950906 kernel: ACPI: button: Power Button [PWRB]
Apr 16 00:23:12.950915 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Apr 16 00:23:12.950990 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Apr 16 00:23:12.951064 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Apr 16 00:23:12.951075 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Apr 16 00:23:12.951083 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Apr 16 00:23:12.951152 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
Apr 16 00:23:12.951163 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
Apr 16 00:23:12.951171 kernel: thunder_xcv, ver 1.0
Apr 16 00:23:12.951181 kernel: thunder_bgx, ver 1.0
Apr 16 00:23:12.951189 kernel: nicpf, ver 1.0
Apr 16 00:23:12.951197 kernel: nicvf, ver 1.0
Apr 16 00:23:12.951291 kernel: rtc-efi rtc-efi.0: registered as rtc0
Apr 16 00:23:12.951361 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-04-16T00:23:12 UTC (1776298992)
Apr 16 00:23:12.951372 kernel: hid: raw HID events driver (C) Jiri Kosina
Apr 16 00:23:12.951380 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Apr 16 00:23:12.951388 kernel: watchdog: Delayed init of the lockup detector failed: -19
Apr 16 00:23:12.951399 kernel: watchdog: Hard watchdog permanently disabled
Apr 16 00:23:12.951407 kernel: NET: Registered PF_INET6 protocol family
Apr 16 00:23:12.951415 kernel: Segment Routing with IPv6
Apr 16 00:23:12.951423 kernel: In-situ OAM (IOAM) with IPv6
Apr 16 00:23:12.951430 kernel: NET: Registered PF_PACKET protocol family
Apr 16 00:23:12.951438 kernel: Key type dns_resolver registered
Apr 16 00:23:12.951446 kernel: registered taskstats version 1
Apr 16 00:23:12.951453 kernel: Loading compiled-in X.509 certificates
Apr 16 00:23:12.951462 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 42c6438655eac241afd498b973a7e22ad5b14a7d'
Apr 16 00:23:12.951471 kernel: Key type .fscrypt registered
Apr 16 00:23:12.951479 kernel: Key type fscrypt-provisioning registered
Apr 16 00:23:12.951487 kernel: ima: No TPM chip found, activating TPM-bypass!
Apr 16 00:23:12.951494 kernel: ima: Allocated hash algorithm: sha1
Apr 16 00:23:12.951502 kernel: ima: No architecture policies found
Apr 16 00:23:12.951510 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Apr 16 00:23:12.951518 kernel: clk: Disabling unused clocks
Apr 16 00:23:12.951526 kernel: Freeing unused kernel memory: 39424K
Apr 16 00:23:12.951534 kernel: Run /init as init process
Apr 16 00:23:12.951543 kernel: with arguments:
Apr 16 00:23:12.951551 kernel: /init
Apr 16 00:23:12.951559 kernel: with environment:
Apr 16 00:23:12.951578 kernel: HOME=/
Apr 16 00:23:12.951586 kernel: TERM=linux
Apr 16 00:23:12.951596 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 16 00:23:12.951607 systemd[1]: Detected virtualization kvm.
Apr 16 00:23:12.951615 systemd[1]: Detected architecture arm64.
Apr 16 00:23:12.951626 systemd[1]: Running in initrd.
Apr 16 00:23:12.951634 systemd[1]: No hostname configured, using default hostname.
Apr 16 00:23:12.951642 systemd[1]: Hostname set to .
Apr 16 00:23:12.951651 systemd[1]: Initializing machine ID from VM UUID.
Apr 16 00:23:12.951659 systemd[1]: Queued start job for default target initrd.target.
Apr 16 00:23:12.951667 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 00:23:12.951676 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 00:23:12.951685 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Apr 16 00:23:12.951695 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 16 00:23:12.951705 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Apr 16 00:23:12.951714 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Apr 16 00:23:12.951724 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Apr 16 00:23:12.951733 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Apr 16 00:23:12.951741 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 00:23:12.951750 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 16 00:23:12.951759 systemd[1]: Reached target paths.target - Path Units.
Apr 16 00:23:12.951768 systemd[1]: Reached target slices.target - Slice Units.
Apr 16 00:23:12.951776 systemd[1]: Reached target swap.target - Swaps.
Apr 16 00:23:12.951785 systemd[1]: Reached target timers.target - Timer Units.
Apr 16 00:23:12.951793 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Apr 16 00:23:12.951801 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 16 00:23:12.951810 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Apr 16 00:23:12.951818 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Apr 16 00:23:12.951828 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 00:23:12.951836 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 16 00:23:12.951845 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 00:23:12.951853 systemd[1]: Reached target sockets.target - Socket Units.
Apr 16 00:23:12.951861 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Apr 16 00:23:12.951870 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 16 00:23:12.951878 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Apr 16 00:23:12.951887 systemd[1]: Starting systemd-fsck-usr.service...
Apr 16 00:23:12.951895 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 16 00:23:12.951905 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 16 00:23:12.951914 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 16 00:23:12.951922 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Apr 16 00:23:12.951930 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 00:23:12.951939 systemd[1]: Finished systemd-fsck-usr.service.
Apr 16 00:23:12.951975 systemd-journald[238]: Collecting audit messages is disabled.
Apr 16 00:23:12.951998 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Apr 16 00:23:12.952008 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:23:12.952018 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Apr 16 00:23:12.952027 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 16 00:23:12.952037 kernel: Bridge firewalling registered
Apr 16 00:23:12.952045 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 16 00:23:12.952054 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 16 00:23:12.952065 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 16 00:23:12.952076 systemd-journald[238]: Journal started
Apr 16 00:23:12.952098 systemd-journald[238]: Runtime Journal (/run/log/journal/9d23e8afbd6748c2adba8d58253e23f5) is 8.0M, max 76.6M, 68.6M free.
Apr 16 00:23:12.912357 systemd-modules-load[239]: Inserted module 'overlay'
Apr 16 00:23:12.941276 systemd-modules-load[239]: Inserted module 'br_netfilter'
Apr 16 00:23:12.961660 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 16 00:23:12.961718 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 16 00:23:12.976529 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 16 00:23:12.980332 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 16 00:23:12.981358 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 16 00:23:12.982235 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 16 00:23:13.001529 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Apr 16 00:23:13.002883 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 00:23:13.006450 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 16 00:23:13.023984 dracut-cmdline[272]: dracut-dracut-053
Apr 16 00:23:13.031511 dracut-cmdline[272]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=0adf63447ce845e6a0056fdc0e76e619192ad10bb115f878c5a0d78c1b8c220d
Apr 16 00:23:13.046876 systemd-resolved[274]: Positive Trust Anchors:
Apr 16 00:23:13.046892 systemd-resolved[274]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 16 00:23:13.046930 systemd-resolved[274]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 16 00:23:13.051922 systemd-resolved[274]: Defaulting to hostname 'linux'.
Apr 16 00:23:13.053054 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 16 00:23:13.055176 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 16 00:23:13.129379 kernel: SCSI subsystem initialized
Apr 16 00:23:13.134316 kernel: Loading iSCSI transport class v2.0-870.
Apr 16 00:23:13.141327 kernel: iscsi: registered transport (tcp)
Apr 16 00:23:13.155358 kernel: iscsi: registered transport (qla4xxx)
Apr 16 00:23:13.155461 kernel: QLogic iSCSI HBA Driver
Apr 16 00:23:13.212133 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Apr 16 00:23:13.220617 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Apr 16 00:23:13.242299 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Apr 16 00:23:13.242375 kernel: device-mapper: uevent: version 1.0.3
Apr 16 00:23:13.243326 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Apr 16 00:23:13.295322 kernel: raid6: neonx8 gen() 15662 MB/s
Apr 16 00:23:13.312325 kernel: raid6: neonx4 gen() 15583 MB/s
Apr 16 00:23:13.329649 kernel: raid6: neonx2 gen() 13163 MB/s
Apr 16 00:23:13.346362 kernel: raid6: neonx1 gen() 10428 MB/s
Apr 16 00:23:13.363316 kernel: raid6: int64x8 gen() 6900 MB/s
Apr 16 00:23:13.380348 kernel: raid6: int64x4 gen() 7277 MB/s
Apr 16 00:23:13.397319 kernel: raid6: int64x2 gen() 6077 MB/s
Apr 16 00:23:13.414360 kernel: raid6: int64x1 gen() 5022 MB/s
Apr 16 00:23:13.414468 kernel: raid6: using algorithm neonx8 gen() 15662 MB/s
Apr 16 00:23:13.431334 kernel: raid6: .... xor() 11920 MB/s, rmw enabled
Apr 16 00:23:13.431426 kernel: raid6: using neon recovery algorithm
Apr 16 00:23:13.436333 kernel: xor: measuring software checksum speed
Apr 16 00:23:13.436414 kernel: 8regs : 19778 MB/sec
Apr 16 00:23:13.436437 kernel: 32regs : 17362 MB/sec
Apr 16 00:23:13.437328 kernel: arm64_neon : 26972 MB/sec
Apr 16 00:23:13.437387 kernel: xor: using function: arm64_neon (26972 MB/sec)
Apr 16 00:23:13.487317 kernel: Btrfs loaded, zoned=no, fsverity=no
Apr 16 00:23:13.502291 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Apr 16 00:23:13.509549 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 00:23:13.524224 systemd-udevd[456]: Using default interface naming scheme 'v255'.
Apr 16 00:23:13.527767 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 00:23:13.536552 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Apr 16 00:23:13.556285 dracut-pre-trigger[460]: rd.md=0: removing MD RAID activation
Apr 16 00:23:13.598010 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 16 00:23:13.608586 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 16 00:23:13.660290 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 16 00:23:13.668145 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Apr 16 00:23:13.696060 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Apr 16 00:23:13.698888 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 16 00:23:13.701550 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 00:23:13.702910 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 16 00:23:13.709516 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Apr 16 00:23:13.728527 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Apr 16 00:23:13.762543 kernel: scsi host0: Virtio SCSI HBA
Apr 16 00:23:13.769873 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Apr 16 00:23:13.769961 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Apr 16 00:23:13.800294 kernel: ACPI: bus type USB registered
Apr 16 00:23:13.800359 kernel: usbcore: registered new interface driver usbfs
Apr 16 00:23:13.801835 kernel: usbcore: registered new interface driver hub
Apr 16 00:23:13.801874 kernel: usbcore: registered new device driver usb
Apr 16 00:23:13.807207 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 16 00:23:13.808945 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 16 00:23:13.811862 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 16 00:23:13.813674 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 16 00:23:13.813831 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:23:13.817202 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Apr 16 00:23:13.822292 kernel: sr 0:0:0:0: Power-on or device reset occurred
Apr 16 00:23:13.822516 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Apr 16 00:23:13.822634 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Apr 16 00:23:13.823787 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Apr 16 00:23:13.823984 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Apr 16 00:23:13.824080 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Apr 16 00:23:13.825339 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Apr 16 00:23:13.827699 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Apr 16 00:23:13.827897 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Apr 16 00:23:13.827997 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Apr 16 00:23:13.829883 kernel: hub 1-0:1.0: USB hub found
Apr 16 00:23:13.830559 kernel: hub 1-0:1.0: 4 ports detected
Apr 16 00:23:13.830702 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Apr 16 00:23:13.830809 kernel: hub 2-0:1.0: USB hub found
Apr 16 00:23:13.831147 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 16 00:23:13.832940 kernel: hub 2-0:1.0: 4 ports detected
Apr 16 00:23:13.854354 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:23:13.859332 kernel: sd 0:0:0:1: Power-on or device reset occurred
Apr 16 00:23:13.859604 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Apr 16 00:23:13.859717 kernel: sd 0:0:0:1: [sda] Write Protect is off
Apr 16 00:23:13.859813 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Apr 16 00:23:13.859895 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Apr 16 00:23:13.863758 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Apr 16 00:23:13.863806 kernel: GPT:17805311 != 80003071
Apr 16 00:23:13.863818 kernel: GPT:Alternate GPT header not at the end of the disk.
Apr 16 00:23:13.863827 kernel: GPT:17805311 != 80003071
Apr 16 00:23:13.863844 kernel: GPT: Use GNU Parted to correct GPT errors.
Apr 16 00:23:13.863853 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 16 00:23:13.864603 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Apr 16 00:23:13.871892 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 16 00:23:13.892489 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 16 00:23:13.920459 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (502)
Apr 16 00:23:13.920886 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Apr 16 00:23:13.925296 kernel: BTRFS: device fsid a6240e59-bdb5-4432-bae9-6f06a7303c55 devid 1 transid 37 /dev/sda3 scanned by (udev-worker) (518)
Apr 16 00:23:13.933508 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Apr 16 00:23:13.944601 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Apr 16 00:23:13.954696 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Apr 16 00:23:13.955389 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Apr 16 00:23:13.961534 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Apr 16 00:23:13.971305 disk-uuid[574]: Primary Header is updated.
Apr 16 00:23:13.971305 disk-uuid[574]: Secondary Entries is updated.
Apr 16 00:23:13.971305 disk-uuid[574]: Secondary Header is updated.
Apr 16 00:23:13.979562 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 16 00:23:13.984291 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 16 00:23:13.991306 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 16 00:23:14.066293 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Apr 16 00:23:14.205184 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Apr 16 00:23:14.205249 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Apr 16 00:23:14.205445 kernel: usbcore: registered new interface driver usbhid
Apr 16 00:23:14.205458 kernel: usbhid: USB HID core driver
Apr 16 00:23:14.310317 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Apr 16 00:23:14.439304 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Apr 16 00:23:14.493329 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Apr 16 00:23:14.995059 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 16 00:23:14.995123 disk-uuid[575]: The operation has completed successfully.
Apr 16 00:23:15.045660 systemd[1]: disk-uuid.service: Deactivated successfully.
Apr 16 00:23:15.045779 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Apr 16 00:23:15.065659 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Apr 16 00:23:15.080561 sh[595]: Success
Apr 16 00:23:15.094449 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Apr 16 00:23:15.151438 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Apr 16 00:23:15.154439 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Apr 16 00:23:15.156294 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Apr 16 00:23:15.175142 kernel: BTRFS info (device dm-0): first mount of filesystem a6240e59-bdb5-4432-bae9-6f06a7303c55
Apr 16 00:23:15.175255 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Apr 16 00:23:15.175327 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Apr 16 00:23:15.175353 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Apr 16 00:23:15.175729 kernel: BTRFS info (device dm-0): using free space tree
Apr 16 00:23:15.182342 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Apr 16 00:23:15.184492 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Apr 16 00:23:15.186550 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Apr 16 00:23:15.195877 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Apr 16 00:23:15.200138 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Apr 16 00:23:15.215651 kernel: BTRFS info (device sda6): first mount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:23:15.215768 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 16 00:23:15.215783 kernel: BTRFS info (device sda6): using free space tree
Apr 16 00:23:15.220793 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 16 00:23:15.220881 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 16 00:23:15.234375 kernel: BTRFS info (device sda6): last unmount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:23:15.235060 systemd[1]: mnt-oem.mount: Deactivated successfully.
Apr 16 00:23:15.245508 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Apr 16 00:23:15.250602 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Apr 16 00:23:15.330117 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 16 00:23:15.339548 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 16 00:23:15.360447 systemd-networkd[783]: lo: Link UP
Apr 16 00:23:15.360456 systemd-networkd[783]: lo: Gained carrier
Apr 16 00:23:15.362219 systemd-networkd[783]: Enumeration completed
Apr 16 00:23:15.362703 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:15.362706 systemd-networkd[783]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 00:23:15.363937 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 16 00:23:15.364184 systemd-networkd[783]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:15.364192 systemd-networkd[783]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 00:23:15.364688 systemd[1]: Reached target network.target - Network.
Apr 16 00:23:15.374959 ignition[694]: Ignition 2.19.0
Apr 16 00:23:15.365722 systemd-networkd[783]: eth0: Link UP
Apr 16 00:23:15.374966 ignition[694]: Stage: fetch-offline
Apr 16 00:23:15.365729 systemd-networkd[783]: eth0: Gained carrier
Apr 16 00:23:15.375011 ignition[694]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:15.365742 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:15.375020 ignition[694]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:15.372244 systemd-networkd[783]: eth1: Link UP
Apr 16 00:23:15.375199 ignition[694]: parsed url from cmdline: ""
Apr 16 00:23:15.372253 systemd-networkd[783]: eth1: Gained carrier
Apr 16 00:23:15.375202 ignition[694]: no config URL provided
Apr 16 00:23:15.372297 systemd-networkd[783]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:15.375207 ignition[694]: reading system config file "/usr/lib/ignition/user.ign"
Apr 16 00:23:15.379369 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 16 00:23:15.375214 ignition[694]: no config at "/usr/lib/ignition/user.ign"
Apr 16 00:23:15.375220 ignition[694]: failed to fetch config: resource requires networking
Apr 16 00:23:15.375617 ignition[694]: Ignition finished successfully
Apr 16 00:23:15.388801 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Apr 16 00:23:15.402194 ignition[786]: Ignition 2.19.0
Apr 16 00:23:15.402206 ignition[786]: Stage: fetch
Apr 16 00:23:15.402416 ignition[786]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:15.402427 ignition[786]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:15.402545 ignition[786]: parsed url from cmdline: ""
Apr 16 00:23:15.402549 ignition[786]: no config URL provided
Apr 16 00:23:15.402554 ignition[786]: reading system config file "/usr/lib/ignition/user.ign"
Apr 16 00:23:15.402562 ignition[786]: no config at "/usr/lib/ignition/user.ign"
Apr 16 00:23:15.402581 ignition[786]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Apr 16 00:23:15.403240 ignition[786]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Apr 16 00:23:15.416380 systemd-networkd[783]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 16 00:23:15.429366 systemd-networkd[783]: eth0: DHCPv4 address 46.224.6.157/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 16 00:23:15.603483 ignition[786]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Apr 16 00:23:15.612101 ignition[786]: GET result: OK
Apr 16 00:23:15.612281 ignition[786]: parsing config with SHA512: 45bfd9a868a96c701e9ee47d814d3041c3e3f5db268b72569f018221a5f84fe990f6ee8e578044e8f5d851b231f8745d4fbb6415d07352868804e2ebbfdb3b79
Apr 16 00:23:15.618303 unknown[786]: fetched base config from "system"
Apr 16 00:23:15.619153 ignition[786]: fetch: fetch complete
Apr 16 00:23:15.618319 unknown[786]: fetched base config from "system"
Apr 16 00:23:15.619158 ignition[786]: fetch: fetch passed
Apr 16 00:23:15.618339 unknown[786]: fetched user config from "hetzner"
Apr 16 00:23:15.619210 ignition[786]: Ignition finished successfully
Apr 16 00:23:15.622498 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Apr 16 00:23:15.629661 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Apr 16 00:23:15.642029 ignition[794]: Ignition 2.19.0
Apr 16 00:23:15.642039 ignition[794]: Stage: kargs
Apr 16 00:23:15.642235 ignition[794]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:15.642245 ignition[794]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:15.643335 ignition[794]: kargs: kargs passed
Apr 16 00:23:15.643397 ignition[794]: Ignition finished successfully
Apr 16 00:23:15.645238 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Apr 16 00:23:15.650467 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Apr 16 00:23:15.664022 ignition[800]: Ignition 2.19.0
Apr 16 00:23:15.664039 ignition[800]: Stage: disks
Apr 16 00:23:15.664284 ignition[800]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:15.664305 ignition[800]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:15.666120 ignition[800]: disks: disks passed
Apr 16 00:23:15.666210 ignition[800]: Ignition finished successfully
Apr 16 00:23:15.668909 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Apr 16 00:23:15.670624 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Apr 16 00:23:15.671234 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Apr 16 00:23:15.673943 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 16 00:23:15.674518 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 16 00:23:15.675191 systemd[1]: Reached target basic.target - Basic System.
Apr 16 00:23:15.681550 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Apr 16 00:23:15.701301 systemd-fsck[808]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Apr 16 00:23:15.705737 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Apr 16 00:23:15.715114 systemd[1]: Mounting sysroot.mount - /sysroot...
Apr 16 00:23:15.766299 kernel: EXT4-fs (sda9): mounted filesystem a7d1b52a-2d60-4e63-87fc-077f5b665cf4 r/w with ordered data mode. Quota mode: none.
Apr 16 00:23:15.766957 systemd[1]: Mounted sysroot.mount - /sysroot.
Apr 16 00:23:15.768590 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Apr 16 00:23:15.775413 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 16 00:23:15.779417 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Apr 16 00:23:15.781463 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Apr 16 00:23:15.784437 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Apr 16 00:23:15.784482 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 16 00:23:15.794199 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (816)
Apr 16 00:23:15.796628 kernel: BTRFS info (device sda6): first mount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:23:15.796683 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 16 00:23:15.796695 kernel: BTRFS info (device sda6): using free space tree
Apr 16 00:23:15.798313 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Apr 16 00:23:15.806757 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Apr 16 00:23:15.812288 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 16 00:23:15.812361 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 16 00:23:15.815163 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 16 00:23:15.864149 coreos-metadata[818]: Apr 16 00:23:15.863 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Apr 16 00:23:15.865503 initrd-setup-root[843]: cut: /sysroot/etc/passwd: No such file or directory
Apr 16 00:23:15.866949 coreos-metadata[818]: Apr 16 00:23:15.866 INFO Fetch successful
Apr 16 00:23:15.868627 coreos-metadata[818]: Apr 16 00:23:15.868 INFO wrote hostname ci-4081-3-6-n-56c15b786d to /sysroot/etc/hostname
Apr 16 00:23:15.872112 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 16 00:23:15.877596 initrd-setup-root[851]: cut: /sysroot/etc/group: No such file or directory
Apr 16 00:23:15.883717 initrd-setup-root[858]: cut: /sysroot/etc/shadow: No such file or directory
Apr 16 00:23:15.889684 initrd-setup-root[865]: cut: /sysroot/etc/gshadow: No such file or directory
Apr 16 00:23:16.002111 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Apr 16 00:23:16.008423 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Apr 16 00:23:16.011696 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Apr 16 00:23:16.020323 kernel: BTRFS info (device sda6): last unmount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:23:16.040348 ignition[933]: INFO : Ignition 2.19.0
Apr 16 00:23:16.040348 ignition[933]: INFO : Stage: mount
Apr 16 00:23:16.040348 ignition[933]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:16.040348 ignition[933]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:16.042778 ignition[933]: INFO : mount: mount passed
Apr 16 00:23:16.042778 ignition[933]: INFO : Ignition finished successfully
Apr 16 00:23:16.046343 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Apr 16 00:23:16.058559 systemd[1]: Starting ignition-files.service - Ignition (files)...
Apr 16 00:23:16.060421 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Apr 16 00:23:16.173311 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Apr 16 00:23:16.187615 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 16 00:23:16.197323 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (944)
Apr 16 00:23:16.199354 kernel: BTRFS info (device sda6): first mount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:23:16.199405 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 16 00:23:16.199422 kernel: BTRFS info (device sda6): using free space tree
Apr 16 00:23:16.203312 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 16 00:23:16.203410 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 16 00:23:16.206709 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 16 00:23:16.233026 ignition[962]: INFO : Ignition 2.19.0
Apr 16 00:23:16.233026 ignition[962]: INFO : Stage: files
Apr 16 00:23:16.234181 ignition[962]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:16.234181 ignition[962]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:16.236024 ignition[962]: DEBUG : files: compiled without relabeling support, skipping
Apr 16 00:23:16.236024 ignition[962]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Apr 16 00:23:16.236024 ignition[962]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Apr 16 00:23:16.239955 ignition[962]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Apr 16 00:23:16.239955 ignition[962]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Apr 16 00:23:16.242490 ignition[962]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Apr 16 00:23:16.241094 unknown[962]: wrote ssh authorized keys file for user: core
Apr 16 00:23:16.244379 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 16 00:23:16.244379 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Apr 16 00:23:16.292014 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Apr 16 00:23:16.374088 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 16 00:23:16.374088 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Apr 16 00:23:16.374088 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Apr 16 00:23:16.374088 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Apr 16 00:23:16.378675 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-arm64.raw: attempt #1
Apr 16 00:23:16.448742 systemd-networkd[783]: eth0: Gained IPv6LL
Apr 16 00:23:16.719842 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Apr 16 00:23:16.831387 systemd-networkd[783]: eth1: Gained IPv6LL
Apr 16 00:23:17.478105 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw"
Apr 16 00:23:17.478105 ignition[962]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Apr 16 00:23:17.482136 ignition[962]: INFO : files: files passed
Apr 16 00:23:17.482136 ignition[962]: INFO : Ignition finished successfully
Apr 16 00:23:17.482119 systemd[1]: Finished ignition-files.service - Ignition (files).
Apr 16 00:23:17.487487 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Apr 16 00:23:17.493724 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 16 00:23:17.498591 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 16 00:23:17.500296 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 16 00:23:17.509327 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 00:23:17.509327 initrd-setup-root-after-ignition[989]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 00:23:17.512080 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 00:23:17.515043 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 16 00:23:17.516049 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 16 00:23:17.527438 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 16 00:23:17.568903 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 16 00:23:17.569062 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 16 00:23:17.571368 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 16 00:23:17.572308 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 16 00:23:17.573645 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 16 00:23:17.582577 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 16 00:23:17.599364 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 16 00:23:17.608538 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 16 00:23:17.619718 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 16 00:23:17.620550 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 00:23:17.622189 systemd[1]: Stopped target timers.target - Timer Units.
Apr 16 00:23:17.623395 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 16 00:23:17.623550 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 16 00:23:17.625107 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 16 00:23:17.625836 systemd[1]: Stopped target basic.target - Basic System.
Apr 16 00:23:17.626885 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 16 00:23:17.627966 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 16 00:23:17.628997 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 16 00:23:17.630102 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 16 00:23:17.631191 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 16 00:23:17.632392 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 16 00:23:17.633391 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 16 00:23:17.634505 systemd[1]: Stopped target swap.target - Swaps.
Apr 16 00:23:17.635420 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 16 00:23:17.635583 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 16 00:23:17.636837 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 16 00:23:17.637484 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 00:23:17.638540 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 16 00:23:17.640308 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 00:23:17.641405 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 16 00:23:17.641533 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 16 00:23:17.643143 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 16 00:23:17.643248 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 16 00:23:17.644459 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 16 00:23:17.644566 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 16 00:23:17.645571 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Apr 16 00:23:17.645677 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 16 00:23:17.656737 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 16 00:23:17.658844 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 16 00:23:17.659148 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 00:23:17.663556 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 16 00:23:17.664153 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 16 00:23:17.664315 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 16 00:23:17.666127 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 16 00:23:17.666227 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 16 00:23:17.675809 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 16 00:23:17.677308 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 16 00:23:17.686472 ignition[1013]: INFO : Ignition 2.19.0
Apr 16 00:23:17.688978 ignition[1013]: INFO : Stage: umount
Apr 16 00:23:17.688978 ignition[1013]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 00:23:17.688978 ignition[1013]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:23:17.688978 ignition[1013]: INFO : umount: umount passed
Apr 16 00:23:17.688978 ignition[1013]: INFO : Ignition finished successfully
Apr 16 00:23:17.691243 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 16 00:23:17.693149 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 16 00:23:17.693252 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 16 00:23:17.696583 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 16 00:23:17.696695 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 16 00:23:17.698156 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 16 00:23:17.698254 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 16 00:23:17.699445 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 16 00:23:17.699510 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 16 00:23:17.700391 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 16 00:23:17.700429 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 16 00:23:17.701397 systemd[1]: Stopped target network.target - Network.
Apr 16 00:23:17.702183 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 16 00:23:17.702233 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 16 00:23:17.703250 systemd[1]: Stopped target paths.target - Path Units.
Apr 16 00:23:17.704154 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 16 00:23:17.704649 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 00:23:17.705870 systemd[1]: Stopped target slices.target - Slice Units.
Apr 16 00:23:17.706903 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 16 00:23:17.707910 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 16 00:23:17.707956 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 16 00:23:17.709018 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 16 00:23:17.709061 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 16 00:23:17.710294 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 16 00:23:17.710353 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 16 00:23:17.711519 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 16 00:23:17.711566 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 16 00:23:17.712819 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 16 00:23:17.712872 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 16 00:23:17.714051 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 16 00:23:17.717113 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 16 00:23:17.724383 systemd-networkd[783]: eth0: DHCPv6 lease lost
Apr 16 00:23:17.728164 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 16 00:23:17.728395 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 16 00:23:17.732411 systemd-networkd[783]: eth1: DHCPv6 lease lost
Apr 16 00:23:17.733992 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 16 00:23:17.734073 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 00:23:17.736232 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 16 00:23:17.736370 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 16 00:23:17.737957 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 16 00:23:17.738021 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 00:23:17.753020 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 16 00:23:17.754361 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 16 00:23:17.754482 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 16 00:23:17.756188 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 16 00:23:17.756242 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 16 00:23:17.758315 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 16 00:23:17.758367 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 16 00:23:17.759640 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 00:23:17.775903 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 16 00:23:17.776939 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 16 00:23:17.780185 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 16 00:23:17.782358 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 00:23:17.783544 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 16 00:23:17.783598 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 16 00:23:17.784938 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 16 00:23:17.784973 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 00:23:17.788072 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 16 00:23:17.788177 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 16 00:23:17.790253 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 16 00:23:17.790367 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 16 00:23:17.792206 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 16 00:23:17.792310 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 16 00:23:17.799548 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 16 00:23:17.800132 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 16 00:23:17.800195 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 16 00:23:17.802840 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 16 00:23:17.802895 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:23:17.814033 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 16 00:23:17.814174 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 16 00:23:17.816905 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 16 00:23:17.823622 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 16 00:23:17.834309 systemd[1]: Switching root.
Apr 16 00:23:17.872017 systemd-journald[238]: Journal stopped
Apr 16 00:23:18.815662 systemd-journald[238]: Received SIGTERM from PID 1 (systemd).
Apr 16 00:23:18.815741 kernel: SELinux: policy capability network_peer_controls=1
Apr 16 00:23:18.815756 kernel: SELinux: policy capability open_perms=1
Apr 16 00:23:18.815770 kernel: SELinux: policy capability extended_socket_class=1
Apr 16 00:23:18.815780 kernel: SELinux: policy capability always_check_network=0
Apr 16 00:23:18.815795 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 16 00:23:18.815806 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 16 00:23:18.815815 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 16 00:23:18.815825 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 16 00:23:18.815835 kernel: audit: type=1403 audit(1776298998.065:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 16 00:23:18.815850 systemd[1]: Successfully loaded SELinux policy in 34.461ms.
Apr 16 00:23:18.815872 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 11.088ms.
Apr 16 00:23:18.815884 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 16 00:23:18.815896 systemd[1]: Detected virtualization kvm.
Apr 16 00:23:18.815907 systemd[1]: Detected architecture arm64.
Apr 16 00:23:18.815917 systemd[1]: Detected first boot.
Apr 16 00:23:18.815928 systemd[1]: Hostname set to .
Apr 16 00:23:18.815938 systemd[1]: Initializing machine ID from VM UUID.
Apr 16 00:23:18.815949 zram_generator::config[1056]: No configuration found.
Apr 16 00:23:18.815962 systemd[1]: Populated /etc with preset unit settings.
Apr 16 00:23:18.815972 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Apr 16 00:23:18.815986 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Apr 16 00:23:18.815997 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Apr 16 00:23:18.816010 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 16 00:23:18.816021 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 16 00:23:18.816031 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 16 00:23:18.816042 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 16 00:23:18.816053 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 16 00:23:18.816066 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 16 00:23:18.816076 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 16 00:23:18.816088 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 16 00:23:18.816100 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 00:23:18.816111 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 00:23:18.816121 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 16 00:23:18.816132 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 16 00:23:18.816145 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 16 00:23:18.816160 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 16 00:23:18.816174 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Apr 16 00:23:18.816187 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 00:23:18.816204 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Apr 16 00:23:18.816218 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Apr 16 00:23:18.816230 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Apr 16 00:23:18.816248 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 16 00:23:18.816260 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 00:23:18.816288 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 16 00:23:18.816300 systemd[1]: Reached target slices.target - Slice Units.
Apr 16 00:23:18.816310 systemd[1]: Reached target swap.target - Swaps.
Apr 16 00:23:18.816322 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 16 00:23:18.816337 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 16 00:23:18.816348 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 00:23:18.816359 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 16 00:23:18.816369 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 00:23:18.816382 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 16 00:23:18.816393 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 16 00:23:18.816404 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 16 00:23:18.816414 systemd[1]: Mounting media.mount - External Media Directory...
Apr 16 00:23:18.816425 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 16 00:23:18.816435 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 16 00:23:18.816445 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 16 00:23:18.816456 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 16 00:23:18.816467 systemd[1]: Reached target machines.target - Containers.
Apr 16 00:23:18.816489 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 16 00:23:18.816501 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:23:18.816514 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 16 00:23:18.816527 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 16 00:23:18.816540 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 00:23:18.816554 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 16 00:23:18.816565 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 00:23:18.816576 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 16 00:23:18.816586 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 00:23:18.816597 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 16 00:23:18.816608 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Apr 16 00:23:18.816619 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Apr 16 00:23:18.816630 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Apr 16 00:23:18.816642 systemd[1]: Stopped systemd-fsck-usr.service.
Apr 16 00:23:18.816653 kernel: fuse: init (API version 7.39)
Apr 16 00:23:18.816663 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 16 00:23:18.816674 kernel: ACPI: bus type drm_connector registered
Apr 16 00:23:18.816685 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 16 00:23:18.816695 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 16 00:23:18.816707 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 16 00:23:18.816717 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 16 00:23:18.816728 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 16 00:23:18.816739 systemd[1]: Stopped verity-setup.service.
Apr 16 00:23:18.816752 kernel: loop: module loaded
Apr 16 00:23:18.816762 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 16 00:23:18.816773 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 16 00:23:18.816783 systemd[1]: Mounted media.mount - External Media Directory.
Apr 16 00:23:18.816796 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 16 00:23:18.816806 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 16 00:23:18.816817 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 16 00:23:18.816828 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 00:23:18.816869 systemd-journald[1126]: Collecting audit messages is disabled.
Apr 16 00:23:18.816892 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 16 00:23:18.816906 systemd-journald[1126]: Journal started
Apr 16 00:23:18.816931 systemd-journald[1126]: Runtime Journal (/run/log/journal/9d23e8afbd6748c2adba8d58253e23f5) is 8.0M, max 76.6M, 68.6M free.
Apr 16 00:23:18.564073 systemd[1]: Queued start job for default target multi-user.target.
Apr 16 00:23:18.583651 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Apr 16 00:23:18.584554 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 16 00:23:18.818573 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 16 00:23:18.821301 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 16 00:23:18.823305 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 00:23:18.824373 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 00:23:18.833049 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 16 00:23:18.833336 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 16 00:23:18.835394 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 00:23:18.835582 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 00:23:18.836639 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 16 00:23:18.836794 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 16 00:23:18.839618 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 00:23:18.840239 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 00:23:18.845417 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 16 00:23:18.846454 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 16 00:23:18.859781 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 16 00:23:18.862314 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 16 00:23:18.868980 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 16 00:23:18.877538 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 16 00:23:18.884523 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 16 00:23:18.886456 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 16 00:23:18.886518 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 16 00:23:18.889780 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Apr 16 00:23:18.895522 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 16 00:23:18.901143 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 16 00:23:18.901971 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:23:18.905023 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 16 00:23:18.909641 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 16 00:23:18.911022 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 16 00:23:18.918663 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 16 00:23:18.920195 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 16 00:23:18.922544 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 16 00:23:18.928156 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 16 00:23:18.934306 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 16 00:23:18.936797 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 16 00:23:18.937784 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 16 00:23:18.938756 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 16 00:23:18.951663 systemd-journald[1126]: Time spent on flushing to /var/log/journal/9d23e8afbd6748c2adba8d58253e23f5 is 78.706ms for 1124 entries.
Apr 16 00:23:18.951663 systemd-journald[1126]: System Journal (/var/log/journal/9d23e8afbd6748c2adba8d58253e23f5) is 8.0M, max 584.8M, 576.8M free.
Apr 16 00:23:19.049677 systemd-journald[1126]: Received client request to flush runtime journal.
Apr 16 00:23:19.049748 kernel: loop0: detected capacity change from 0 to 114432
Apr 16 00:23:19.049775 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 16 00:23:18.963349 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 16 00:23:18.976729 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Apr 16 00:23:18.979388 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 16 00:23:18.984428 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 16 00:23:18.992700 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Apr 16 00:23:19.048195 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 16 00:23:19.050833 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 16 00:23:19.054317 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 16 00:23:19.055075 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 16 00:23:19.058298 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Apr 16 00:23:19.062865 udevadm[1177]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Apr 16 00:23:19.066304 kernel: loop1: detected capacity change from 0 to 209336
Apr 16 00:23:19.070082 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 16 00:23:19.096143 systemd-tmpfiles[1189]: ACLs are not supported, ignoring.
Apr 16 00:23:19.096884 systemd-tmpfiles[1189]: ACLs are not supported, ignoring.
Apr 16 00:23:19.107407 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 16 00:23:19.118331 kernel: loop2: detected capacity change from 0 to 114328
Apr 16 00:23:19.159302 kernel: loop3: detected capacity change from 0 to 8
Apr 16 00:23:19.183310 kernel: loop4: detected capacity change from 0 to 114432
Apr 16 00:23:19.194325 kernel: loop5: detected capacity change from 0 to 209336
Apr 16 00:23:19.214320 kernel: loop6: detected capacity change from 0 to 114328
Apr 16 00:23:19.234333 kernel: loop7: detected capacity change from 0 to 8
Apr 16 00:23:19.235766 (sd-merge)[1195]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Apr 16 00:23:19.238895 (sd-merge)[1195]: Merged extensions into '/usr'.
Apr 16 00:23:19.246866 systemd[1]: Reloading requested from client PID 1170 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 16 00:23:19.246883 systemd[1]: Reloading...
Apr 16 00:23:19.388291 zram_generator::config[1221]: No configuration found.
Apr 16 00:23:19.511360 ldconfig[1165]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 16 00:23:19.554319 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:23:19.603116 systemd[1]: Reloading finished in 355 ms.
Apr 16 00:23:19.629310 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 16 00:23:19.630343 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 16 00:23:19.631298 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 16 00:23:19.639534 systemd[1]: Starting ensure-sysext.service...
Apr 16 00:23:19.643575 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 16 00:23:19.654786 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 00:23:19.660457 systemd[1]: Reloading requested from client PID 1259 ('systemctl') (unit ensure-sysext.service)...
Apr 16 00:23:19.660491 systemd[1]: Reloading...
Apr 16 00:23:19.686352 systemd-udevd[1261]: Using default interface naming scheme 'v255'.
Apr 16 00:23:19.689666 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 16 00:23:19.689934 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 16 00:23:19.690635 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 16 00:23:19.690852 systemd-tmpfiles[1260]: ACLs are not supported, ignoring.
Apr 16 00:23:19.690905 systemd-tmpfiles[1260]: ACLs are not supported, ignoring.
Apr 16 00:23:19.693598 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot.
Apr 16 00:23:19.693609 systemd-tmpfiles[1260]: Skipping /boot
Apr 16 00:23:19.706186 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot.
Apr 16 00:23:19.706659 systemd-tmpfiles[1260]: Skipping /boot
Apr 16 00:23:19.777341 zram_generator::config[1295]: No configuration found.
Apr 16 00:23:19.928640 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:23:20.001409 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1306)
Apr 16 00:23:20.014440 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Apr 16 00:23:20.014794 systemd[1]: Reloading finished in 354 ms.
Apr 16 00:23:20.042970 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 00:23:20.046076 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 00:23:20.055335 kernel: mousedev: PS/2 mouse device common for all mice
Apr 16 00:23:20.118819 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Apr 16 00:23:20.118923 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Apr 16 00:23:20.118944 kernel: [drm] features: -context_init
Apr 16 00:23:20.120315 kernel: [drm] number of scanouts: 1
Apr 16 00:23:20.120422 kernel: [drm] number of cap sets: 0
Apr 16 00:23:20.122313 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Apr 16 00:23:20.123327 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Apr 16 00:23:20.128290 kernel: Console: switching to colour frame buffer device 160x50
Apr 16 00:23:20.132606 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 16 00:23:20.149810 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Apr 16 00:23:20.159149 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 16 00:23:20.161589 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:23:20.165486 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 00:23:20.172650 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 00:23:20.175773 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 00:23:20.177201 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:23:20.181573 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 16 00:23:20.188333 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 16 00:23:20.193611 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 16 00:23:20.203725 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 16 00:23:20.206090 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 00:23:20.208679 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 00:23:20.212337 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Apr 16 00:23:20.234236 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 00:23:20.235159 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 00:23:20.258903 systemd[1]: Finished ensure-sysext.service.
Apr 16 00:23:20.261743 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 00:23:20.262016 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 00:23:20.277088 augenrules[1396]: No rules
Apr 16 00:23:20.280939 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 16 00:23:20.282556 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 16 00:23:20.288322 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 16 00:23:20.293577 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:23:20.299670 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 16 00:23:20.303572 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 00:23:20.306571 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:23:20.308332 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 16 00:23:20.309004 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 16 00:23:20.314638 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Apr 16 00:23:20.319746 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 16 00:23:20.323503 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 16 00:23:20.327372 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 16 00:23:20.328598 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 16 00:23:20.328890 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 16 00:23:20.331153 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 00:23:20.331517 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 00:23:20.335206 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 16 00:23:20.349617 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 16 00:23:20.350594 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 16 00:23:20.359669 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Apr 16 00:23:20.367590 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Apr 16 00:23:20.370127 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 16 00:23:20.374053 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 16 00:23:20.385365 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 16 00:23:20.396975 lvm[1416]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 16 00:23:20.427328 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Apr 16 00:23:20.429774 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 16 00:23:20.439847 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Apr 16 00:23:20.447833 lvm[1429]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 16 00:23:20.469836 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Apr 16 00:23:20.497816 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:23:20.504288 systemd-resolved[1383]: Positive Trust Anchors:
Apr 16 00:23:20.504821 systemd-resolved[1383]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 16 00:23:20.504857 systemd-resolved[1383]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 16 00:23:20.515733 systemd-resolved[1383]: Using system hostname 'ci-4081-3-6-n-56c15b786d'.
Apr 16 00:23:20.519490 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 16 00:23:20.519710 systemd-networkd[1382]: lo: Link UP
Apr 16 00:23:20.519715 systemd-networkd[1382]: lo: Gained carrier
Apr 16 00:23:20.520286 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 16 00:23:20.521879 systemd-networkd[1382]: Enumeration completed
Apr 16 00:23:20.522013 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 16 00:23:20.523160 systemd-networkd[1382]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:20.523165 systemd-networkd[1382]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 00:23:20.523590 systemd[1]: Reached target network.target - Network.
Apr 16 00:23:20.524601 systemd-networkd[1382]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:20.524617 systemd-networkd[1382]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 00:23:20.525224 systemd-networkd[1382]: eth0: Link UP
Apr 16 00:23:20.525228 systemd-networkd[1382]: eth0: Gained carrier
Apr 16 00:23:20.525246 systemd-networkd[1382]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:20.533701 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 16 00:23:20.533789 systemd-networkd[1382]: eth1: Link UP
Apr 16 00:23:20.533794 systemd-networkd[1382]: eth1: Gained carrier
Apr 16 00:23:20.533817 systemd-networkd[1382]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:23:20.536384 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Apr 16 00:23:20.538418 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 16 00:23:20.539484 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 16 00:23:20.540832 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 16 00:23:20.542004 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 16 00:23:20.543063 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 16 00:23:20.543185 systemd[1]: Reached target paths.target - Path Units.
Apr 16 00:23:20.544142 systemd[1]: Reached target time-set.target - System Time Set.
Apr 16 00:23:20.545235 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 16 00:23:20.546088 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 16 00:23:20.547037 systemd[1]: Reached target timers.target - Timer Units.
Apr 16 00:23:20.548485 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 16 00:23:20.551043 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 16 00:23:20.567240 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 16 00:23:20.568916 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 16 00:23:20.569916 systemd[1]: Reached target sockets.target - Socket Units.
Apr 16 00:23:20.570620 systemd[1]: Reached target basic.target - Basic System.
Apr 16 00:23:20.571172 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 16 00:23:20.571211 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 16 00:23:20.579528 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 16 00:23:20.582072 systemd-networkd[1382]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 16 00:23:20.586027 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 16 00:23:20.588124 systemd-networkd[1382]: eth0: DHCPv4 address 46.224.6.157/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 16 00:23:20.589866 systemd-timesyncd[1407]: Network configuration changed, trying to establish connection.
Apr 16 00:23:20.591244 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 16 00:23:20.600476 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 16 00:23:20.604976 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 16 00:23:20.606374 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 16 00:23:20.611712 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 16 00:23:20.616490 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 16 00:23:20.618126 jq[1442]: false
Apr 16 00:23:20.622557 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Apr 16 00:23:20.633581 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 16 00:23:20.638743 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 16 00:23:20.644569 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 16 00:23:20.646115 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 16 00:23:20.647185 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 16 00:23:20.648761 systemd[1]: Starting update-engine.service - Update Engine...
Apr 16 00:23:20.652472 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 16 00:23:20.657760 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 16 00:23:20.659332 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 16 00:23:20.660739 systemd-timesyncd[1407]: Contacted time server 139.162.187.236:123 (0.flatcar.pool.ntp.org).
Apr 16 00:23:20.660804 systemd-timesyncd[1407]: Initial clock synchronization to Thu 2026-04-16 00:23:20.922769 UTC.
Apr 16 00:23:20.683785 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 16 00:23:20.683999 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 16 00:23:20.696739 coreos-metadata[1439]: Apr 16 00:23:20.696 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Apr 16 00:23:20.699560 coreos-metadata[1439]: Apr 16 00:23:20.699 INFO Fetch successful
Apr 16 00:23:20.699910 coreos-metadata[1439]: Apr 16 00:23:20.699 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Apr 16 00:23:20.707181 jq[1453]: true
Apr 16 00:23:20.711222 coreos-metadata[1439]: Apr 16 00:23:20.708 INFO Fetch successful
Apr 16 00:23:20.715068 dbus-daemon[1440]: [system] SELinux support is enabled
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found loop4
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found loop5
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found loop6
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found loop7
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda1
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda2
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda3
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found usr
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda4
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda6
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda7
Apr 16 00:23:20.730998 extend-filesystems[1443]: Found sda9
Apr 16 00:23:20.730998 extend-filesystems[1443]: Checking size of /dev/sda9
Apr 16 00:23:20.730784 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 16 00:23:20.787709 tar[1457]: linux-arm64/LICENSE
Apr 16 00:23:20.787709 tar[1457]: linux-arm64/helm
Apr 16 00:23:20.739000 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 16 00:23:20.792357 jq[1472]: true
Apr 16 00:23:20.739037 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 16 00:23:20.740891 (ntainerd)[1467]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 16 00:23:20.746417 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 16 00:23:20.746439 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 16 00:23:20.750400 systemd[1]: motdgen.service: Deactivated successfully.
Apr 16 00:23:20.750720 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 16 00:23:20.815665 update_engine[1452]: I20260416 00:23:20.811895 1452 main.cc:92] Flatcar Update Engine starting
Apr 16 00:23:20.815989 extend-filesystems[1443]: Resized partition /dev/sda9
Apr 16 00:23:20.835562 extend-filesystems[1492]: resize2fs 1.47.1 (20-May-2024)
Apr 16 00:23:20.836839 systemd[1]: Started update-engine.service - Update Engine.
Apr 16 00:23:20.837173 update_engine[1452]: I20260416 00:23:20.836895 1452 update_check_scheduler.cc:74] Next update check in 5m26s
Apr 16 00:23:20.842552 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Apr 16 00:23:20.851304 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Apr 16 00:23:20.873055 systemd-logind[1451]: New seat seat0.
Apr 16 00:23:20.881613 systemd-logind[1451]: Watching system buttons on /dev/input/event0 (Power Button)
Apr 16 00:23:20.881642 systemd-logind[1451]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Apr 16 00:23:20.881895 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 16 00:23:20.898868 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Apr 16 00:23:20.900887 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Apr 16 00:23:20.927318 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1298) Apr 16 00:23:20.983332 bash[1512]: Updated "/home/core/.ssh/authorized_keys" Apr 16 00:23:20.986073 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Apr 16 00:23:21.003861 systemd[1]: Starting sshkeys.service... Apr 16 00:23:21.047818 containerd[1467]: time="2026-04-16T00:23:21.047661582Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Apr 16 00:23:21.069337 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Apr 16 00:23:21.081419 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Apr 16 00:23:21.090613 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.084114690Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.085773974Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.085816896Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.085835196Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." 
type=io.containerd.internal.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.097826319Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.097871802Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.097954382Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.097971030Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.098172998Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.098191133Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099316 containerd[1467]: time="2026-04-16T00:23:21.098211954Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098223149Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098330433Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." 
type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098546322Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098730940Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098748042Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098841900Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Apr 16 00:23:21.099542 containerd[1467]: time="2026-04-16T00:23:21.098927000Z" level=info msg="metadata content store policy set" policy=shared Apr 16 00:23:21.104869 extend-filesystems[1492]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Apr 16 00:23:21.104869 extend-filesystems[1492]: old_desc_blocks = 1, new_desc_blocks = 5 Apr 16 00:23:21.104869 extend-filesystems[1492]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Apr 16 00:23:21.110382 extend-filesystems[1443]: Resized filesystem in /dev/sda9 Apr 16 00:23:21.110382 extend-filesystems[1443]: Found sr0 Apr 16 00:23:21.107772 systemd[1]: extend-filesystems.service: Deactivated successfully. Apr 16 00:23:21.114907 containerd[1467]: time="2026-04-16T00:23:21.113168025Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Apr 16 00:23:21.114907 containerd[1467]: time="2026-04-16T00:23:21.113325171Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." 
type=io.containerd.differ.v1 Apr 16 00:23:21.114907 containerd[1467]: time="2026-04-16T00:23:21.113353221Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Apr 16 00:23:21.114907 containerd[1467]: time="2026-04-16T00:23:21.113372306Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Apr 16 00:23:21.114907 containerd[1467]: time="2026-04-16T00:23:21.113400026Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Apr 16 00:23:21.114907 containerd[1467]: time="2026-04-16T00:23:21.113660944Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Apr 16 00:23:21.110163 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.115362117Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.115568753Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117352630Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117375433Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117390966Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117416992Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." 
type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117441985Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117468713Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117496721Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117512543Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117526382Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117540717Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117571659Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.117652 containerd[1467]: time="2026-04-16T00:23:21.117587398Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.117996 containerd[1467]: time="2026-04-16T00:23:21.117602022Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.117996 containerd[1467]: time="2026-04-16T00:23:21.117618505Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." 
type=io.containerd.grpc.v1 Apr 16 00:23:21.117996 containerd[1467]: time="2026-04-16T00:23:21.117630940Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118083044Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118106673Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118122867Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118157816Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118197515Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118214990Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118240313Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118255226Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118303477Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118334460Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." 
type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118351232Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118363254Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118516269Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Apr 16 00:23:21.119352 containerd[1467]: time="2026-04-16T00:23:21.118684403Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Apr 16 00:23:21.119715 containerd[1467]: time="2026-04-16T00:23:21.118707496Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Apr 16 00:23:21.119715 containerd[1467]: time="2026-04-16T00:23:21.118722657Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Apr 16 00:23:21.119715 containerd[1467]: time="2026-04-16T00:23:21.118733811Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Apr 16 00:23:21.119715 containerd[1467]: time="2026-04-16T00:23:21.118747278Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Apr 16 00:23:21.119715 containerd[1467]: time="2026-04-16T00:23:21.118759671Z" level=info msg="NRI interface is disabled by configuration." Apr 16 00:23:21.119715 containerd[1467]: time="2026-04-16T00:23:21.118769833Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Apr 16 00:23:21.119828 containerd[1467]: time="2026-04-16T00:23:21.119134276Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Apr 16 00:23:21.119828 containerd[1467]: time="2026-04-16T00:23:21.119195953Z" level=info msg="Connect containerd service" Apr 16 00:23:21.119828 containerd[1467]: time="2026-04-16T00:23:21.119272749Z" level=info msg="using legacy CRI server" Apr 16 00:23:21.120108 containerd[1467]: time="2026-04-16T00:23:21.119283614Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Apr 16 00:23:21.123728 containerd[1467]: time="2026-04-16T00:23:21.122461643Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Apr 16 00:23:21.123728 containerd[1467]: time="2026-04-16T00:23:21.123532168Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 16 00:23:21.123958 containerd[1467]: time="2026-04-16T00:23:21.123880830Z" level=info msg="Start subscribing containerd event" Apr 16 00:23:21.124010 containerd[1467]: time="2026-04-16T00:23:21.123939946Z" level=info msg="Start recovering state" Apr 16 00:23:21.124153 containerd[1467]: time="2026-04-16T00:23:21.124140220Z" level=info msg="Start event monitor" Apr 16 00:23:21.124457 containerd[1467]: time="2026-04-16T00:23:21.124431914Z" level=info msg="Start 
snapshots syncer" Apr 16 00:23:21.128469 containerd[1467]: time="2026-04-16T00:23:21.127341094Z" level=info msg="Start cni network conf syncer for default" Apr 16 00:23:21.128469 containerd[1467]: time="2026-04-16T00:23:21.127393104Z" level=info msg="Start streaming server" Apr 16 00:23:21.128469 containerd[1467]: time="2026-04-16T00:23:21.128134424Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Apr 16 00:23:21.128469 containerd[1467]: time="2026-04-16T00:23:21.128250052Z" level=info msg=serving... address=/run/containerd/containerd.sock Apr 16 00:23:21.128576 systemd[1]: Started containerd.service - containerd container runtime. Apr 16 00:23:21.128867 containerd[1467]: time="2026-04-16T00:23:21.128846991Z" level=info msg="containerd successfully booted in 0.083458s" Apr 16 00:23:21.150635 locksmithd[1498]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Apr 16 00:23:21.152928 coreos-metadata[1522]: Apr 16 00:23:21.152 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Apr 16 00:23:21.156328 coreos-metadata[1522]: Apr 16 00:23:21.155 INFO Fetch successful Apr 16 00:23:21.158777 unknown[1522]: wrote ssh authorized keys file for user: core Apr 16 00:23:21.198128 update-ssh-keys[1531]: Updated "/home/core/.ssh/authorized_keys" Apr 16 00:23:21.201094 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Apr 16 00:23:21.207825 systemd[1]: Finished sshkeys.service. Apr 16 00:23:21.521947 tar[1457]: linux-arm64/README.md Apr 16 00:23:21.536959 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Apr 16 00:23:21.597167 sshd_keygen[1478]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Apr 16 00:23:21.626606 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Apr 16 00:23:21.637549 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Apr 16 00:23:21.644917 systemd[1]: issuegen.service: Deactivated successfully. Apr 16 00:23:21.645170 systemd[1]: Finished issuegen.service - Generate /run/issue. Apr 16 00:23:21.652729 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Apr 16 00:23:21.668407 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Apr 16 00:23:21.677378 systemd[1]: Started getty@tty1.service - Getty on tty1. Apr 16 00:23:21.684889 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Apr 16 00:23:21.688125 systemd[1]: Reached target getty.target - Login Prompts. Apr 16 00:23:21.695525 systemd-networkd[1382]: eth0: Gained IPv6LL Apr 16 00:23:21.700414 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Apr 16 00:23:21.703257 systemd[1]: Reached target network-online.target - Network is Online. Apr 16 00:23:21.710634 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:23:21.716629 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Apr 16 00:23:21.752360 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Apr 16 00:23:22.207546 systemd-networkd[1382]: eth1: Gained IPv6LL Apr 16 00:23:22.628965 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:23:22.631794 systemd[1]: Reached target multi-user.target - Multi-User System. Apr 16 00:23:22.632741 systemd[1]: Startup finished in 847ms (kernel) + 5.376s (initrd) + 4.601s (userspace) = 10.825s. 
Apr 16 00:23:22.646626 (kubelet)[1570]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 00:23:23.255968 kubelet[1570]: E0416 00:23:23.255902 1570 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 00:23:23.260286 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 00:23:23.260580 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 00:23:33.353610 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Apr 16 00:23:33.363629 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:23:33.496662 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:23:33.509223 (kubelet)[1589]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 00:23:33.560498 kubelet[1589]: E0416 00:23:33.559533 1589 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 00:23:33.566004 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 00:23:33.566343 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 00:23:43.603240 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Apr 16 00:23:43.613946 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Apr 16 00:23:43.742660 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:23:43.747881 (kubelet)[1604]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 00:23:43.796468 kubelet[1604]: E0416 00:23:43.796389 1604 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 00:23:43.799782 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 00:23:43.800056 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 00:23:49.040060 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Apr 16 00:23:49.041630 systemd[1]: Started sshd@0-46.224.6.157:22-4.175.71.9:37782.service - OpenSSH per-connection server daemon (4.175.71.9:37782). Apr 16 00:23:49.165012 sshd[1612]: Accepted publickey for core from 4.175.71.9 port 37782 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:23:49.166617 sshd[1612]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:23:49.178844 systemd-logind[1451]: New session 1 of user core. Apr 16 00:23:49.181778 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Apr 16 00:23:49.192956 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Apr 16 00:23:49.209411 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Apr 16 00:23:49.218688 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Apr 16 00:23:49.235990 (systemd)[1616]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Apr 16 00:23:49.350706 systemd[1616]: Queued start job for default target default.target. Apr 16 00:23:49.364038 systemd[1616]: Created slice app.slice - User Application Slice. Apr 16 00:23:49.364104 systemd[1616]: Reached target paths.target - Paths. Apr 16 00:23:49.364136 systemd[1616]: Reached target timers.target - Timers. Apr 16 00:23:49.366289 systemd[1616]: Starting dbus.socket - D-Bus User Message Bus Socket... Apr 16 00:23:49.381241 systemd[1616]: Listening on dbus.socket - D-Bus User Message Bus Socket. Apr 16 00:23:49.381395 systemd[1616]: Reached target sockets.target - Sockets. Apr 16 00:23:49.381410 systemd[1616]: Reached target basic.target - Basic System. Apr 16 00:23:49.381449 systemd[1616]: Reached target default.target - Main User Target. Apr 16 00:23:49.381476 systemd[1616]: Startup finished in 135ms. Apr 16 00:23:49.381714 systemd[1]: Started user@500.service - User Manager for UID 500. Apr 16 00:23:49.386759 systemd[1]: Started session-1.scope - Session 1 of User core. Apr 16 00:23:49.502953 systemd[1]: Started sshd@1-46.224.6.157:22-4.175.71.9:37784.service - OpenSSH per-connection server daemon (4.175.71.9:37784). Apr 16 00:23:49.622520 sshd[1627]: Accepted publickey for core from 4.175.71.9 port 37784 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:23:49.623956 sshd[1627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:23:49.630022 systemd-logind[1451]: New session 2 of user core. Apr 16 00:23:49.638722 systemd[1]: Started session-2.scope - Session 2 of User core. Apr 16 00:23:49.738576 sshd[1627]: pam_unix(sshd:session): session closed for user core Apr 16 00:23:49.743978 systemd[1]: sshd@1-46.224.6.157:22-4.175.71.9:37784.service: Deactivated successfully. Apr 16 00:23:49.746129 systemd[1]: session-2.scope: Deactivated successfully. 
Apr 16 00:23:49.747174 systemd-logind[1451]: Session 2 logged out. Waiting for processes to exit. Apr 16 00:23:49.748392 systemd-logind[1451]: Removed session 2. Apr 16 00:23:49.786960 systemd[1]: Started sshd@2-46.224.6.157:22-4.175.71.9:37792.service - OpenSSH per-connection server daemon (4.175.71.9:37792). Apr 16 00:23:49.908411 sshd[1634]: Accepted publickey for core from 4.175.71.9 port 37792 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:23:49.910787 sshd[1634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:23:49.916526 systemd-logind[1451]: New session 3 of user core. Apr 16 00:23:49.933595 systemd[1]: Started session-3.scope - Session 3 of User core. Apr 16 00:23:50.031175 sshd[1634]: pam_unix(sshd:session): session closed for user core Apr 16 00:23:50.036100 systemd[1]: session-3.scope: Deactivated successfully. Apr 16 00:23:50.039538 systemd-logind[1451]: Session 3 logged out. Waiting for processes to exit. Apr 16 00:23:50.039755 systemd[1]: sshd@2-46.224.6.157:22-4.175.71.9:37792.service: Deactivated successfully. Apr 16 00:23:50.042881 systemd-logind[1451]: Removed session 3. Apr 16 00:23:50.066934 systemd[1]: Started sshd@3-46.224.6.157:22-4.175.71.9:37800.service - OpenSSH per-connection server daemon (4.175.71.9:37800). Apr 16 00:23:50.202425 sshd[1641]: Accepted publickey for core from 4.175.71.9 port 37800 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:23:50.204717 sshd[1641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:23:50.211917 systemd-logind[1451]: New session 4 of user core. Apr 16 00:23:50.217769 systemd[1]: Started session-4.scope - Session 4 of User core. Apr 16 00:23:50.323007 sshd[1641]: pam_unix(sshd:session): session closed for user core Apr 16 00:23:50.327302 systemd[1]: sshd@3-46.224.6.157:22-4.175.71.9:37800.service: Deactivated successfully. 
Apr 16 00:23:50.329549 systemd[1]: session-4.scope: Deactivated successfully. Apr 16 00:23:50.330722 systemd-logind[1451]: Session 4 logged out. Waiting for processes to exit. Apr 16 00:23:50.332649 systemd-logind[1451]: Removed session 4. Apr 16 00:23:50.352762 systemd[1]: Started sshd@4-46.224.6.157:22-4.175.71.9:37802.service - OpenSSH per-connection server daemon (4.175.71.9:37802). Apr 16 00:23:50.476548 sshd[1648]: Accepted publickey for core from 4.175.71.9 port 37802 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:23:50.480029 sshd[1648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:23:50.486153 systemd-logind[1451]: New session 5 of user core. Apr 16 00:23:50.493690 systemd[1]: Started session-5.scope - Session 5 of User core. Apr 16 00:23:50.591460 sudo[1651]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Apr 16 00:23:50.592013 sudo[1651]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 16 00:23:50.614196 sudo[1651]: pam_unix(sudo:session): session closed for user root Apr 16 00:23:50.631921 sshd[1648]: pam_unix(sshd:session): session closed for user core Apr 16 00:23:50.637545 systemd[1]: sshd@4-46.224.6.157:22-4.175.71.9:37802.service: Deactivated successfully. Apr 16 00:23:50.640353 systemd[1]: session-5.scope: Deactivated successfully. Apr 16 00:23:50.642454 systemd-logind[1451]: Session 5 logged out. Waiting for processes to exit. Apr 16 00:23:50.644379 systemd-logind[1451]: Removed session 5. Apr 16 00:23:50.667067 systemd[1]: Started sshd@5-46.224.6.157:22-4.175.71.9:37816.service - OpenSSH per-connection server daemon (4.175.71.9:37816). 
Apr 16 00:23:50.791440 sshd[1656]: Accepted publickey for core from 4.175.71.9 port 37816 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:23:50.794350 sshd[1656]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:23:50.802115 systemd-logind[1451]: New session 6 of user core. Apr 16 00:23:50.808650 systemd[1]: Started session-6.scope - Session 6 of User core. Apr 16 00:23:50.896896 sudo[1660]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Apr 16 00:23:50.897203 sudo[1660]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 16 00:23:50.902143 sudo[1660]: pam_unix(sudo:session): session closed for user root Apr 16 00:23:50.908339 sudo[1659]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Apr 16 00:23:50.908697 sudo[1659]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 16 00:23:50.933400 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Apr 16 00:23:50.934963 auditctl[1663]: No rules Apr 16 00:23:50.936628 systemd[1]: audit-rules.service: Deactivated successfully. Apr 16 00:23:50.936913 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Apr 16 00:23:50.944852 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Apr 16 00:23:50.971092 augenrules[1681]: No rules Apr 16 00:23:50.973154 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Apr 16 00:23:50.975540 sudo[1659]: pam_unix(sudo:session): session closed for user root Apr 16 00:23:50.992525 sshd[1656]: pam_unix(sshd:session): session closed for user core Apr 16 00:23:50.998674 systemd[1]: sshd@5-46.224.6.157:22-4.175.71.9:37816.service: Deactivated successfully. Apr 16 00:23:51.003048 systemd[1]: session-6.scope: Deactivated successfully. 
Apr 16 00:23:51.004382 systemd-logind[1451]: Session 6 logged out. Waiting for processes to exit.
Apr 16 00:23:51.024714 systemd[1]: Started sshd@6-46.224.6.157:22-4.175.71.9:37818.service - OpenSSH per-connection server daemon (4.175.71.9:37818).
Apr 16 00:23:51.026794 systemd-logind[1451]: Removed session 6.
Apr 16 00:23:51.138651 sshd[1689]: Accepted publickey for core from 4.175.71.9 port 37818 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:23:51.140012 sshd[1689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:23:51.144990 systemd-logind[1451]: New session 7 of user core.
Apr 16 00:23:51.152560 systemd[1]: Started session-7.scope - Session 7 of User core.
Apr 16 00:23:51.233710 sudo[1692]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Apr 16 00:23:51.234334 sudo[1692]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 00:23:51.537659 systemd[1]: Starting docker.service - Docker Application Container Engine...
Apr 16 00:23:51.540404 (dockerd)[1708]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Apr 16 00:23:51.783337 dockerd[1708]: time="2026-04-16T00:23:51.783030503Z" level=info msg="Starting up"
Apr 16 00:23:51.887281 dockerd[1708]: time="2026-04-16T00:23:51.887226868Z" level=info msg="Loading containers: start."
Apr 16 00:23:51.998327 kernel: Initializing XFRM netlink socket
Apr 16 00:23:52.082864 systemd-networkd[1382]: docker0: Link UP
Apr 16 00:23:52.099095 dockerd[1708]: time="2026-04-16T00:23:52.098922013Z" level=info msg="Loading containers: done."
Apr 16 00:23:52.119295 dockerd[1708]: time="2026-04-16T00:23:52.117466768Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Apr 16 00:23:52.119295 dockerd[1708]: time="2026-04-16T00:23:52.117579827Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Apr 16 00:23:52.119295 dockerd[1708]: time="2026-04-16T00:23:52.117697048Z" level=info msg="Daemon has completed initialization"
Apr 16 00:23:52.118545 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3257473087-merged.mount: Deactivated successfully.
Apr 16 00:23:52.155632 dockerd[1708]: time="2026-04-16T00:23:52.155398999Z" level=info msg="API listen on /run/docker.sock"
Apr 16 00:23:52.160446 systemd[1]: Started docker.service - Docker Application Container Engine.
Apr 16 00:23:52.697729 containerd[1467]: time="2026-04-16T00:23:52.697396981Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.11\""
Apr 16 00:23:53.286488 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3050744748.mount: Deactivated successfully.
Apr 16 00:23:53.853041 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Apr 16 00:23:53.864943 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:23:54.006466 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:23:54.016089 (kubelet)[1914]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 16 00:23:54.068927 kubelet[1914]: E0416 00:23:54.068868 1914 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 16 00:23:54.073726 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 16 00:23:54.073858 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 16 00:23:54.231185 containerd[1467]: time="2026-04-16T00:23:54.230947084Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:54.234107 containerd[1467]: time="2026-04-16T00:23:54.233991162Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.11: active requests=0, bytes read=27008885"
Apr 16 00:23:54.235666 containerd[1467]: time="2026-04-16T00:23:54.235573311Z" level=info msg="ImageCreate event name:\"sha256:51b83c5cb2f791f72696c040be904535bad3c81a6ffc19a55013ac150a24d9b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:54.240662 containerd[1467]: time="2026-04-16T00:23:54.240587891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:18e9f2b6e4d67c24941e14b2d41ec0aa6e5f628e39f2ef2163e176de85bbe39e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:54.242595 containerd[1467]: time="2026-04-16T00:23:54.242515572Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.11\" with image id \"sha256:51b83c5cb2f791f72696c040be904535bad3c81a6ffc19a55013ac150a24d9b0\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:18e9f2b6e4d67c24941e14b2d41ec0aa6e5f628e39f2ef2163e176de85bbe39e\", size \"27005386\" in 1.545070607s"
Apr 16 00:23:54.242595 containerd[1467]: time="2026-04-16T00:23:54.242576603Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.11\" returns image reference \"sha256:51b83c5cb2f791f72696c040be904535bad3c81a6ffc19a55013ac150a24d9b0\""
Apr 16 00:23:54.243413 containerd[1467]: time="2026-04-16T00:23:54.243312650Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.11\""
Apr 16 00:23:55.270796 containerd[1467]: time="2026-04-16T00:23:55.270241277Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:55.272757 containerd[1467]: time="2026-04-16T00:23:55.272697955Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.11: active requests=0, bytes read=23297794"
Apr 16 00:23:55.274877 containerd[1467]: time="2026-04-16T00:23:55.273189586Z" level=info msg="ImageCreate event name:\"sha256:df8bcecad66863646fb4016494163838761da38376bae5a7592e04041db8489a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:55.276636 containerd[1467]: time="2026-04-16T00:23:55.276587948Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7579451c5b3c2715da4a263c5d80a3367a24fdc12e86fde6851674d567d1dfb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:55.277990 containerd[1467]: time="2026-04-16T00:23:55.277951390Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.11\" with image id \"sha256:df8bcecad66863646fb4016494163838761da38376bae5a7592e04041db8489a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7579451c5b3c2715da4a263c5d80a3367a24fdc12e86fde6851674d567d1dfb2\", size \"24804413\" in 1.034599361s"
Apr 16 00:23:55.278110 containerd[1467]: time="2026-04-16T00:23:55.278094377Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.11\" returns image reference \"sha256:df8bcecad66863646fb4016494163838761da38376bae5a7592e04041db8489a\""
Apr 16 00:23:55.279066 containerd[1467]: time="2026-04-16T00:23:55.279040223Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.11\""
Apr 16 00:23:56.178329 containerd[1467]: time="2026-04-16T00:23:56.177668024Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:56.180007 containerd[1467]: time="2026-04-16T00:23:56.179922228Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.11: active requests=0, bytes read=18141378"
Apr 16 00:23:56.182302 containerd[1467]: time="2026-04-16T00:23:56.182230496Z" level=info msg="ImageCreate event name:\"sha256:8c8e25fd00e5c108fb9ab5490c25bfaeb0231b1c59f749dab4f5300f1c49995b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:56.188918 containerd[1467]: time="2026-04-16T00:23:56.188859688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5506f0f94c4d9aeb071664893aabc12166bcb7f775008a6fff02d004e6091d28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:56.191302 containerd[1467]: time="2026-04-16T00:23:56.191239508Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.11\" with image id \"sha256:8c8e25fd00e5c108fb9ab5490c25bfaeb0231b1c59f749dab4f5300f1c49995b\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5506f0f94c4d9aeb071664893aabc12166bcb7f775008a6fff02d004e6091d28\", size \"19648015\" in 911.657043ms"
Apr 16 00:23:56.191460 containerd[1467]: time="2026-04-16T00:23:56.191443319Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.11\" returns image reference \"sha256:8c8e25fd00e5c108fb9ab5490c25bfaeb0231b1c59f749dab4f5300f1c49995b\""
Apr 16 00:23:56.192717 containerd[1467]: time="2026-04-16T00:23:56.192627606Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.11\""
Apr 16 00:23:57.049471 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1552061238.mount: Deactivated successfully.
Apr 16 00:23:57.382187 containerd[1467]: time="2026-04-16T00:23:57.382138788Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:57.383432 containerd[1467]: time="2026-04-16T00:23:57.383390835Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.11: active requests=0, bytes read=28040534"
Apr 16 00:23:57.384091 containerd[1467]: time="2026-04-16T00:23:57.383870077Z" level=info msg="ImageCreate event name:\"sha256:7ce14d6fb1e5134a578d2aaa327fd701273e3d222b9b8d88054dd86b87a7dc36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:57.387296 containerd[1467]: time="2026-04-16T00:23:57.386887548Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8d18637b5c5f58a4ca0163d3cf184e53d4c522963c242860562be7cb25e9303e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:57.387817 containerd[1467]: time="2026-04-16T00:23:57.387645548Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.11\" with image id \"sha256:7ce14d6fb1e5134a578d2aaa327fd701273e3d222b9b8d88054dd86b87a7dc36\", repo tag \"registry.k8s.io/kube-proxy:v1.33.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:8d18637b5c5f58a4ca0163d3cf184e53d4c522963c242860562be7cb25e9303e\", size \"28039527\" in 1.194817773s"
Apr 16 00:23:57.387817 containerd[1467]: time="2026-04-16T00:23:57.387686885Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.11\" returns image reference \"sha256:7ce14d6fb1e5134a578d2aaa327fd701273e3d222b9b8d88054dd86b87a7dc36\""
Apr 16 00:23:57.388293 containerd[1467]: time="2026-04-16T00:23:57.388235756Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Apr 16 00:23:57.966417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4273329988.mount: Deactivated successfully.
Apr 16 00:23:58.721552 containerd[1467]: time="2026-04-16T00:23:58.721466386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:58.723868 containerd[1467]: time="2026-04-16T00:23:58.723797795Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209"
Apr 16 00:23:58.725313 containerd[1467]: time="2026-04-16T00:23:58.724546814Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:58.728612 containerd[1467]: time="2026-04-16T00:23:58.728503231Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:58.730406 containerd[1467]: time="2026-04-16T00:23:58.729863333Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.34109107s"
Apr 16 00:23:58.730406 containerd[1467]: time="2026-04-16T00:23:58.729906150Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Apr 16 00:23:58.731581 containerd[1467]: time="2026-04-16T00:23:58.731293223Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Apr 16 00:23:59.149743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount877739649.mount: Deactivated successfully.
Apr 16 00:23:59.159256 containerd[1467]: time="2026-04-16T00:23:59.158357480Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:59.160801 containerd[1467]: time="2026-04-16T00:23:59.160758747Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Apr 16 00:23:59.162429 containerd[1467]: time="2026-04-16T00:23:59.162344625Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:59.165975 containerd[1467]: time="2026-04-16T00:23:59.165521624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:23:59.166707 containerd[1467]: time="2026-04-16T00:23:59.166666096Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 435.338579ms"
Apr 16 00:23:59.166707 containerd[1467]: time="2026-04-16T00:23:59.166706031Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Apr 16 00:23:59.167191 containerd[1467]: time="2026-04-16T00:23:59.167138434Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Apr 16 00:23:59.652614 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1374493213.mount: Deactivated successfully.
Apr 16 00:24:00.509895 containerd[1467]: time="2026-04-16T00:24:00.509801781Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:24:00.511574 containerd[1467]: time="2026-04-16T00:24:00.511515593Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=21886470"
Apr 16 00:24:00.513076 containerd[1467]: time="2026-04-16T00:24:00.513015570Z" level=info msg="ImageCreate event name:\"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:24:00.518236 containerd[1467]: time="2026-04-16T00:24:00.516724495Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:24:00.518236 containerd[1467]: time="2026-04-16T00:24:00.518075298Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"21882972\" in 1.350885685s"
Apr 16 00:24:00.518236 containerd[1467]: time="2026-04-16T00:24:00.518120754Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\""
Apr 16 00:24:04.103092 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Apr 16 00:24:04.111862 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:24:04.232531 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:24:04.236659 (kubelet)[2085]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 16 00:24:04.280278 kubelet[2085]: E0416 00:24:04.277918 2085 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 16 00:24:04.281070 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 16 00:24:04.281206 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 16 00:24:05.532799 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:24:05.538796 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:24:05.594131 systemd[1]: Reloading requested from client PID 2098 ('systemctl') (unit session-7.scope)...
Apr 16 00:24:05.594338 systemd[1]: Reloading...
Apr 16 00:24:05.734312 zram_generator::config[2142]: No configuration found.
Apr 16 00:24:05.813006 update_engine[1452]: I20260416 00:24:05.811444 1452 update_attempter.cc:509] Updating boot flags...
Apr 16 00:24:05.830015 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:24:05.876298 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (2181)
Apr 16 00:24:05.909029 systemd[1]: Reloading finished in 314 ms.
Apr 16 00:24:05.964348 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (2183)
Apr 16 00:24:06.007180 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:24:06.013890 systemd[1]: kubelet.service: Deactivated successfully.
Apr 16 00:24:06.014145 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:24:06.021805 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:24:06.081696 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (2183)
Apr 16 00:24:06.200553 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:24:06.203176 (kubelet)[2209]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 16 00:24:06.249328 kubelet[2209]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 00:24:06.249328 kubelet[2209]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 00:24:06.249328 kubelet[2209]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 00:24:06.249840 kubelet[2209]: I0416 00:24:06.249381 2209 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 00:24:07.054323 kubelet[2209]: I0416 00:24:07.054190 2209 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Apr 16 00:24:07.054323 kubelet[2209]: I0416 00:24:07.054228 2209 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 00:24:07.054614 kubelet[2209]: I0416 00:24:07.054540 2209 server.go:956] "Client rotation is on, will bootstrap in background"
Apr 16 00:24:07.081803 kubelet[2209]: E0416 00:24:07.081738 2209 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://46.224.6.157:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Apr 16 00:24:07.084659 kubelet[2209]: I0416 00:24:07.084424 2209 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 16 00:24:07.094255 kubelet[2209]: E0416 00:24:07.094180 2209 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Apr 16 00:24:07.094255 kubelet[2209]: I0416 00:24:07.094237 2209 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Apr 16 00:24:07.097704 kubelet[2209]: I0416 00:24:07.097658 2209 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Apr 16 00:24:07.099536 kubelet[2209]: I0416 00:24:07.099452 2209 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 00:24:07.099829 kubelet[2209]: I0416 00:24:07.099537 2209 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-56c15b786d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 00:24:07.099829 kubelet[2209]: I0416 00:24:07.099797 2209 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 00:24:07.099829 kubelet[2209]: I0416 00:24:07.099810 2209 container_manager_linux.go:303] "Creating device plugin manager"
Apr 16 00:24:07.100232 kubelet[2209]: I0416 00:24:07.100188 2209 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 00:24:07.104784 kubelet[2209]: I0416 00:24:07.104710 2209 kubelet.go:480] "Attempting to sync node with API server"
Apr 16 00:24:07.104784 kubelet[2209]: I0416 00:24:07.104753 2209 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 00:24:07.104784 kubelet[2209]: I0416 00:24:07.104791 2209 kubelet.go:386] "Adding apiserver pod source"
Apr 16 00:24:07.107940 kubelet[2209]: I0416 00:24:07.106567 2209 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 00:24:07.113719 kubelet[2209]: E0416 00:24:07.113678 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.224.6.157:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-n-56c15b786d&limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 00:24:07.116570 kubelet[2209]: E0416 00:24:07.116502 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.224.6.157:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 00:24:07.116801 kubelet[2209]: I0416 00:24:07.116785 2209 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Apr 16 00:24:07.117732 kubelet[2209]: I0416 00:24:07.117708 2209 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 00:24:07.117947 kubelet[2209]: W0416 00:24:07.117935 2209 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Apr 16 00:24:07.121127 kubelet[2209]: I0416 00:24:07.121107 2209 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 00:24:07.121301 kubelet[2209]: I0416 00:24:07.121289 2209 server.go:1289] "Started kubelet"
Apr 16 00:24:07.122775 kubelet[2209]: I0416 00:24:07.122748 2209 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 00:24:07.128622 kubelet[2209]: E0416 00:24:07.126820 2209 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.224.6.157:6443/api/v1/namespaces/default/events\": dial tcp 46.224.6.157:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-56c15b786d.18a6ae9276868526 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-56c15b786d,UID:ci-4081-3-6-n-56c15b786d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-56c15b786d,},FirstTimestamp:2026-04-16 00:24:07.121224998 +0000 UTC m=+0.914245623,LastTimestamp:2026-04-16 00:24:07.121224998 +0000 UTC m=+0.914245623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-56c15b786d,}"
Apr 16 00:24:07.129426 kubelet[2209]: I0416 00:24:07.129396 2209 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 00:24:07.130773 kubelet[2209]: I0416 00:24:07.130748 2209 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 00:24:07.135616 kubelet[2209]: I0416 00:24:07.134728 2209 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 00:24:07.135616 kubelet[2209]: I0416 00:24:07.135006 2209 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 00:24:07.135616 kubelet[2209]: I0416 00:24:07.135152 2209 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 00:24:07.135616 kubelet[2209]: I0416 00:24:07.135262 2209 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 16 00:24:07.135616 kubelet[2209]: E0416 00:24:07.135469 2209 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-56c15b786d\" not found"
Apr 16 00:24:07.139772 kubelet[2209]: I0416 00:24:07.139717 2209 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 00:24:07.139906 kubelet[2209]: I0416 00:24:07.139804 2209 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 00:24:07.142575 kubelet[2209]: E0416 00:24:07.142518 2209 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.6.157:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-56c15b786d?timeout=10s\": dial tcp 46.224.6.157:6443: connect: connection refused" interval="200ms"
Apr 16 00:24:07.145771 kubelet[2209]: E0416 00:24:07.145316 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.224.6.157:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 00:24:07.147292 kubelet[2209]: E0416 00:24:07.147173 2209 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Apr 16 00:24:07.147656 kubelet[2209]: I0416 00:24:07.147630 2209 factory.go:223] Registration of the containerd container factory successfully
Apr 16 00:24:07.147755 kubelet[2209]: I0416 00:24:07.147741 2209 factory.go:223] Registration of the systemd container factory successfully
Apr 16 00:24:07.149298 kubelet[2209]: I0416 00:24:07.147944 2209 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 16 00:24:07.157547 kubelet[2209]: I0416 00:24:07.157485 2209 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 00:24:07.158725 kubelet[2209]: I0416 00:24:07.158689 2209 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 00:24:07.158725 kubelet[2209]: I0416 00:24:07.158725 2209 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 00:24:07.158836 kubelet[2209]: I0416 00:24:07.158750 2209 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 00:24:07.158836 kubelet[2209]: I0416 00:24:07.158759 2209 kubelet.go:2436] "Starting kubelet main sync loop"
Apr 16 00:24:07.158836 kubelet[2209]: E0416 00:24:07.158815 2209 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 16 00:24:07.165744 kubelet[2209]: E0416 00:24:07.165706 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.6.157:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 16 00:24:07.166802 kubelet[2209]: I0416 00:24:07.166766 2209 cpu_manager.go:221] "Starting CPU manager" policy="none"
Apr 16 00:24:07.166930 kubelet[2209]: I0416 00:24:07.166916 2209 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Apr 16 00:24:07.167012 kubelet[2209]: I0416 00:24:07.167003 2209 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 00:24:07.170646 kubelet[2209]: I0416 00:24:07.170612 2209 policy_none.go:49] "None policy: Start"
Apr 16 00:24:07.170811 kubelet[2209]: I0416 00:24:07.170795 2209 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 00:24:07.170883 kubelet[2209]: I0416 00:24:07.170873 2209 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 00:24:07.180617 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Apr 16 00:24:07.199898 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Apr 16 00:24:07.204830 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Apr 16 00:24:07.215104 kubelet[2209]: E0416 00:24:07.215022 2209 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 00:24:07.215426 kubelet[2209]: I0416 00:24:07.215391 2209 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 00:24:07.215536 kubelet[2209]: I0416 00:24:07.215423 2209 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 00:24:07.218053 kubelet[2209]: I0416 00:24:07.217356 2209 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 00:24:07.220970 kubelet[2209]: E0416 00:24:07.220900 2209 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 16 00:24:07.221632 kubelet[2209]: E0416 00:24:07.220993 2209 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-56c15b786d\" not found" Apr 16 00:24:07.276682 systemd[1]: Created slice kubepods-burstable-pod76bc5201f781f260bbe53dd859aa756d.slice - libcontainer container kubepods-burstable-pod76bc5201f781f260bbe53dd859aa756d.slice. Apr 16 00:24:07.285857 kubelet[2209]: E0416 00:24:07.285804 2209 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-56c15b786d\" not found" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.291080 systemd[1]: Created slice kubepods-burstable-pod59b2bf97761f216fe0218931684217e1.slice - libcontainer container kubepods-burstable-pod59b2bf97761f216fe0218931684217e1.slice. 
Apr 16 00:24:07.294956 kubelet[2209]: E0416 00:24:07.294448 2209 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-56c15b786d\" not found" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.295970 systemd[1]: Created slice kubepods-burstable-pod64c8c139bb99c8f410339f1b416c1895.slice - libcontainer container kubepods-burstable-pod64c8c139bb99c8f410339f1b416c1895.slice. Apr 16 00:24:07.299548 kubelet[2209]: E0416 00:24:07.299494 2209 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-56c15b786d\" not found" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.318761 kubelet[2209]: I0416 00:24:07.318300 2209 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.320241 kubelet[2209]: E0416 00:24:07.318908 2209 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.6.157:6443/api/v1/nodes\": dial tcp 46.224.6.157:6443: connect: connection refused" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.341833 kubelet[2209]: I0416 00:24:07.341742 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/59b2bf97761f216fe0218931684217e1-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-56c15b786d\" (UID: \"59b2bf97761f216fe0218931684217e1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.341833 kubelet[2209]: I0416 00:24:07.341825 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342311 
kubelet[2209]: I0416 00:24:07.341885 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342311 kubelet[2209]: I0416 00:24:07.341958 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342311 kubelet[2209]: I0416 00:24:07.341997 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342311 kubelet[2209]: I0416 00:24:07.342034 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/76bc5201f781f260bbe53dd859aa756d-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-56c15b786d\" (UID: \"76bc5201f781f260bbe53dd859aa756d\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342311 kubelet[2209]: I0416 00:24:07.342077 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/59b2bf97761f216fe0218931684217e1-ca-certs\") pod 
\"kube-apiserver-ci-4081-3-6-n-56c15b786d\" (UID: \"59b2bf97761f216fe0218931684217e1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342656 kubelet[2209]: I0416 00:24:07.342114 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/59b2bf97761f216fe0218931684217e1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-56c15b786d\" (UID: \"59b2bf97761f216fe0218931684217e1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.342656 kubelet[2209]: I0416 00:24:07.342146 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.344131 kubelet[2209]: E0416 00:24:07.344091 2209 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.6.157:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-56c15b786d?timeout=10s\": dial tcp 46.224.6.157:6443: connect: connection refused" interval="400ms" Apr 16 00:24:07.521314 kubelet[2209]: I0416 00:24:07.521222 2209 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.521752 kubelet[2209]: E0416 00:24:07.521593 2209 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.6.157:6443/api/v1/nodes\": dial tcp 46.224.6.157:6443: connect: connection refused" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.588537 containerd[1467]: time="2026-04-16T00:24:07.588343482Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-56c15b786d,Uid:76bc5201f781f260bbe53dd859aa756d,Namespace:kube-system,Attempt:0,}" Apr 16 00:24:07.596522 containerd[1467]: time="2026-04-16T00:24:07.596062443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-56c15b786d,Uid:59b2bf97761f216fe0218931684217e1,Namespace:kube-system,Attempt:0,}" Apr 16 00:24:07.601816 containerd[1467]: time="2026-04-16T00:24:07.601524002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-56c15b786d,Uid:64c8c139bb99c8f410339f1b416c1895,Namespace:kube-system,Attempt:0,}" Apr 16 00:24:07.745891 kubelet[2209]: E0416 00:24:07.745829 2209 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.6.157:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-56c15b786d?timeout=10s\": dial tcp 46.224.6.157:6443: connect: connection refused" interval="800ms" Apr 16 00:24:07.923758 kubelet[2209]: E0416 00:24:07.923135 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.224.6.157:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-n-56c15b786d&limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 00:24:07.925784 kubelet[2209]: I0416 00:24:07.925692 2209 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:07.926335 kubelet[2209]: E0416 00:24:07.926185 2209 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.6.157:6443/api/v1/nodes\": dial tcp 46.224.6.157:6443: connect: connection refused" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:08.050652 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4238725906.mount: Deactivated successfully. 
Apr 16 00:24:08.059482 containerd[1467]: time="2026-04-16T00:24:08.058309229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 00:24:08.061203 containerd[1467]: time="2026-04-16T00:24:08.060434173Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 00:24:08.062252 containerd[1467]: time="2026-04-16T00:24:08.062219676Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Apr 16 00:24:08.063110 containerd[1467]: time="2026-04-16T00:24:08.063083521Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 16 00:24:08.064246 containerd[1467]: time="2026-04-16T00:24:08.064208027Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 00:24:08.065899 containerd[1467]: time="2026-04-16T00:24:08.065863619Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 16 00:24:08.066118 containerd[1467]: time="2026-04-16T00:24:08.066092274Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 00:24:08.069193 containerd[1467]: time="2026-04-16T00:24:08.069115990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 00:24:08.070443 
containerd[1467]: time="2026-04-16T00:24:08.070404255Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 474.257472ms" Apr 16 00:24:08.073906 containerd[1467]: time="2026-04-16T00:24:08.073643223Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 485.195595ms" Apr 16 00:24:08.090304 containerd[1467]: time="2026-04-16T00:24:08.087920686Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 486.313823ms" Apr 16 00:24:08.210801 containerd[1467]: time="2026-04-16T00:24:08.210073593Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:08.210801 containerd[1467]: time="2026-04-16T00:24:08.210182699Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:08.210801 containerd[1467]: time="2026-04-16T00:24:08.210197462Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:08.210801 containerd[1467]: time="2026-04-16T00:24:08.210329174Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:08.212118 containerd[1467]: time="2026-04-16T00:24:08.211748190Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:08.212118 containerd[1467]: time="2026-04-16T00:24:08.211900866Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:08.212118 containerd[1467]: time="2026-04-16T00:24:08.211917430Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:08.212118 containerd[1467]: time="2026-04-16T00:24:08.212023295Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:08.212762 containerd[1467]: time="2026-04-16T00:24:08.212454077Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:08.212762 containerd[1467]: time="2026-04-16T00:24:08.212507930Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:08.212762 containerd[1467]: time="2026-04-16T00:24:08.212524094Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:08.212762 containerd[1467]: time="2026-04-16T00:24:08.212590789Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:08.216765 kubelet[2209]: E0416 00:24:08.216722 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.6.157:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 16 00:24:08.242929 systemd[1]: Started cri-containerd-cfc54a2fa455005aace672803ccd28d1ec1a48df5e820665e3d20fee0da9b240.scope - libcontainer container cfc54a2fa455005aace672803ccd28d1ec1a48df5e820665e3d20fee0da9b240. Apr 16 00:24:08.251358 systemd[1]: Started cri-containerd-7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2.scope - libcontainer container 7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2. Apr 16 00:24:08.253337 systemd[1]: Started cri-containerd-a009c48b099678ebbc88aac756c63705830c9236fb505f0796435ed09dfa748f.scope - libcontainer container a009c48b099678ebbc88aac756c63705830c9236fb505f0796435ed09dfa748f. 
Apr 16 00:24:08.303471 containerd[1467]: time="2026-04-16T00:24:08.303335773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-56c15b786d,Uid:59b2bf97761f216fe0218931684217e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"cfc54a2fa455005aace672803ccd28d1ec1a48df5e820665e3d20fee0da9b240\"" Apr 16 00:24:08.314503 containerd[1467]: time="2026-04-16T00:24:08.314363867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-56c15b786d,Uid:76bc5201f781f260bbe53dd859aa756d,Namespace:kube-system,Attempt:0,} returns sandbox id \"a009c48b099678ebbc88aac756c63705830c9236fb505f0796435ed09dfa748f\"" Apr 16 00:24:08.315160 containerd[1467]: time="2026-04-16T00:24:08.315125247Z" level=info msg="CreateContainer within sandbox \"cfc54a2fa455005aace672803ccd28d1ec1a48df5e820665e3d20fee0da9b240\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Apr 16 00:24:08.321987 containerd[1467]: time="2026-04-16T00:24:08.321860203Z" level=info msg="CreateContainer within sandbox \"a009c48b099678ebbc88aac756c63705830c9236fb505f0796435ed09dfa748f\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Apr 16 00:24:08.326540 containerd[1467]: time="2026-04-16T00:24:08.326419003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-56c15b786d,Uid:64c8c139bb99c8f410339f1b416c1895,Namespace:kube-system,Attempt:0,} returns sandbox id \"7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2\"" Apr 16 00:24:08.331095 containerd[1467]: time="2026-04-16T00:24:08.330733666Z" level=info msg="CreateContainer within sandbox \"7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Apr 16 00:24:08.331239 kubelet[2209]: E0416 00:24:08.331055 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://46.224.6.157:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 00:24:08.345561 containerd[1467]: time="2026-04-16T00:24:08.345240984Z" level=info msg="CreateContainer within sandbox \"cfc54a2fa455005aace672803ccd28d1ec1a48df5e820665e3d20fee0da9b240\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f27630977eb6b3995620291ae42be0edab2b0485a3a6da838c4509af22d5cf1a\"" Apr 16 00:24:08.347128 containerd[1467]: time="2026-04-16T00:24:08.346848525Z" level=info msg="CreateContainer within sandbox \"a009c48b099678ebbc88aac756c63705830c9236fb505f0796435ed09dfa748f\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7a6eceb3b020bf427330d0521f8c40dd441082c15da8f156257ccfb71033fbd8\"" Apr 16 00:24:08.347720 containerd[1467]: time="2026-04-16T00:24:08.347686283Z" level=info msg="StartContainer for \"f27630977eb6b3995620291ae42be0edab2b0485a3a6da838c4509af22d5cf1a\"" Apr 16 00:24:08.358346 containerd[1467]: time="2026-04-16T00:24:08.358290956Z" level=info msg="CreateContainer within sandbox \"7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943\"" Apr 16 00:24:08.358557 containerd[1467]: time="2026-04-16T00:24:08.358507207Z" level=info msg="StartContainer for \"7a6eceb3b020bf427330d0521f8c40dd441082c15da8f156257ccfb71033fbd8\"" Apr 16 00:24:08.360314 containerd[1467]: time="2026-04-16T00:24:08.360095344Z" level=info msg="StartContainer for \"70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943\"" Apr 16 00:24:08.386413 systemd[1]: Started cri-containerd-f27630977eb6b3995620291ae42be0edab2b0485a3a6da838c4509af22d5cf1a.scope - libcontainer 
container f27630977eb6b3995620291ae42be0edab2b0485a3a6da838c4509af22d5cf1a. Apr 16 00:24:08.396539 systemd[1]: Started cri-containerd-7a6eceb3b020bf427330d0521f8c40dd441082c15da8f156257ccfb71033fbd8.scope - libcontainer container 7a6eceb3b020bf427330d0521f8c40dd441082c15da8f156257ccfb71033fbd8. Apr 16 00:24:08.411586 systemd[1]: Started cri-containerd-70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943.scope - libcontainer container 70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943. Apr 16 00:24:08.455928 containerd[1467]: time="2026-04-16T00:24:08.455717724Z" level=info msg="StartContainer for \"7a6eceb3b020bf427330d0521f8c40dd441082c15da8f156257ccfb71033fbd8\" returns successfully" Apr 16 00:24:08.460939 containerd[1467]: time="2026-04-16T00:24:08.460725670Z" level=info msg="StartContainer for \"f27630977eb6b3995620291ae42be0edab2b0485a3a6da838c4509af22d5cf1a\" returns successfully" Apr 16 00:24:08.485285 containerd[1467]: time="2026-04-16T00:24:08.484687509Z" level=info msg="StartContainer for \"70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943\" returns successfully" Apr 16 00:24:08.492140 kubelet[2209]: E0416 00:24:08.492090 2209 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.224.6.157:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.224.6.157:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 00:24:08.547326 kubelet[2209]: E0416 00:24:08.547045 2209 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.6.157:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-56c15b786d?timeout=10s\": dial tcp 46.224.6.157:6443: connect: connection refused" interval="1.6s" Apr 16 00:24:08.732375 kubelet[2209]: I0416 00:24:08.731925 2209 kubelet_node_status.go:75] "Attempting to register node" 
node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:09.177869 kubelet[2209]: E0416 00:24:09.177598 2209 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-56c15b786d\" not found" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:09.183870 kubelet[2209]: E0416 00:24:09.182965 2209 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-56c15b786d\" not found" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:09.187974 kubelet[2209]: E0416 00:24:09.187672 2209 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-56c15b786d\" not found" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.116360 kubelet[2209]: I0416 00:24:10.116071 2209 apiserver.go:52] "Watching apiserver" Apr 16 00:24:10.131963 kubelet[2209]: I0416 00:24:10.131714 2209 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.137812 kubelet[2209]: I0416 00:24:10.137384 2209 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.140233 kubelet[2209]: I0416 00:24:10.140158 2209 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 00:24:10.157479 kubelet[2209]: E0416 00:24:10.157232 2209 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-56c15b786d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.157479 kubelet[2209]: I0416 00:24:10.157279 2209 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.164382 kubelet[2209]: E0416 00:24:10.164335 2209 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4081-3-6-n-56c15b786d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.164382 kubelet[2209]: I0416 00:24:10.164374 2209 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.167816 kubelet[2209]: E0416 00:24:10.167753 2209 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.189847 kubelet[2209]: I0416 00:24:10.189796 2209 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.190301 kubelet[2209]: I0416 00:24:10.190254 2209 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.196505 kubelet[2209]: E0416 00:24:10.196459 2209 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-56c15b786d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:10.197025 kubelet[2209]: E0416 00:24:10.196671 2209 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-56c15b786d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:11.956471 kubelet[2209]: I0416 00:24:11.956420 2209 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:12.680412 systemd[1]: Reloading requested from client PID 2492 ('systemctl') (unit session-7.scope)... Apr 16 00:24:12.680438 systemd[1]: Reloading... 
Apr 16 00:24:12.800316 zram_generator::config[2541]: No configuration found. Apr 16 00:24:12.891852 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Apr 16 00:24:12.985068 systemd[1]: Reloading finished in 304 ms. Apr 16 00:24:13.032128 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:24:13.049206 systemd[1]: kubelet.service: Deactivated successfully. Apr 16 00:24:13.049754 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:24:13.049853 systemd[1]: kubelet.service: Consumed 1.352s CPU time, 129.7M memory peak, 0B memory swap peak. Apr 16 00:24:13.058912 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:24:13.206576 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:24:13.218528 (kubelet)[2577]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 16 00:24:13.288282 kubelet[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 00:24:13.288282 kubelet[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 00:24:13.288282 kubelet[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 16 00:24:13.288282 kubelet[2577]: I0416 00:24:13.288178 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 00:24:13.300387 kubelet[2577]: I0416 00:24:13.299911 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Apr 16 00:24:13.300387 kubelet[2577]: I0416 00:24:13.299944 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 00:24:13.300595 kubelet[2577]: I0416 00:24:13.300407 2577 server.go:956] "Client rotation is on, will bootstrap in background" Apr 16 00:24:13.301947 kubelet[2577]: I0416 00:24:13.301910 2577 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Apr 16 00:24:13.306843 kubelet[2577]: I0416 00:24:13.306344 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 16 00:24:13.313465 kubelet[2577]: E0416 00:24:13.313428 2577 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Apr 16 00:24:13.313465 kubelet[2577]: I0416 00:24:13.313463 2577 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Apr 16 00:24:13.315999 kubelet[2577]: I0416 00:24:13.315960 2577 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Apr 16 00:24:13.316259 kubelet[2577]: I0416 00:24:13.316173 2577 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 00:24:13.316391 kubelet[2577]: I0416 00:24:13.316198 2577 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-56c15b786d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 00:24:13.316391 kubelet[2577]: I0416 00:24:13.316390 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 
00:24:13.316391 kubelet[2577]: I0416 00:24:13.316399 2577 container_manager_linux.go:303] "Creating device plugin manager" Apr 16 00:24:13.318200 kubelet[2577]: I0416 00:24:13.316447 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 00:24:13.318200 kubelet[2577]: I0416 00:24:13.316615 2577 kubelet.go:480] "Attempting to sync node with API server" Apr 16 00:24:13.318200 kubelet[2577]: I0416 00:24:13.316629 2577 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 00:24:13.318200 kubelet[2577]: I0416 00:24:13.316653 2577 kubelet.go:386] "Adding apiserver pod source" Apr 16 00:24:13.318200 kubelet[2577]: I0416 00:24:13.316666 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 00:24:13.319772 kubelet[2577]: I0416 00:24:13.319734 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Apr 16 00:24:13.321299 kubelet[2577]: I0416 00:24:13.321177 2577 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 00:24:13.328347 kubelet[2577]: I0416 00:24:13.326700 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 00:24:13.328347 kubelet[2577]: I0416 00:24:13.326769 2577 server.go:1289] "Started kubelet" Apr 16 00:24:13.330298 kubelet[2577]: I0416 00:24:13.328639 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 00:24:13.339545 kubelet[2577]: I0416 00:24:13.339259 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 00:24:13.340500 kubelet[2577]: I0416 00:24:13.340461 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 00:24:13.346623 kubelet[2577]: I0416 00:24:13.345979 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 00:24:13.346623 kubelet[2577]: I0416 00:24:13.346199 2577 
server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 00:24:13.346623 kubelet[2577]: I0416 00:24:13.346427 2577 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 16 00:24:13.348135 kubelet[2577]: I0416 00:24:13.348110 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 00:24:13.348393 kubelet[2577]: E0416 00:24:13.348374 2577 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-56c15b786d\" not found" Apr 16 00:24:13.349031 kubelet[2577]: I0416 00:24:13.348997 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 00:24:13.349134 kubelet[2577]: I0416 00:24:13.349118 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 00:24:13.357298 kubelet[2577]: I0416 00:24:13.356146 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 00:24:13.357298 kubelet[2577]: I0416 00:24:13.357103 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 00:24:13.357298 kubelet[2577]: I0416 00:24:13.357136 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 00:24:13.357298 kubelet[2577]: I0416 00:24:13.357158 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 00:24:13.357298 kubelet[2577]: I0416 00:24:13.357164 2577 kubelet.go:2436] "Starting kubelet main sync loop" Apr 16 00:24:13.357298 kubelet[2577]: E0416 00:24:13.357208 2577 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 16 00:24:13.365124 kubelet[2577]: I0416 00:24:13.365075 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 00:24:13.365582 kubelet[2577]: I0416 00:24:13.365542 2577 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 16 00:24:13.378982 kubelet[2577]: I0416 00:24:13.378899 2577 factory.go:223] Registration of the containerd container factory successfully Apr 16 00:24:13.380932 kubelet[2577]: E0416 00:24:13.380481 2577 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Apr 16 00:24:13.432443 kubelet[2577]: I0416 00:24:13.432389 2577 cpu_manager.go:221] "Starting CPU manager" policy="none" Apr 16 00:24:13.432443 kubelet[2577]: I0416 00:24:13.432441 2577 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Apr 16 00:24:13.432594 kubelet[2577]: I0416 00:24:13.432463 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 00:24:13.432620 kubelet[2577]: I0416 00:24:13.432609 2577 state_mem.go:88] "Updated default CPUSet" cpuSet="" Apr 16 00:24:13.432644 kubelet[2577]: I0416 00:24:13.432620 2577 state_mem.go:96] "Updated CPUSet assignments" assignments={} Apr 16 00:24:13.432644 kubelet[2577]: I0416 00:24:13.432637 2577 policy_none.go:49] "None policy: Start" Apr 16 00:24:13.432697 kubelet[2577]: I0416 00:24:13.432646 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 00:24:13.432697 kubelet[2577]: I0416 00:24:13.432654 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 00:24:13.432742 kubelet[2577]: I0416 00:24:13.432732 2577 state_mem.go:75] "Updated machine memory state" Apr 16 00:24:13.437227 kubelet[2577]: E0416 00:24:13.436814 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 00:24:13.437227 kubelet[2577]: I0416 00:24:13.436992 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 00:24:13.437227 kubelet[2577]: I0416 00:24:13.437003 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 00:24:13.437797 kubelet[2577]: I0416 00:24:13.437781 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 00:24:13.439157 kubelet[2577]: E0416 00:24:13.439127 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Apr 16 00:24:13.459131 kubelet[2577]: I0416 00:24:13.459090 2577 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.460564 kubelet[2577]: I0416 00:24:13.460412 2577 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.460564 kubelet[2577]: I0416 00:24:13.460418 2577 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.465440 kubelet[2577]: E0416 00:24:13.465362 2577 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-56c15b786d\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.541690 kubelet[2577]: I0416 00:24:13.540637 2577 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550176 kubelet[2577]: I0416 00:24:13.550014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550176 kubelet[2577]: I0416 00:24:13.550113 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550176 kubelet[2577]: I0416 00:24:13.550172 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/59b2bf97761f216fe0218931684217e1-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-56c15b786d\" (UID: \"59b2bf97761f216fe0218931684217e1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550587 kubelet[2577]: I0416 00:24:13.550215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/59b2bf97761f216fe0218931684217e1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-56c15b786d\" (UID: \"59b2bf97761f216fe0218931684217e1\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550587 kubelet[2577]: I0416 00:24:13.550252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550587 kubelet[2577]: I0416 00:24:13.550303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/76bc5201f781f260bbe53dd859aa756d-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-56c15b786d\" (UID: \"76bc5201f781f260bbe53dd859aa756d\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550587 kubelet[2577]: I0416 00:24:13.550358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/59b2bf97761f216fe0218931684217e1-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-56c15b786d\" (UID: \"59b2bf97761f216fe0218931684217e1\") " 
pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550587 kubelet[2577]: I0416 00:24:13.550392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.550787 kubelet[2577]: I0416 00:24:13.550441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/64c8c139bb99c8f410339f1b416c1895-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-56c15b786d\" (UID: \"64c8c139bb99c8f410339f1b416c1895\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.551847 kubelet[2577]: I0416 00:24:13.551661 2577 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:13.551847 kubelet[2577]: I0416 00:24:13.551743 2577 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:14.318051 kubelet[2577]: I0416 00:24:14.317669 2577 apiserver.go:52] "Watching apiserver" Apr 16 00:24:14.349645 kubelet[2577]: I0416 00:24:14.349600 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 00:24:14.412649 kubelet[2577]: I0416 00:24:14.412540 2577 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:14.423631 kubelet[2577]: E0416 00:24:14.423552 2577 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-56c15b786d\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" Apr 16 00:24:14.445457 kubelet[2577]: I0416 00:24:14.445040 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-56c15b786d" podStartSLOduration=1.444948815 podStartE2EDuration="1.444948815s" podCreationTimestamp="2026-04-16 00:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:24:14.432622679 +0000 UTC m=+1.204722775" watchObservedRunningTime="2026-04-16 00:24:14.444948815 +0000 UTC m=+1.217048951" Apr 16 00:24:14.467526 kubelet[2577]: I0416 00:24:14.467471 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-56c15b786d" podStartSLOduration=1.4674563 podStartE2EDuration="1.4674563s" podCreationTimestamp="2026-04-16 00:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:24:14.446273253 +0000 UTC m=+1.218373389" watchObservedRunningTime="2026-04-16 00:24:14.4674563 +0000 UTC m=+1.239556396" Apr 16 00:24:14.467864 kubelet[2577]: I0416 00:24:14.467739 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-56c15b786d" podStartSLOduration=3.46773159 podStartE2EDuration="3.46773159s" podCreationTimestamp="2026-04-16 00:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:24:14.466070171 +0000 UTC m=+1.238170267" watchObservedRunningTime="2026-04-16 00:24:14.46773159 +0000 UTC m=+1.239831686" Apr 16 00:24:18.370742 kubelet[2577]: I0416 00:24:18.370703 2577 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Apr 16 00:24:18.372375 containerd[1467]: time="2026-04-16T00:24:18.371559050Z" level=info msg="No cni config template is specified, wait for other system components to drop 
the config." Apr 16 00:24:18.373421 kubelet[2577]: I0416 00:24:18.372006 2577 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Apr 16 00:24:19.096337 systemd[1]: Created slice kubepods-besteffort-pod129f661b_5818_4c04_ae0b_6c6c1c2e6aeb.slice - libcontainer container kubepods-besteffort-pod129f661b_5818_4c04_ae0b_6c6c1c2e6aeb.slice. Apr 16 00:24:19.186738 kubelet[2577]: I0416 00:24:19.186552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/129f661b-5818-4c04-ae0b-6c6c1c2e6aeb-kube-proxy\") pod \"kube-proxy-5dvjz\" (UID: \"129f661b-5818-4c04-ae0b-6c6c1c2e6aeb\") " pod="kube-system/kube-proxy-5dvjz" Apr 16 00:24:19.186738 kubelet[2577]: I0416 00:24:19.186602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/129f661b-5818-4c04-ae0b-6c6c1c2e6aeb-xtables-lock\") pod \"kube-proxy-5dvjz\" (UID: \"129f661b-5818-4c04-ae0b-6c6c1c2e6aeb\") " pod="kube-system/kube-proxy-5dvjz" Apr 16 00:24:19.186738 kubelet[2577]: I0416 00:24:19.186629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/129f661b-5818-4c04-ae0b-6c6c1c2e6aeb-lib-modules\") pod \"kube-proxy-5dvjz\" (UID: \"129f661b-5818-4c04-ae0b-6c6c1c2e6aeb\") " pod="kube-system/kube-proxy-5dvjz" Apr 16 00:24:19.186738 kubelet[2577]: I0416 00:24:19.186652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqlh\" (UniqueName: \"kubernetes.io/projected/129f661b-5818-4c04-ae0b-6c6c1c2e6aeb-kube-api-access-6kqlh\") pod \"kube-proxy-5dvjz\" (UID: \"129f661b-5818-4c04-ae0b-6c6c1c2e6aeb\") " pod="kube-system/kube-proxy-5dvjz" Apr 16 00:24:19.302029 kubelet[2577]: E0416 00:24:19.301971 2577 projected.go:289] Couldn't get 
configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 16 00:24:19.302029 kubelet[2577]: E0416 00:24:19.302005 2577 projected.go:194] Error preparing data for projected volume kube-api-access-6kqlh for pod kube-system/kube-proxy-5dvjz: configmap "kube-root-ca.crt" not found Apr 16 00:24:19.302192 kubelet[2577]: E0416 00:24:19.302073 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/129f661b-5818-4c04-ae0b-6c6c1c2e6aeb-kube-api-access-6kqlh podName:129f661b-5818-4c04-ae0b-6c6c1c2e6aeb nodeName:}" failed. No retries permitted until 2026-04-16 00:24:19.802050626 +0000 UTC m=+6.574150722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6kqlh" (UniqueName: "kubernetes.io/projected/129f661b-5818-4c04-ae0b-6c6c1c2e6aeb-kube-api-access-6kqlh") pod "kube-proxy-5dvjz" (UID: "129f661b-5818-4c04-ae0b-6c6c1c2e6aeb") : configmap "kube-root-ca.crt" not found Apr 16 00:24:19.606530 systemd[1]: Created slice kubepods-besteffort-pod3ad4b044_b837_4c45_a81c_80fed92fee81.slice - libcontainer container kubepods-besteffort-pod3ad4b044_b837_4c45_a81c_80fed92fee81.slice. 
Apr 16 00:24:19.690907 kubelet[2577]: I0416 00:24:19.690824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3ad4b044-b837-4c45-a81c-80fed92fee81-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-hp75m\" (UID: \"3ad4b044-b837-4c45-a81c-80fed92fee81\") " pod="tigera-operator/tigera-operator-6bf85f8dd-hp75m" Apr 16 00:24:19.690907 kubelet[2577]: I0416 00:24:19.690906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2td4\" (UniqueName: \"kubernetes.io/projected/3ad4b044-b837-4c45-a81c-80fed92fee81-kube-api-access-d2td4\") pod \"tigera-operator-6bf85f8dd-hp75m\" (UID: \"3ad4b044-b837-4c45-a81c-80fed92fee81\") " pod="tigera-operator/tigera-operator-6bf85f8dd-hp75m" Apr 16 00:24:19.912789 containerd[1467]: time="2026-04-16T00:24:19.912610425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-hp75m,Uid:3ad4b044-b837-4c45-a81c-80fed92fee81,Namespace:tigera-operator,Attempt:0,}" Apr 16 00:24:19.942899 containerd[1467]: time="2026-04-16T00:24:19.942516281Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:19.942899 containerd[1467]: time="2026-04-16T00:24:19.942588292Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:19.942899 containerd[1467]: time="2026-04-16T00:24:19.942603894Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:19.942899 containerd[1467]: time="2026-04-16T00:24:19.942686386Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:19.964508 systemd[1]: Started cri-containerd-4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab.scope - libcontainer container 4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab. Apr 16 00:24:19.993754 containerd[1467]: time="2026-04-16T00:24:19.993696772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-hp75m,Uid:3ad4b044-b837-4c45-a81c-80fed92fee81,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab\"" Apr 16 00:24:19.996326 containerd[1467]: time="2026-04-16T00:24:19.996152971Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Apr 16 00:24:20.009222 containerd[1467]: time="2026-04-16T00:24:20.008632992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5dvjz,Uid:129f661b-5818-4c04-ae0b-6c6c1c2e6aeb,Namespace:kube-system,Attempt:0,}" Apr 16 00:24:20.034712 containerd[1467]: time="2026-04-16T00:24:20.034198834Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:20.034712 containerd[1467]: time="2026-04-16T00:24:20.034308089Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:20.034712 containerd[1467]: time="2026-04-16T00:24:20.034334493Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:20.034712 containerd[1467]: time="2026-04-16T00:24:20.034467112Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:20.054469 systemd[1]: Started cri-containerd-f4a17d127a317e89dabe4c0d1629045e5941709ed118038a592a2bf2c4b4a223.scope - libcontainer container f4a17d127a317e89dabe4c0d1629045e5941709ed118038a592a2bf2c4b4a223. Apr 16 00:24:20.079169 containerd[1467]: time="2026-04-16T00:24:20.079073596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5dvjz,Uid:129f661b-5818-4c04-ae0b-6c6c1c2e6aeb,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4a17d127a317e89dabe4c0d1629045e5941709ed118038a592a2bf2c4b4a223\"" Apr 16 00:24:20.085971 containerd[1467]: time="2026-04-16T00:24:20.085932722Z" level=info msg="CreateContainer within sandbox \"f4a17d127a317e89dabe4c0d1629045e5941709ed118038a592a2bf2c4b4a223\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Apr 16 00:24:20.102330 containerd[1467]: time="2026-04-16T00:24:20.102233378Z" level=info msg="CreateContainer within sandbox \"f4a17d127a317e89dabe4c0d1629045e5941709ed118038a592a2bf2c4b4a223\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b8328547e1078eedb456e669a350f3dbac839bdfadf2d0da4d9dcfb8f956db71\"" Apr 16 00:24:20.105337 containerd[1467]: time="2026-04-16T00:24:20.103536962Z" level=info msg="StartContainer for \"b8328547e1078eedb456e669a350f3dbac839bdfadf2d0da4d9dcfb8f956db71\"" Apr 16 00:24:20.131495 systemd[1]: Started cri-containerd-b8328547e1078eedb456e669a350f3dbac839bdfadf2d0da4d9dcfb8f956db71.scope - libcontainer container b8328547e1078eedb456e669a350f3dbac839bdfadf2d0da4d9dcfb8f956db71. 
Apr 16 00:24:20.164370 containerd[1467]: time="2026-04-16T00:24:20.163951953Z" level=info msg="StartContainer for \"b8328547e1078eedb456e669a350f3dbac839bdfadf2d0da4d9dcfb8f956db71\" returns successfully" Apr 16 00:24:20.456683 kubelet[2577]: I0416 00:24:20.456499 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5dvjz" podStartSLOduration=1.4564793630000001 podStartE2EDuration="1.456479363s" podCreationTimestamp="2026-04-16 00:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:24:20.441559421 +0000 UTC m=+7.213659557" watchObservedRunningTime="2026-04-16 00:24:20.456479363 +0000 UTC m=+7.228579459" Apr 16 00:24:21.783163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount489220546.mount: Deactivated successfully. Apr 16 00:24:22.243530 containerd[1467]: time="2026-04-16T00:24:22.243465796Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:22.244916 containerd[1467]: time="2026-04-16T00:24:22.244854338Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Apr 16 00:24:22.245992 containerd[1467]: time="2026-04-16T00:24:22.245917917Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:22.251483 containerd[1467]: time="2026-04-16T00:24:22.251407156Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:22.252567 containerd[1467]: time="2026-04-16T00:24:22.252530063Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id 
\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.256339486s" Apr 16 00:24:22.252645 containerd[1467]: time="2026-04-16T00:24:22.252572028Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Apr 16 00:24:22.258297 containerd[1467]: time="2026-04-16T00:24:22.258222928Z" level=info msg="CreateContainer within sandbox \"4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Apr 16 00:24:22.274717 containerd[1467]: time="2026-04-16T00:24:22.274666361Z" level=info msg="CreateContainer within sandbox \"4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c\"" Apr 16 00:24:22.275593 containerd[1467]: time="2026-04-16T00:24:22.275557998Z" level=info msg="StartContainer for \"01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c\"" Apr 16 00:24:22.312509 systemd[1]: Started cri-containerd-01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c.scope - libcontainer container 01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c. 
Apr 16 00:24:22.342384 containerd[1467]: time="2026-04-16T00:24:22.342326539Z" level=info msg="StartContainer for \"01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c\" returns successfully" Apr 16 00:24:26.276126 kubelet[2577]: I0416 00:24:26.276010 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-hp75m" podStartSLOduration=5.017446712 podStartE2EDuration="7.275912129s" podCreationTimestamp="2026-04-16 00:24:19 +0000 UTC" firstStartedPulling="2026-04-16 00:24:19.99532097 +0000 UTC m=+6.767421066" lastFinishedPulling="2026-04-16 00:24:22.253786387 +0000 UTC m=+9.025886483" observedRunningTime="2026-04-16 00:24:22.448441712 +0000 UTC m=+9.220541808" watchObservedRunningTime="2026-04-16 00:24:26.275912129 +0000 UTC m=+13.048012225" Apr 16 00:24:28.498630 sudo[1692]: pam_unix(sudo:session): session closed for user root Apr 16 00:24:28.516993 sshd[1689]: pam_unix(sshd:session): session closed for user core Apr 16 00:24:28.524610 systemd[1]: sshd@6-46.224.6.157:22-4.175.71.9:37818.service: Deactivated successfully. Apr 16 00:24:28.530155 systemd[1]: session-7.scope: Deactivated successfully. Apr 16 00:24:28.532423 systemd[1]: session-7.scope: Consumed 7.396s CPU time, 151.6M memory peak, 0B memory swap peak. Apr 16 00:24:28.533370 systemd-logind[1451]: Session 7 logged out. Waiting for processes to exit. Apr 16 00:24:28.534441 systemd-logind[1451]: Removed session 7. Apr 16 00:24:32.744948 systemd[1]: Created slice kubepods-besteffort-pod9c73994c_6429_4fc2_9f28_1b7d22c27b19.slice - libcontainer container kubepods-besteffort-pod9c73994c_6429_4fc2_9f28_1b7d22c27b19.slice. 
Apr 16 00:24:32.781855 kubelet[2577]: I0416 00:24:32.781699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4qdd\" (UniqueName: \"kubernetes.io/projected/9c73994c-6429-4fc2-9f28-1b7d22c27b19-kube-api-access-x4qdd\") pod \"calico-typha-9cd95957f-gdd7n\" (UID: \"9c73994c-6429-4fc2-9f28-1b7d22c27b19\") " pod="calico-system/calico-typha-9cd95957f-gdd7n" Apr 16 00:24:32.781855 kubelet[2577]: I0416 00:24:32.781753 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9c73994c-6429-4fc2-9f28-1b7d22c27b19-typha-certs\") pod \"calico-typha-9cd95957f-gdd7n\" (UID: \"9c73994c-6429-4fc2-9f28-1b7d22c27b19\") " pod="calico-system/calico-typha-9cd95957f-gdd7n" Apr 16 00:24:32.781855 kubelet[2577]: I0416 00:24:32.781773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c73994c-6429-4fc2-9f28-1b7d22c27b19-tigera-ca-bundle\") pod \"calico-typha-9cd95957f-gdd7n\" (UID: \"9c73994c-6429-4fc2-9f28-1b7d22c27b19\") " pod="calico-system/calico-typha-9cd95957f-gdd7n" Apr 16 00:24:32.875543 systemd[1]: Created slice kubepods-besteffort-podd7a69996_b184_4d62_a4bf_fa3304b2b8a5.slice - libcontainer container kubepods-besteffort-podd7a69996_b184_4d62_a4bf_fa3304b2b8a5.slice. 
Apr 16 00:24:32.883548 kubelet[2577]: I0416 00:24:32.882741 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-nodeproc\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883548 kubelet[2577]: I0416 00:24:32.882777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-var-lib-calico\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883548 kubelet[2577]: I0416 00:24:32.882793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-cni-net-dir\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883548 kubelet[2577]: I0416 00:24:32.882807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-lib-modules\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883548 kubelet[2577]: I0416 00:24:32.882822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-cni-bin-dir\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883776 kubelet[2577]: I0416 00:24:32.882837 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-cni-log-dir\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883776 kubelet[2577]: I0416 00:24:32.882865 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-node-certs\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883776 kubelet[2577]: I0416 00:24:32.882878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-xtables-lock\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883776 kubelet[2577]: I0416 00:24:32.882896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-flexvol-driver-host\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883776 kubelet[2577]: I0416 00:24:32.882912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-tigera-ca-bundle\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883890 kubelet[2577]: I0416 00:24:32.882929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pl2w5\" (UniqueName: \"kubernetes.io/projected/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-kube-api-access-pl2w5\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883890 kubelet[2577]: I0416 00:24:32.882956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-bpffs\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883890 kubelet[2577]: I0416 00:24:32.882970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-var-run-calico\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883890 kubelet[2577]: I0416 00:24:32.882998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-policysync\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.883890 kubelet[2577]: I0416 00:24:32.883017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7a69996-b184-4d62-a4bf-fa3304b2b8a5-sys-fs\") pod \"calico-node-cfjdx\" (UID: \"d7a69996-b184-4d62-a4bf-fa3304b2b8a5\") " pod="calico-system/calico-node-cfjdx" Apr 16 00:24:32.983591 kubelet[2577]: E0416 00:24:32.983539 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:32.989585 kubelet[2577]: E0416 00:24:32.989538 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:32.989585 kubelet[2577]: W0416 00:24:32.989568 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:32.989585 kubelet[2577]: E0416 00:24:32.989591 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:32.993642 kubelet[2577]: E0416 00:24:32.993305 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:32.993642 kubelet[2577]: W0416 00:24:32.993336 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:32.993642 kubelet[2577]: E0416 00:24:32.993358 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.022768 kubelet[2577]: E0416 00:24:33.022637 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.022768 kubelet[2577]: W0416 00:24:33.022664 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.022768 kubelet[2577]: E0416 00:24:33.022685 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.052554 containerd[1467]: time="2026-04-16T00:24:33.052466827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9cd95957f-gdd7n,Uid:9c73994c-6429-4fc2-9f28-1b7d22c27b19,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:33.072865 kubelet[2577]: E0416 00:24:33.072657 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.072865 kubelet[2577]: W0416 00:24:33.072682 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.072865 kubelet[2577]: E0416 00:24:33.072703 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.072865 kubelet[2577]: E0416 00:24:33.072898 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.073497 kubelet[2577]: W0416 00:24:33.072910 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.073497 kubelet[2577]: E0416 00:24:33.073042 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.073497 kubelet[2577]: E0416 00:24:33.073223 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.073497 kubelet[2577]: W0416 00:24:33.073232 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.073497 kubelet[2577]: E0416 00:24:33.073241 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.074794 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.076195 kubelet[2577]: W0416 00:24:33.074813 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.074828 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.075076 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.076195 kubelet[2577]: W0416 00:24:33.075086 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.075114 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.075258 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.076195 kubelet[2577]: W0416 00:24:33.075282 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.075325 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.076195 kubelet[2577]: E0416 00:24:33.075569 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.077423 kubelet[2577]: W0416 00:24:33.075579 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.077423 kubelet[2577]: E0416 00:24:33.075590 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.077423 kubelet[2577]: E0416 00:24:33.075730 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.077423 kubelet[2577]: W0416 00:24:33.075737 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.077423 kubelet[2577]: E0416 00:24:33.075744 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.077423 kubelet[2577]: E0416 00:24:33.075875 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.077423 kubelet[2577]: W0416 00:24:33.075882 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.077423 kubelet[2577]: E0416 00:24:33.075889 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.077423 kubelet[2577]: E0416 00:24:33.076005 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.077423 kubelet[2577]: W0416 00:24:33.076015 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076022 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076330 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080324 kubelet[2577]: W0416 00:24:33.076343 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076353 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076622 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080324 kubelet[2577]: W0416 00:24:33.076632 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076643 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076847 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080324 kubelet[2577]: W0416 00:24:33.076888 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080324 kubelet[2577]: E0416 00:24:33.076897 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.077065 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080561 kubelet[2577]: W0416 00:24:33.077074 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.077089 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.077405 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080561 kubelet[2577]: W0416 00:24:33.077417 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.077428 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.077667 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080561 kubelet[2577]: W0416 00:24:33.077677 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.077687 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.080561 kubelet[2577]: E0416 00:24:33.078000 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080780 kubelet[2577]: W0416 00:24:33.078041 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080780 kubelet[2577]: E0416 00:24:33.078061 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.080780 kubelet[2577]: E0416 00:24:33.080252 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.080780 kubelet[2577]: W0416 00:24:33.080586 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.080780 kubelet[2577]: E0416 00:24:33.080608 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.081678 kubelet[2577]: E0416 00:24:33.081655 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.081678 kubelet[2577]: W0416 00:24:33.081672 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.081899 kubelet[2577]: E0416 00:24:33.081686 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.082293 kubelet[2577]: E0416 00:24:33.082255 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.082606 kubelet[2577]: W0416 00:24:33.082574 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.082606 kubelet[2577]: E0416 00:24:33.082599 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.084599 kubelet[2577]: E0416 00:24:33.084538 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.084599 kubelet[2577]: W0416 00:24:33.084555 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.084599 kubelet[2577]: E0416 00:24:33.084568 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.085005 kubelet[2577]: I0416 00:24:33.084867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec-kubelet-dir\") pod \"csi-node-driver-ptvrb\" (UID: \"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec\") " pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:33.086035 kubelet[2577]: E0416 00:24:33.086010 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.086035 kubelet[2577]: W0416 00:24:33.086034 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.086975 kubelet[2577]: E0416 00:24:33.086049 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.087424 kubelet[2577]: E0416 00:24:33.087400 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.087424 kubelet[2577]: W0416 00:24:33.087423 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.087525 kubelet[2577]: E0416 00:24:33.087442 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.088380 kubelet[2577]: E0416 00:24:33.088357 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.088380 kubelet[2577]: W0416 00:24:33.088380 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.088486 kubelet[2577]: E0416 00:24:33.088401 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.088486 kubelet[2577]: I0416 00:24:33.088433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec-socket-dir\") pod \"csi-node-driver-ptvrb\" (UID: \"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec\") " pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:33.088677 kubelet[2577]: E0416 00:24:33.088664 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.088722 kubelet[2577]: W0416 00:24:33.088678 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.088722 kubelet[2577]: E0416 00:24:33.088694 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.088722 kubelet[2577]: I0416 00:24:33.088720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec-varrun\") pod \"csi-node-driver-ptvrb\" (UID: \"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec\") " pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:33.088986 kubelet[2577]: E0416 00:24:33.088970 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.088986 kubelet[2577]: W0416 00:24:33.088985 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.089063 kubelet[2577]: E0416 00:24:33.088996 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.089063 kubelet[2577]: I0416 00:24:33.089018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjxm\" (UniqueName: \"kubernetes.io/projected/2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec-kube-api-access-vrjxm\") pod \"csi-node-driver-ptvrb\" (UID: \"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec\") " pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:33.090464 kubelet[2577]: E0416 00:24:33.090425 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.090464 kubelet[2577]: W0416 00:24:33.090453 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.090464 kubelet[2577]: E0416 00:24:33.090468 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.090582 kubelet[2577]: I0416 00:24:33.090494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec-registration-dir\") pod \"csi-node-driver-ptvrb\" (UID: \"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec\") " pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:33.090822 kubelet[2577]: E0416 00:24:33.090783 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.090822 kubelet[2577]: W0416 00:24:33.090799 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.090822 kubelet[2577]: E0416 00:24:33.090811 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.091044 kubelet[2577]: E0416 00:24:33.091026 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.091044 kubelet[2577]: W0416 00:24:33.091036 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.091044 kubelet[2577]: E0416 00:24:33.091045 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.091356 kubelet[2577]: E0416 00:24:33.091321 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.091356 kubelet[2577]: W0416 00:24:33.091337 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.091356 kubelet[2577]: E0416 00:24:33.091348 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.091538 kubelet[2577]: E0416 00:24:33.091527 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.091538 kubelet[2577]: W0416 00:24:33.091538 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.091599 kubelet[2577]: E0416 00:24:33.091548 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.092746 kubelet[2577]: E0416 00:24:33.092648 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.092746 kubelet[2577]: W0416 00:24:33.092672 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.092746 kubelet[2577]: E0416 00:24:33.092684 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.093730 kubelet[2577]: E0416 00:24:33.093150 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.093730 kubelet[2577]: W0416 00:24:33.093168 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.093730 kubelet[2577]: E0416 00:24:33.093182 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.094390 kubelet[2577]: E0416 00:24:33.094365 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.094390 kubelet[2577]: W0416 00:24:33.094384 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.094390 kubelet[2577]: E0416 00:24:33.094397 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.095829 kubelet[2577]: E0416 00:24:33.095798 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.095829 kubelet[2577]: W0416 00:24:33.095815 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.095829 kubelet[2577]: E0416 00:24:33.095840 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.099472 containerd[1467]: time="2026-04-16T00:24:33.099261875Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:33.099472 containerd[1467]: time="2026-04-16T00:24:33.099435011Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:33.099656 containerd[1467]: time="2026-04-16T00:24:33.099559303Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:33.100418 containerd[1467]: time="2026-04-16T00:24:33.100372420Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:33.122554 systemd[1]: Started cri-containerd-38a17641d8d8d1adb7332662746f39ca4e13d9e7ef055e57d11b88e8564a8d4d.scope - libcontainer container 38a17641d8d8d1adb7332662746f39ca4e13d9e7ef055e57d11b88e8564a8d4d. Apr 16 00:24:33.163260 containerd[1467]: time="2026-04-16T00:24:33.163214540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9cd95957f-gdd7n,Uid:9c73994c-6429-4fc2-9f28-1b7d22c27b19,Namespace:calico-system,Attempt:0,} returns sandbox id \"38a17641d8d8d1adb7332662746f39ca4e13d9e7ef055e57d11b88e8564a8d4d\"" Apr 16 00:24:33.166347 containerd[1467]: time="2026-04-16T00:24:33.166054727Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 16 00:24:33.181547 containerd[1467]: time="2026-04-16T00:24:33.181487621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cfjdx,Uid:d7a69996-b184-4d62-a4bf-fa3304b2b8a5,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:33.192564 kubelet[2577]: E0416 00:24:33.192442 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.192564 kubelet[2577]: W0416 00:24:33.192483 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.192564 kubelet[2577]: E0416 00:24:33.192515 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.193422 kubelet[2577]: E0416 00:24:33.193392 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.194889 kubelet[2577]: W0416 00:24:33.193425 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.194889 kubelet[2577]: E0416 00:24:33.193451 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.195203 kubelet[2577]: E0416 00:24:33.195182 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.195260 kubelet[2577]: W0416 00:24:33.195208 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.195260 kubelet[2577]: E0416 00:24:33.195222 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.195491 kubelet[2577]: E0416 00:24:33.195476 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.195491 kubelet[2577]: W0416 00:24:33.195488 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.195554 kubelet[2577]: E0416 00:24:33.195499 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.195797 kubelet[2577]: E0416 00:24:33.195751 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.195797 kubelet[2577]: W0416 00:24:33.195763 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.195872 kubelet[2577]: E0416 00:24:33.195774 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.196035 kubelet[2577]: E0416 00:24:33.196017 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.196086 kubelet[2577]: W0416 00:24:33.196040 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.196086 kubelet[2577]: E0416 00:24:33.196052 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.196438 kubelet[2577]: E0416 00:24:33.196261 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.196438 kubelet[2577]: W0416 00:24:33.196309 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.196438 kubelet[2577]: E0416 00:24:33.196322 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.196636 kubelet[2577]: E0416 00:24:33.196540 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.196636 kubelet[2577]: W0416 00:24:33.196550 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.196636 kubelet[2577]: E0416 00:24:33.196558 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.197974 kubelet[2577]: E0416 00:24:33.197837 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.197974 kubelet[2577]: W0416 00:24:33.197856 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.197974 kubelet[2577]: E0416 00:24:33.197869 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.198244 kubelet[2577]: E0416 00:24:33.198203 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.198244 kubelet[2577]: W0416 00:24:33.198216 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.198244 kubelet[2577]: E0416 00:24:33.198227 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.199208 kubelet[2577]: E0416 00:24:33.199187 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.199208 kubelet[2577]: W0416 00:24:33.199205 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.199741 kubelet[2577]: E0416 00:24:33.199217 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.199741 kubelet[2577]: E0416 00:24:33.199448 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.199741 kubelet[2577]: W0416 00:24:33.199457 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.199741 kubelet[2577]: E0416 00:24:33.199466 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.200048 kubelet[2577]: E0416 00:24:33.200017 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.200048 kubelet[2577]: W0416 00:24:33.200045 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.200175 kubelet[2577]: E0416 00:24:33.200058 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.200694 kubelet[2577]: E0416 00:24:33.200671 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.200694 kubelet[2577]: W0416 00:24:33.200689 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.200811 kubelet[2577]: E0416 00:24:33.200701 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.201672 kubelet[2577]: E0416 00:24:33.201390 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.201672 kubelet[2577]: W0416 00:24:33.201409 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.201672 kubelet[2577]: E0416 00:24:33.201421 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.201672 kubelet[2577]: E0416 00:24:33.201619 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.201672 kubelet[2577]: W0416 00:24:33.201628 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.201672 kubelet[2577]: E0416 00:24:33.201637 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.201961 kubelet[2577]: E0416 00:24:33.201822 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.201961 kubelet[2577]: W0416 00:24:33.201832 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.201961 kubelet[2577]: E0416 00:24:33.201841 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.202571 kubelet[2577]: E0416 00:24:33.202195 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.202571 kubelet[2577]: W0416 00:24:33.202214 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.202571 kubelet[2577]: E0416 00:24:33.202226 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.203366 kubelet[2577]: E0416 00:24:33.203322 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.203366 kubelet[2577]: W0416 00:24:33.203343 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.203366 kubelet[2577]: E0416 00:24:33.203357 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.204188 kubelet[2577]: E0416 00:24:33.203656 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.204188 kubelet[2577]: W0416 00:24:33.203673 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.204188 kubelet[2577]: E0416 00:24:33.203689 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.204499 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.207296 kubelet[2577]: W0416 00:24:33.204527 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.204540 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.204872 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.207296 kubelet[2577]: W0416 00:24:33.204883 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.204911 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.205144 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.207296 kubelet[2577]: W0416 00:24:33.205158 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.205170 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.207296 kubelet[2577]: E0416 00:24:33.205500 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.207601 kubelet[2577]: W0416 00:24:33.205512 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.207601 kubelet[2577]: E0416 00:24:33.205544 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.207601 kubelet[2577]: E0416 00:24:33.206025 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.207601 kubelet[2577]: W0416 00:24:33.206039 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.207601 kubelet[2577]: E0416 00:24:33.206052 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:33.221213 kubelet[2577]: E0416 00:24:33.221182 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:33.221213 kubelet[2577]: W0416 00:24:33.221207 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:33.221400 kubelet[2577]: E0416 00:24:33.221227 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:33.223084 containerd[1467]: time="2026-04-16T00:24:33.222726346Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:33.223084 containerd[1467]: time="2026-04-16T00:24:33.222792392Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:33.223084 containerd[1467]: time="2026-04-16T00:24:33.222807874Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:33.223084 containerd[1467]: time="2026-04-16T00:24:33.222898202Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:33.245580 systemd[1]: Started cri-containerd-d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e.scope - libcontainer container d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e. 
Apr 16 00:24:33.271316 containerd[1467]: time="2026-04-16T00:24:33.271241757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cfjdx,Uid:d7a69996-b184-4d62-a4bf-fa3304b2b8a5,Namespace:calico-system,Attempt:0,} returns sandbox id \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\"" Apr 16 00:24:34.652820 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2636162126.mount: Deactivated successfully. Apr 16 00:24:35.120375 containerd[1467]: time="2026-04-16T00:24:35.120319301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:35.122358 containerd[1467]: time="2026-04-16T00:24:35.122292758Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Apr 16 00:24:35.123751 containerd[1467]: time="2026-04-16T00:24:35.123697284Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:35.126699 containerd[1467]: time="2026-04-16T00:24:35.126344522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:35.127278 containerd[1467]: time="2026-04-16T00:24:35.127222361Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.96112679s" Apr 16 00:24:35.127356 containerd[1467]: time="2026-04-16T00:24:35.127317370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image 
reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Apr 16 00:24:35.129606 containerd[1467]: time="2026-04-16T00:24:35.129555371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Apr 16 00:24:35.145701 containerd[1467]: time="2026-04-16T00:24:35.145616815Z" level=info msg="CreateContainer within sandbox \"38a17641d8d8d1adb7332662746f39ca4e13d9e7ef055e57d11b88e8564a8d4d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Apr 16 00:24:35.167169 containerd[1467]: time="2026-04-16T00:24:35.166884847Z" level=info msg="CreateContainer within sandbox \"38a17641d8d8d1adb7332662746f39ca4e13d9e7ef055e57d11b88e8564a8d4d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"db47d436534364adb3637751dbebbff245d56e816721f6bd71fe4baaedb0446d\"" Apr 16 00:24:35.168754 containerd[1467]: time="2026-04-16T00:24:35.168696890Z" level=info msg="StartContainer for \"db47d436534364adb3637751dbebbff245d56e816721f6bd71fe4baaedb0446d\"" Apr 16 00:24:35.199528 systemd[1]: Started cri-containerd-db47d436534364adb3637751dbebbff245d56e816721f6bd71fe4baaedb0446d.scope - libcontainer container db47d436534364adb3637751dbebbff245d56e816721f6bd71fe4baaedb0446d. 
Apr 16 00:24:35.238179 containerd[1467]: time="2026-04-16T00:24:35.238052326Z" level=info msg="StartContainer for \"db47d436534364adb3637751dbebbff245d56e816721f6bd71fe4baaedb0446d\" returns successfully" Apr 16 00:24:35.359589 kubelet[2577]: E0416 00:24:35.359540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:35.496708 kubelet[2577]: E0416 00:24:35.496582 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.496708 kubelet[2577]: W0416 00:24:35.496618 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.496708 kubelet[2577]: E0416 00:24:35.496641 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.496878 kubelet[2577]: E0416 00:24:35.496824 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.496901 kubelet[2577]: W0416 00:24:35.496833 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.496901 kubelet[2577]: E0416 00:24:35.496887 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.497710 kubelet[2577]: E0416 00:24:35.497658 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.497710 kubelet[2577]: W0416 00:24:35.497701 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.497710 kubelet[2577]: E0416 00:24:35.497716 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.498562 kubelet[2577]: E0416 00:24:35.498540 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.498562 kubelet[2577]: W0416 00:24:35.498556 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.498562 kubelet[2577]: E0416 00:24:35.498569 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.498768 kubelet[2577]: E0416 00:24:35.498748 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.498768 kubelet[2577]: W0416 00:24:35.498767 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.498828 kubelet[2577]: E0416 00:24:35.498777 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.498929 kubelet[2577]: E0416 00:24:35.498918 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.498929 kubelet[2577]: W0416 00:24:35.498928 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.499028 kubelet[2577]: E0416 00:24:35.498939 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.499101 kubelet[2577]: E0416 00:24:35.499090 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.499101 kubelet[2577]: W0416 00:24:35.499100 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.499160 kubelet[2577]: E0416 00:24:35.499108 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.499294 kubelet[2577]: E0416 00:24:35.499260 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.499339 kubelet[2577]: W0416 00:24:35.499299 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.499339 kubelet[2577]: E0416 00:24:35.499309 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.499488 kubelet[2577]: E0416 00:24:35.499475 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.499488 kubelet[2577]: W0416 00:24:35.499488 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.499551 kubelet[2577]: E0416 00:24:35.499497 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.499655 kubelet[2577]: E0416 00:24:35.499643 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.499655 kubelet[2577]: W0416 00:24:35.499653 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.499723 kubelet[2577]: E0416 00:24:35.499662 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.499917 kubelet[2577]: E0416 00:24:35.499894 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.499917 kubelet[2577]: W0416 00:24:35.499914 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.500931 kubelet[2577]: E0416 00:24:35.500350 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.500931 kubelet[2577]: E0416 00:24:35.500629 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.500931 kubelet[2577]: W0416 00:24:35.500639 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.500931 kubelet[2577]: E0416 00:24:35.500649 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.500931 kubelet[2577]: E0416 00:24:35.500832 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.500931 kubelet[2577]: W0416 00:24:35.500841 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.500931 kubelet[2577]: E0416 00:24:35.500849 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.502430 kubelet[2577]: E0416 00:24:35.502393 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.502430 kubelet[2577]: W0416 00:24:35.502420 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.502806 kubelet[2577]: E0416 00:24:35.502437 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.502806 kubelet[2577]: E0416 00:24:35.502670 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.502806 kubelet[2577]: W0416 00:24:35.502678 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.502806 kubelet[2577]: E0416 00:24:35.502729 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.516011 kubelet[2577]: E0416 00:24:35.515841 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.516011 kubelet[2577]: W0416 00:24:35.515868 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.516011 kubelet[2577]: E0416 00:24:35.515889 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.516389 kubelet[2577]: E0416 00:24:35.516317 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.516389 kubelet[2577]: W0416 00:24:35.516330 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.516389 kubelet[2577]: E0416 00:24:35.516342 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.516689 kubelet[2577]: E0416 00:24:35.516670 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.516689 kubelet[2577]: W0416 00:24:35.516687 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.516898 kubelet[2577]: E0416 00:24:35.516699 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.516898 kubelet[2577]: E0416 00:24:35.516886 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.516898 kubelet[2577]: W0416 00:24:35.516895 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.516991 kubelet[2577]: E0416 00:24:35.516904 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.517478 kubelet[2577]: E0416 00:24:35.517448 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.517478 kubelet[2577]: W0416 00:24:35.517468 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.517478 kubelet[2577]: E0416 00:24:35.517481 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.519871 kubelet[2577]: E0416 00:24:35.519839 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.519871 kubelet[2577]: W0416 00:24:35.519863 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.520443 kubelet[2577]: E0416 00:24:35.519886 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.522865 kubelet[2577]: E0416 00:24:35.522839 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.522865 kubelet[2577]: W0416 00:24:35.522861 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.523152 kubelet[2577]: E0416 00:24:35.522881 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.523193 kubelet[2577]: E0416 00:24:35.523177 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.523193 kubelet[2577]: W0416 00:24:35.523188 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.523244 kubelet[2577]: E0416 00:24:35.523198 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.523832 kubelet[2577]: E0416 00:24:35.523802 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.523832 kubelet[2577]: W0416 00:24:35.523828 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.523991 kubelet[2577]: E0416 00:24:35.523843 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.524065 kubelet[2577]: E0416 00:24:35.524049 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.524065 kubelet[2577]: W0416 00:24:35.524060 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.524230 kubelet[2577]: E0416 00:24:35.524070 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.524368 kubelet[2577]: E0416 00:24:35.524351 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.524368 kubelet[2577]: W0416 00:24:35.524366 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.524467 kubelet[2577]: E0416 00:24:35.524376 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.524617 kubelet[2577]: E0416 00:24:35.524603 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.524617 kubelet[2577]: W0416 00:24:35.524614 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.524738 kubelet[2577]: E0416 00:24:35.524624 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.525019 kubelet[2577]: E0416 00:24:35.524999 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.525019 kubelet[2577]: W0416 00:24:35.525016 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.525449 kubelet[2577]: E0416 00:24:35.525028 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.525449 kubelet[2577]: E0416 00:24:35.525203 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.525449 kubelet[2577]: W0416 00:24:35.525211 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.525449 kubelet[2577]: E0416 00:24:35.525219 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.525449 kubelet[2577]: E0416 00:24:35.525396 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.525449 kubelet[2577]: W0416 00:24:35.525405 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.525449 kubelet[2577]: E0416 00:24:35.525414 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.525781 kubelet[2577]: E0416 00:24:35.525751 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.525781 kubelet[2577]: W0416 00:24:35.525772 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.525844 kubelet[2577]: E0416 00:24:35.525784 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:35.527129 kubelet[2577]: E0416 00:24:35.526049 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.527129 kubelet[2577]: W0416 00:24:35.526064 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.527129 kubelet[2577]: E0416 00:24:35.526076 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:35.527443 kubelet[2577]: E0416 00:24:35.527374 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:35.527443 kubelet[2577]: W0416 00:24:35.527391 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:35.527443 kubelet[2577]: E0416 00:24:35.527409 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.481970 kubelet[2577]: I0416 00:24:36.481910 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:24:36.510016 kubelet[2577]: E0416 00:24:36.509962 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.510016 kubelet[2577]: W0416 00:24:36.509997 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.510016 kubelet[2577]: E0416 00:24:36.510024 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.510233 kubelet[2577]: E0416 00:24:36.510204 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.510233 kubelet[2577]: W0416 00:24:36.510213 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.510233 kubelet[2577]: E0416 00:24:36.510221 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.510836 kubelet[2577]: E0416 00:24:36.510495 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.510836 kubelet[2577]: W0416 00:24:36.510511 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.510836 kubelet[2577]: E0416 00:24:36.510522 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.510836 kubelet[2577]: E0416 00:24:36.510680 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.510836 kubelet[2577]: W0416 00:24:36.510688 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.510836 kubelet[2577]: E0416 00:24:36.510697 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.511531 kubelet[2577]: E0416 00:24:36.510931 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.511531 kubelet[2577]: W0416 00:24:36.510942 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.511531 kubelet[2577]: E0416 00:24:36.510951 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.511531 kubelet[2577]: E0416 00:24:36.511499 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.511531 kubelet[2577]: W0416 00:24:36.511516 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.511686 kubelet[2577]: E0416 00:24:36.511530 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.511773 kubelet[2577]: E0416 00:24:36.511741 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.511814 kubelet[2577]: W0416 00:24:36.511778 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.511814 kubelet[2577]: E0416 00:24:36.511791 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.512601 kubelet[2577]: E0416 00:24:36.511999 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.512601 kubelet[2577]: W0416 00:24:36.512007 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.512601 kubelet[2577]: E0416 00:24:36.512017 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.512601 kubelet[2577]: E0416 00:24:36.512375 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.512601 kubelet[2577]: W0416 00:24:36.512388 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.512601 kubelet[2577]: E0416 00:24:36.512407 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.512795 kubelet[2577]: E0416 00:24:36.512610 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.512795 kubelet[2577]: W0416 00:24:36.512619 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.512795 kubelet[2577]: E0416 00:24:36.512628 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.512795 kubelet[2577]: E0416 00:24:36.512775 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.512795 kubelet[2577]: W0416 00:24:36.512782 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.512795 kubelet[2577]: E0416 00:24:36.512791 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.512948 kubelet[2577]: E0416 00:24:36.512931 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.512948 kubelet[2577]: W0416 00:24:36.512937 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.512948 kubelet[2577]: E0416 00:24:36.512945 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.513175 kubelet[2577]: E0416 00:24:36.513095 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.513175 kubelet[2577]: W0416 00:24:36.513110 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.513175 kubelet[2577]: E0416 00:24:36.513119 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.513497 kubelet[2577]: E0416 00:24:36.513477 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.513497 kubelet[2577]: W0416 00:24:36.513495 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.513576 kubelet[2577]: E0416 00:24:36.513505 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.514057 kubelet[2577]: E0416 00:24:36.513699 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.514057 kubelet[2577]: W0416 00:24:36.513713 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.514057 kubelet[2577]: E0416 00:24:36.513723 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.529407 kubelet[2577]: E0416 00:24:36.529330 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.529407 kubelet[2577]: W0416 00:24:36.529374 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.529407 kubelet[2577]: E0416 00:24:36.529401 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.530252 kubelet[2577]: E0416 00:24:36.530030 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.530252 kubelet[2577]: W0416 00:24:36.530054 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.530252 kubelet[2577]: E0416 00:24:36.530072 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.530252 kubelet[2577]: E0416 00:24:36.530319 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.530601 kubelet[2577]: W0416 00:24:36.530329 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.530601 kubelet[2577]: E0416 00:24:36.530340 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.531050 kubelet[2577]: E0416 00:24:36.531013 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.531050 kubelet[2577]: W0416 00:24:36.531035 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.531050 kubelet[2577]: E0416 00:24:36.531051 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.531254 kubelet[2577]: E0416 00:24:36.531239 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.531254 kubelet[2577]: W0416 00:24:36.531250 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.531337 kubelet[2577]: E0416 00:24:36.531264 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.531602 kubelet[2577]: E0416 00:24:36.531577 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.531602 kubelet[2577]: W0416 00:24:36.531593 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.531846 kubelet[2577]: E0416 00:24:36.531606 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.532332 kubelet[2577]: E0416 00:24:36.531977 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.532332 kubelet[2577]: W0416 00:24:36.531992 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.532332 kubelet[2577]: E0416 00:24:36.532003 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.532679 kubelet[2577]: E0416 00:24:36.532649 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.532679 kubelet[2577]: W0416 00:24:36.532669 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.532933 kubelet[2577]: E0416 00:24:36.532682 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.533775 kubelet[2577]: E0416 00:24:36.533346 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.533775 kubelet[2577]: W0416 00:24:36.533379 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.533775 kubelet[2577]: E0416 00:24:36.533393 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.533775 kubelet[2577]: E0416 00:24:36.533589 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.533775 kubelet[2577]: W0416 00:24:36.533597 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.533775 kubelet[2577]: E0416 00:24:36.533609 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.533775 kubelet[2577]: E0416 00:24:36.533750 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.533775 kubelet[2577]: W0416 00:24:36.533757 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.533775 kubelet[2577]: E0416 00:24:36.533765 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.534564 kubelet[2577]: E0416 00:24:36.534539 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.534564 kubelet[2577]: W0416 00:24:36.534563 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.534660 kubelet[2577]: E0416 00:24:36.534579 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.535244 kubelet[2577]: E0416 00:24:36.535223 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.535244 kubelet[2577]: W0416 00:24:36.535240 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.535685 kubelet[2577]: E0416 00:24:36.535253 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.535981 kubelet[2577]: E0416 00:24:36.535767 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.535981 kubelet[2577]: W0416 00:24:36.535786 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.535981 kubelet[2577]: E0416 00:24:36.535799 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.536082 kubelet[2577]: E0416 00:24:36.536005 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.536082 kubelet[2577]: W0416 00:24:36.536013 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.536082 kubelet[2577]: E0416 00:24:36.536023 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.536293 kubelet[2577]: E0416 00:24:36.536233 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.536293 kubelet[2577]: W0416 00:24:36.536246 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.536293 kubelet[2577]: E0416 00:24:36.536256 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.536799 kubelet[2577]: E0416 00:24:36.536782 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.536799 kubelet[2577]: W0416 00:24:36.536798 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.536864 kubelet[2577]: E0416 00:24:36.536808 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:24:36.537177 kubelet[2577]: E0416 00:24:36.537160 2577 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:24:36.537177 kubelet[2577]: W0416 00:24:36.537174 2577 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:24:36.537574 kubelet[2577]: E0416 00:24:36.537186 2577 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:24:36.584882 containerd[1467]: time="2026-04-16T00:24:36.583957512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:36.586829 containerd[1467]: time="2026-04-16T00:24:36.586778600Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Apr 16 00:24:36.588402 containerd[1467]: time="2026-04-16T00:24:36.588285933Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:36.594162 containerd[1467]: time="2026-04-16T00:24:36.594077962Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:36.596593 containerd[1467]: time="2026-04-16T00:24:36.595994611Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.466398356s" Apr 16 00:24:36.596593 containerd[1467]: time="2026-04-16T00:24:36.596044215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Apr 16 00:24:36.604754 containerd[1467]: time="2026-04-16T00:24:36.604681295Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 16 00:24:36.634456 containerd[1467]: time="2026-04-16T00:24:36.634297980Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6\"" Apr 16 00:24:36.635156 containerd[1467]: time="2026-04-16T00:24:36.635116772Z" level=info msg="StartContainer for \"36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6\"" Apr 16 00:24:36.692550 systemd[1]: Started cri-containerd-36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6.scope - libcontainer container 36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6. Apr 16 00:24:36.741963 containerd[1467]: time="2026-04-16T00:24:36.741650742Z" level=info msg="StartContainer for \"36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6\" returns successfully" Apr 16 00:24:36.762210 systemd[1]: cri-containerd-36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6.scope: Deactivated successfully. 
Apr 16 00:24:36.920223 containerd[1467]: time="2026-04-16T00:24:36.920094757Z" level=info msg="shim disconnected" id=36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6 namespace=k8s.io Apr 16 00:24:36.920223 containerd[1467]: time="2026-04-16T00:24:36.920214087Z" level=warning msg="cleaning up after shim disconnected" id=36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6 namespace=k8s.io Apr 16 00:24:36.920223 containerd[1467]: time="2026-04-16T00:24:36.920230049Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 16 00:24:37.360540 kubelet[2577]: E0416 00:24:37.358263 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:37.491023 containerd[1467]: time="2026-04-16T00:24:37.490948229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 16 00:24:37.518752 kubelet[2577]: I0416 00:24:37.518615 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9cd95957f-gdd7n" podStartSLOduration=3.555742342 podStartE2EDuration="5.518579288s" podCreationTimestamp="2026-04-16 00:24:32 +0000 UTC" firstStartedPulling="2026-04-16 00:24:33.165509956 +0000 UTC m=+19.937610012" lastFinishedPulling="2026-04-16 00:24:35.128346902 +0000 UTC m=+21.900446958" observedRunningTime="2026-04-16 00:24:35.544692814 +0000 UTC m=+22.316792950" watchObservedRunningTime="2026-04-16 00:24:37.518579288 +0000 UTC m=+24.290679504" Apr 16 00:24:37.621583 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-36051cf502407ebf2f348d7c1595399896fcca00a113e084e9675ff16cb3fcb6-rootfs.mount: Deactivated successfully. 
Apr 16 00:24:39.358003 kubelet[2577]: E0416 00:24:39.357805 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:41.359100 kubelet[2577]: E0416 00:24:41.359047 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:42.535122 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2884588764.mount: Deactivated successfully. Apr 16 00:24:42.565322 containerd[1467]: time="2026-04-16T00:24:42.564644551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:42.566174 containerd[1467]: time="2026-04-16T00:24:42.565946493Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Apr 16 00:24:42.568304 containerd[1467]: time="2026-04-16T00:24:42.567207993Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:42.569667 containerd[1467]: time="2026-04-16T00:24:42.569606221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:42.571069 containerd[1467]: time="2026-04-16T00:24:42.570350439Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id 
\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 5.079095785s" Apr 16 00:24:42.571069 containerd[1467]: time="2026-04-16T00:24:42.570388442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Apr 16 00:24:42.576968 containerd[1467]: time="2026-04-16T00:24:42.576902274Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Apr 16 00:24:42.604198 containerd[1467]: time="2026-04-16T00:24:42.604118532Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78\"" Apr 16 00:24:42.608379 containerd[1467]: time="2026-04-16T00:24:42.607515759Z" level=info msg="StartContainer for \"99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78\"" Apr 16 00:24:42.646600 systemd[1]: Started cri-containerd-99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78.scope - libcontainer container 99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78. Apr 16 00:24:42.681607 containerd[1467]: time="2026-04-16T00:24:42.681535254Z" level=info msg="StartContainer for \"99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78\" returns successfully" Apr 16 00:24:42.785353 systemd[1]: cri-containerd-99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78.scope: Deactivated successfully. 
Apr 16 00:24:43.000123 containerd[1467]: time="2026-04-16T00:24:42.999875142Z" level=info msg="shim disconnected" id=99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78 namespace=k8s.io Apr 16 00:24:43.000123 containerd[1467]: time="2026-04-16T00:24:42.999960829Z" level=warning msg="cleaning up after shim disconnected" id=99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78 namespace=k8s.io Apr 16 00:24:43.000123 containerd[1467]: time="2026-04-16T00:24:42.999979831Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 16 00:24:43.014330 containerd[1467]: time="2026-04-16T00:24:43.012493840Z" level=warning msg="cleanup warnings time=\"2026-04-16T00:24:43Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Apr 16 00:24:43.360491 kubelet[2577]: E0416 00:24:43.360148 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:43.509249 containerd[1467]: time="2026-04-16T00:24:43.508798652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Apr 16 00:24:43.538038 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-99b4c9ad939a924230cdc9608fe2553639f8fdbcd729a4b5290f963e015abc78-rootfs.mount: Deactivated successfully. 
Apr 16 00:24:45.359315 kubelet[2577]: E0416 00:24:45.358085 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:45.974546 containerd[1467]: time="2026-04-16T00:24:45.974435488Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:45.976951 containerd[1467]: time="2026-04-16T00:24:45.976893913Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Apr 16 00:24:45.978655 containerd[1467]: time="2026-04-16T00:24:45.978573399Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:45.982558 containerd[1467]: time="2026-04-16T00:24:45.982492173Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:45.984393 containerd[1467]: time="2026-04-16T00:24:45.984305869Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.475086264s" Apr 16 00:24:45.984393 containerd[1467]: time="2026-04-16T00:24:45.984357993Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference 
\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Apr 16 00:24:45.989225 containerd[1467]: time="2026-04-16T00:24:45.989178515Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Apr 16 00:24:46.009778 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3340485153.mount: Deactivated successfully. Apr 16 00:24:46.013543 containerd[1467]: time="2026-04-16T00:24:46.013495487Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99\"" Apr 16 00:24:46.015481 containerd[1467]: time="2026-04-16T00:24:46.015436311Z" level=info msg="StartContainer for \"aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99\"" Apr 16 00:24:46.056551 systemd[1]: Started cri-containerd-aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99.scope - libcontainer container aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99. Apr 16 00:24:46.091651 containerd[1467]: time="2026-04-16T00:24:46.091584909Z" level=info msg="StartContainer for \"aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99\" returns successfully" Apr 16 00:24:46.766846 systemd[1]: cri-containerd-aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99.scope: Deactivated successfully. 
Apr 16 00:24:46.851355 kubelet[2577]: I0416 00:24:46.848800 2577 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Apr 16 00:24:46.892961 containerd[1467]: time="2026-04-16T00:24:46.892699942Z" level=info msg="shim disconnected" id=aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99 namespace=k8s.io Apr 16 00:24:46.892961 containerd[1467]: time="2026-04-16T00:24:46.892853993Z" level=warning msg="cleaning up after shim disconnected" id=aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99 namespace=k8s.io Apr 16 00:24:46.892961 containerd[1467]: time="2026-04-16T00:24:46.892864114Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 16 00:24:46.941375 systemd[1]: Created slice kubepods-burstable-pod5ade9d9c_9070_44d0_8989_12094cc3969e.slice - libcontainer container kubepods-burstable-pod5ade9d9c_9070_44d0_8989_12094cc3969e.slice. Apr 16 00:24:46.958703 systemd[1]: Created slice kubepods-burstable-pod801cc257_2986_4d78_aabb_a2b3b76027fd.slice - libcontainer container kubepods-burstable-pod801cc257_2986_4d78_aabb_a2b3b76027fd.slice. Apr 16 00:24:46.968474 systemd[1]: Created slice kubepods-besteffort-pod085a0cd1_f75e_4874_b3c4_63142da8b2f2.slice - libcontainer container kubepods-besteffort-pod085a0cd1_f75e_4874_b3c4_63142da8b2f2.slice. Apr 16 00:24:46.973996 systemd[1]: Created slice kubepods-besteffort-pod071c8c0b_162d_4eea_a8c8_a1554ee321bf.slice - libcontainer container kubepods-besteffort-pod071c8c0b_162d_4eea_a8c8_a1554ee321bf.slice. Apr 16 00:24:46.986499 systemd[1]: Created slice kubepods-besteffort-pod4679abf7_027e_48d1_9202_b1dbdd5b8949.slice - libcontainer container kubepods-besteffort-pod4679abf7_027e_48d1_9202_b1dbdd5b8949.slice. Apr 16 00:24:46.992418 systemd[1]: Created slice kubepods-besteffort-podeb8562da_f5ee_41ba_a284_e8ce170cf70d.slice - libcontainer container kubepods-besteffort-podeb8562da_f5ee_41ba_a284_e8ce170cf70d.slice. 
Apr 16 00:24:47.002257 systemd[1]: Created slice kubepods-besteffort-podee00d6a3_52e5_4c2d_97b5_3b5102dc1e84.slice - libcontainer container kubepods-besteffort-podee00d6a3_52e5_4c2d_97b5_3b5102dc1e84.slice. Apr 16 00:24:47.006254 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-aff472c46cf7a0bb70ae4ebc44408eee7ed4d18dd3506a447e487d34fc351d99-rootfs.mount: Deactivated successfully. Apr 16 00:24:47.021569 kubelet[2577]: I0416 00:24:47.020515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kj4\" (UniqueName: \"kubernetes.io/projected/085a0cd1-f75e-4874-b3c4-63142da8b2f2-kube-api-access-44kj4\") pod \"whisker-5fc887d9f7-rvjtn\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " pod="calico-system/whisker-5fc887d9f7-rvjtn" Apr 16 00:24:47.021569 kubelet[2577]: I0416 00:24:47.020582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071c8c0b-162d-4eea-a8c8-a1554ee321bf-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-xvg2d\" (UID: \"071c8c0b-162d-4eea-a8c8-a1554ee321bf\") " pod="calico-system/goldmane-5b85766d88-xvg2d" Apr 16 00:24:47.021569 kubelet[2577]: I0416 00:24:47.020614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w7wh\" (UniqueName: \"kubernetes.io/projected/071c8c0b-162d-4eea-a8c8-a1554ee321bf-kube-api-access-5w7wh\") pod \"goldmane-5b85766d88-xvg2d\" (UID: \"071c8c0b-162d-4eea-a8c8-a1554ee321bf\") " pod="calico-system/goldmane-5b85766d88-xvg2d" Apr 16 00:24:47.021569 kubelet[2577]: I0416 00:24:47.020646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84-tigera-ca-bundle\") pod \"calico-kube-controllers-5bcb8465df-zbc8b\" (UID: 
\"ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84\") " pod="calico-system/calico-kube-controllers-5bcb8465df-zbc8b" Apr 16 00:24:47.021569 kubelet[2577]: I0416 00:24:47.020677 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-ca-bundle\") pod \"whisker-5fc887d9f7-rvjtn\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " pod="calico-system/whisker-5fc887d9f7-rvjtn" Apr 16 00:24:47.021884 kubelet[2577]: I0416 00:24:47.020706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2kx\" (UniqueName: \"kubernetes.io/projected/801cc257-2986-4d78-aabb-a2b3b76027fd-kube-api-access-mr2kx\") pod \"coredns-674b8bbfcf-7nrz4\" (UID: \"801cc257-2986-4d78-aabb-a2b3b76027fd\") " pod="kube-system/coredns-674b8bbfcf-7nrz4" Apr 16 00:24:47.021884 kubelet[2577]: I0416 00:24:47.020735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/071c8c0b-162d-4eea-a8c8-a1554ee321bf-goldmane-key-pair\") pod \"goldmane-5b85766d88-xvg2d\" (UID: \"071c8c0b-162d-4eea-a8c8-a1554ee321bf\") " pod="calico-system/goldmane-5b85766d88-xvg2d" Apr 16 00:24:47.021884 kubelet[2577]: I0416 00:24:47.020777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ade9d9c-9070-44d0-8989-12094cc3969e-config-volume\") pod \"coredns-674b8bbfcf-pjng2\" (UID: \"5ade9d9c-9070-44d0-8989-12094cc3969e\") " pod="kube-system/coredns-674b8bbfcf-pjng2" Apr 16 00:24:47.021884 kubelet[2577]: I0416 00:24:47.020808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/eb8562da-f5ee-41ba-a284-e8ce170cf70d-calico-apiserver-certs\") pod \"calico-apiserver-688cbf68ff-cmgfm\" (UID: \"eb8562da-f5ee-41ba-a284-e8ce170cf70d\") " pod="calico-system/calico-apiserver-688cbf68ff-cmgfm" Apr 16 00:24:47.021884 kubelet[2577]: I0416 00:24:47.020850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vzs8\" (UniqueName: \"kubernetes.io/projected/4679abf7-027e-48d1-9202-b1dbdd5b8949-kube-api-access-6vzs8\") pod \"calico-apiserver-688cbf68ff-qht6h\" (UID: \"4679abf7-027e-48d1-9202-b1dbdd5b8949\") " pod="calico-system/calico-apiserver-688cbf68ff-qht6h" Apr 16 00:24:47.022086 kubelet[2577]: I0416 00:24:47.020890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59v4\" (UniqueName: \"kubernetes.io/projected/5ade9d9c-9070-44d0-8989-12094cc3969e-kube-api-access-x59v4\") pod \"coredns-674b8bbfcf-pjng2\" (UID: \"5ade9d9c-9070-44d0-8989-12094cc3969e\") " pod="kube-system/coredns-674b8bbfcf-pjng2" Apr 16 00:24:47.022086 kubelet[2577]: I0416 00:24:47.020921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hd6\" (UniqueName: \"kubernetes.io/projected/ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84-kube-api-access-m9hd6\") pod \"calico-kube-controllers-5bcb8465df-zbc8b\" (UID: \"ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84\") " pod="calico-system/calico-kube-controllers-5bcb8465df-zbc8b" Apr 16 00:24:47.022086 kubelet[2577]: I0416 00:24:47.020953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-nginx-config\") pod \"whisker-5fc887d9f7-rvjtn\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " pod="calico-system/whisker-5fc887d9f7-rvjtn" Apr 16 00:24:47.022086 kubelet[2577]: I0416 00:24:47.020982 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-backend-key-pair\") pod \"whisker-5fc887d9f7-rvjtn\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " pod="calico-system/whisker-5fc887d9f7-rvjtn" Apr 16 00:24:47.022086 kubelet[2577]: I0416 00:24:47.021011 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071c8c0b-162d-4eea-a8c8-a1554ee321bf-config\") pod \"goldmane-5b85766d88-xvg2d\" (UID: \"071c8c0b-162d-4eea-a8c8-a1554ee321bf\") " pod="calico-system/goldmane-5b85766d88-xvg2d" Apr 16 00:24:47.022863 kubelet[2577]: I0416 00:24:47.021420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4679abf7-027e-48d1-9202-b1dbdd5b8949-calico-apiserver-certs\") pod \"calico-apiserver-688cbf68ff-qht6h\" (UID: \"4679abf7-027e-48d1-9202-b1dbdd5b8949\") " pod="calico-system/calico-apiserver-688cbf68ff-qht6h" Apr 16 00:24:47.022863 kubelet[2577]: I0416 00:24:47.021458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801cc257-2986-4d78-aabb-a2b3b76027fd-config-volume\") pod \"coredns-674b8bbfcf-7nrz4\" (UID: \"801cc257-2986-4d78-aabb-a2b3b76027fd\") " pod="kube-system/coredns-674b8bbfcf-7nrz4" Apr 16 00:24:47.022863 kubelet[2577]: I0416 00:24:47.021485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmkb6\" (UniqueName: \"kubernetes.io/projected/eb8562da-f5ee-41ba-a284-e8ce170cf70d-kube-api-access-bmkb6\") pod \"calico-apiserver-688cbf68ff-cmgfm\" (UID: \"eb8562da-f5ee-41ba-a284-e8ce170cf70d\") " 
pod="calico-system/calico-apiserver-688cbf68ff-cmgfm" Apr 16 00:24:47.254592 containerd[1467]: time="2026-04-16T00:24:47.253991449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pjng2,Uid:5ade9d9c-9070-44d0-8989-12094cc3969e,Namespace:kube-system,Attempt:0,}" Apr 16 00:24:47.264642 containerd[1467]: time="2026-04-16T00:24:47.264539459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7nrz4,Uid:801cc257-2986-4d78-aabb-a2b3b76027fd,Namespace:kube-system,Attempt:0,}" Apr 16 00:24:47.272652 containerd[1467]: time="2026-04-16T00:24:47.272364311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fc887d9f7-rvjtn,Uid:085a0cd1-f75e-4874-b3c4-63142da8b2f2,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:47.283305 containerd[1467]: time="2026-04-16T00:24:47.282854438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-xvg2d,Uid:071c8c0b-162d-4eea-a8c8-a1554ee321bf,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:47.292441 containerd[1467]: time="2026-04-16T00:24:47.292361053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-qht6h,Uid:4679abf7-027e-48d1-9202-b1dbdd5b8949,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:47.308813 containerd[1467]: time="2026-04-16T00:24:47.308494312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-cmgfm,Uid:eb8562da-f5ee-41ba-a284-e8ce170cf70d,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:47.311831 containerd[1467]: time="2026-04-16T00:24:47.311783672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcb8465df-zbc8b,Uid:ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:47.372758 systemd[1]: Created slice kubepods-besteffort-pod2a1e3207_cc48_4c2e_99f2_9f1e71ba31ec.slice - libcontainer container kubepods-besteffort-pod2a1e3207_cc48_4c2e_99f2_9f1e71ba31ec.slice. 
Apr 16 00:24:47.437936 containerd[1467]: time="2026-04-16T00:24:47.437707794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptvrb,Uid:2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:47.552919 containerd[1467]: time="2026-04-16T00:24:47.552523345Z" level=error msg="Failed to destroy network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.620808 containerd[1467]: time="2026-04-16T00:24:47.620647923Z" level=error msg="Failed to destroy network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.621061 containerd[1467]: time="2026-04-16T00:24:47.621009070Z" level=error msg="encountered an error cleaning up failed sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.621537 containerd[1467]: time="2026-04-16T00:24:47.621087755Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-cmgfm,Uid:eb8562da-f5ee-41ba-a284-e8ce170cf70d,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Apr 16 00:24:47.621537 containerd[1467]: time="2026-04-16T00:24:47.621201924Z" level=error msg="Failed to destroy network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.623740 kubelet[2577]: E0416 00:24:47.623332 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.623740 kubelet[2577]: E0416 00:24:47.623533 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-688cbf68ff-cmgfm" Apr 16 00:24:47.623740 kubelet[2577]: E0416 00:24:47.623558 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-688cbf68ff-cmgfm" Apr 16 00:24:47.623896 kubelet[2577]: E0416 00:24:47.623614 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-apiserver-688cbf68ff-cmgfm_calico-system(eb8562da-f5ee-41ba-a284-e8ce170cf70d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-688cbf68ff-cmgfm_calico-system(eb8562da-f5ee-41ba-a284-e8ce170cf70d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-688cbf68ff-cmgfm" podUID="eb8562da-f5ee-41ba-a284-e8ce170cf70d" Apr 16 00:24:47.628004 containerd[1467]: time="2026-04-16T00:24:47.627625913Z" level=error msg="encountered an error cleaning up failed sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.628004 containerd[1467]: time="2026-04-16T00:24:47.627702319Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pjng2,Uid:5ade9d9c-9070-44d0-8989-12094cc3969e,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.628004 containerd[1467]: time="2026-04-16T00:24:47.627893813Z" level=error msg="encountered an error cleaning up failed sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.628004 containerd[1467]: time="2026-04-16T00:24:47.627921695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fc887d9f7-rvjtn,Uid:085a0cd1-f75e-4874-b3c4-63142da8b2f2,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.628332 kubelet[2577]: E0416 00:24:47.628171 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.628332 kubelet[2577]: E0416 00:24:47.628224 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fc887d9f7-rvjtn" Apr 16 00:24:47.628332 kubelet[2577]: E0416 00:24:47.628243 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fc887d9f7-rvjtn" Apr 16 00:24:47.628432 kubelet[2577]: E0416 00:24:47.628313 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5fc887d9f7-rvjtn_calico-system(085a0cd1-f75e-4874-b3c4-63142da8b2f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5fc887d9f7-rvjtn_calico-system(085a0cd1-f75e-4874-b3c4-63142da8b2f2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5fc887d9f7-rvjtn" podUID="085a0cd1-f75e-4874-b3c4-63142da8b2f2" Apr 16 00:24:47.628432 kubelet[2577]: E0416 00:24:47.628357 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.628432 kubelet[2577]: E0416 00:24:47.628374 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-pjng2" Apr 16 00:24:47.628520 kubelet[2577]: E0416 00:24:47.628386 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-pjng2" Apr 16 00:24:47.628520 kubelet[2577]: E0416 00:24:47.628411 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-pjng2_kube-system(5ade9d9c-9070-44d0-8989-12094cc3969e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-pjng2_kube-system(5ade9d9c-9070-44d0-8989-12094cc3969e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-pjng2" podUID="5ade9d9c-9070-44d0-8989-12094cc3969e" Apr 16 00:24:47.650702 containerd[1467]: time="2026-04-16T00:24:47.650329172Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Apr 16 00:24:47.651063 containerd[1467]: time="2026-04-16T00:24:47.651028464Z" level=error msg="Failed to destroy network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.653765 containerd[1467]: time="2026-04-16T00:24:47.653567769Z" level=error msg="encountered an error cleaning up failed sandbox 
\"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.653765 containerd[1467]: time="2026-04-16T00:24:47.653642415Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7nrz4,Uid:801cc257-2986-4d78-aabb-a2b3b76027fd,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.653923 kubelet[2577]: E0416 00:24:47.653854 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.653923 kubelet[2577]: E0416 00:24:47.653907 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-7nrz4" Apr 16 00:24:47.653987 kubelet[2577]: E0416 00:24:47.653925 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-7nrz4" Apr 16 00:24:47.653987 kubelet[2577]: E0416 00:24:47.653969 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-7nrz4_kube-system(801cc257-2986-4d78-aabb-a2b3b76027fd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-7nrz4_kube-system(801cc257-2986-4d78-aabb-a2b3b76027fd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-7nrz4" podUID="801cc257-2986-4d78-aabb-a2b3b76027fd" Apr 16 00:24:47.667590 containerd[1467]: time="2026-04-16T00:24:47.667533910Z" level=error msg="Failed to destroy network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.667934 containerd[1467]: time="2026-04-16T00:24:47.667899736Z" level=error msg="encountered an error cleaning up failed sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.667980 containerd[1467]: time="2026-04-16T00:24:47.667958941Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-qht6h,Uid:4679abf7-027e-48d1-9202-b1dbdd5b8949,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.668532 kubelet[2577]: E0416 00:24:47.668401 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.668532 kubelet[2577]: E0416 00:24:47.668478 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-688cbf68ff-qht6h" Apr 16 00:24:47.668653 kubelet[2577]: E0416 00:24:47.668559 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-688cbf68ff-qht6h" Apr 16 00:24:47.669322 kubelet[2577]: E0416 00:24:47.669154 2577 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-688cbf68ff-qht6h_calico-system(4679abf7-027e-48d1-9202-b1dbdd5b8949)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-688cbf68ff-qht6h_calico-system(4679abf7-027e-48d1-9202-b1dbdd5b8949)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-688cbf68ff-qht6h" podUID="4679abf7-027e-48d1-9202-b1dbdd5b8949" Apr 16 00:24:47.687922 containerd[1467]: time="2026-04-16T00:24:47.687830113Z" level=error msg="Failed to destroy network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.688229 containerd[1467]: time="2026-04-16T00:24:47.688199060Z" level=error msg="encountered an error cleaning up failed sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.689450 containerd[1467]: time="2026-04-16T00:24:47.688264185Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptvrb,Uid:2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.689952 kubelet[2577]: E0416 00:24:47.689832 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.689952 kubelet[2577]: E0416 00:24:47.689896 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:47.689952 kubelet[2577]: E0416 00:24:47.689916 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ptvrb" Apr 16 00:24:47.690143 kubelet[2577]: E0416 00:24:47.689969 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ptvrb_calico-system(2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ptvrb_calico-system(2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ptvrb" podUID="2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec" Apr 16 00:24:47.704218 containerd[1467]: time="2026-04-16T00:24:47.704090701Z" level=error msg="Failed to destroy network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.704924 containerd[1467]: time="2026-04-16T00:24:47.704750509Z" level=error msg="encountered an error cleaning up failed sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.704924 containerd[1467]: time="2026-04-16T00:24:47.704808714Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-xvg2d,Uid:071c8c0b-162d-4eea-a8c8-a1554ee321bf,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.705080 kubelet[2577]: E0416 00:24:47.705022 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.705131 kubelet[2577]: E0416 00:24:47.705116 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-xvg2d" Apr 16 00:24:47.705159 kubelet[2577]: E0416 00:24:47.705141 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-xvg2d" Apr 16 00:24:47.705472 kubelet[2577]: E0416 00:24:47.705191 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-xvg2d_calico-system(071c8c0b-162d-4eea-a8c8-a1554ee321bf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-xvg2d_calico-system(071c8c0b-162d-4eea-a8c8-a1554ee321bf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-xvg2d" 
podUID="071c8c0b-162d-4eea-a8c8-a1554ee321bf" Apr 16 00:24:47.712293 containerd[1467]: time="2026-04-16T00:24:47.712170092Z" level=info msg="CreateContainer within sandbox \"d58aa48d41351da91156659bfaad5bd7da94ce62c752726ad1ca853f0dc06b5e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7aadc8b71b40cd61af65c3ccec17997df765d949cf0f504f818bdc1ad30479b0\"" Apr 16 00:24:47.713790 containerd[1467]: time="2026-04-16T00:24:47.713586635Z" level=info msg="StartContainer for \"7aadc8b71b40cd61af65c3ccec17997df765d949cf0f504f818bdc1ad30479b0\"" Apr 16 00:24:47.716773 containerd[1467]: time="2026-04-16T00:24:47.716688782Z" level=error msg="Failed to destroy network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.717551 containerd[1467]: time="2026-04-16T00:24:47.717518002Z" level=error msg="encountered an error cleaning up failed sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.717764 containerd[1467]: time="2026-04-16T00:24:47.717663293Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcb8465df-zbc8b,Uid:ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.717882 
kubelet[2577]: E0416 00:24:47.717843 2577 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:24:47.717933 kubelet[2577]: E0416 00:24:47.717902 2577 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bcb8465df-zbc8b" Apr 16 00:24:47.717933 kubelet[2577]: E0416 00:24:47.717922 2577 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bcb8465df-zbc8b" Apr 16 00:24:47.718337 kubelet[2577]: E0416 00:24:47.718023 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5bcb8465df-zbc8b_calico-system(ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5bcb8465df-zbc8b_calico-system(ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5bcb8465df-zbc8b" podUID="ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84" Apr 16 00:24:47.742491 systemd[1]: Started cri-containerd-7aadc8b71b40cd61af65c3ccec17997df765d949cf0f504f818bdc1ad30479b0.scope - libcontainer container 7aadc8b71b40cd61af65c3ccec17997df765d949cf0f504f818bdc1ad30479b0. Apr 16 00:24:47.779973 containerd[1467]: time="2026-04-16T00:24:47.779676465Z" level=info msg="StartContainer for \"7aadc8b71b40cd61af65c3ccec17997df765d949cf0f504f818bdc1ad30479b0\" returns successfully" Apr 16 00:24:48.566357 kubelet[2577]: I0416 00:24:48.566328 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:24:48.569217 containerd[1467]: time="2026-04-16T00:24:48.567784308Z" level=info msg="StopPodSandbox for \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\"" Apr 16 00:24:48.569217 containerd[1467]: time="2026-04-16T00:24:48.567977282Z" level=info msg="Ensure that sandbox 04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce in task-service has been cleanup successfully" Apr 16 00:24:48.570446 kubelet[2577]: I0416 00:24:48.570421 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:24:48.571632 containerd[1467]: time="2026-04-16T00:24:48.571559101Z" level=info msg="StopPodSandbox for \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\"" Apr 16 00:24:48.572201 containerd[1467]: time="2026-04-16T00:24:48.571840281Z" level=info msg="Ensure that sandbox b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d in task-service has been cleanup successfully" Apr 16 00:24:48.578626 
kubelet[2577]: I0416 00:24:48.578591 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:24:48.585637 containerd[1467]: time="2026-04-16T00:24:48.583529565Z" level=info msg="StopPodSandbox for \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\"" Apr 16 00:24:48.585637 containerd[1467]: time="2026-04-16T00:24:48.583701257Z" level=info msg="Ensure that sandbox 096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076 in task-service has been cleanup successfully" Apr 16 00:24:48.593153 kubelet[2577]: I0416 00:24:48.593102 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:24:48.605058 containerd[1467]: time="2026-04-16T00:24:48.605007875Z" level=info msg="StopPodSandbox for \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\"" Apr 16 00:24:48.606344 containerd[1467]: time="2026-04-16T00:24:48.605260213Z" level=info msg="Ensure that sandbox 5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914 in task-service has been cleanup successfully" Apr 16 00:24:48.609903 kubelet[2577]: I0416 00:24:48.609860 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:24:48.615758 containerd[1467]: time="2026-04-16T00:24:48.614891628Z" level=info msg="StopPodSandbox for \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\"" Apr 16 00:24:48.615758 containerd[1467]: time="2026-04-16T00:24:48.615073242Z" level=info msg="Ensure that sandbox 2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4 in task-service has been cleanup successfully" Apr 16 00:24:48.631410 kubelet[2577]: I0416 00:24:48.631313 2577 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:24:48.635565 containerd[1467]: time="2026-04-16T00:24:48.632977934Z" level=info msg="StopPodSandbox for \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\"" Apr 16 00:24:48.635565 containerd[1467]: time="2026-04-16T00:24:48.633246993Z" level=info msg="Ensure that sandbox 9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f in task-service has been cleanup successfully" Apr 16 00:24:48.638742 kubelet[2577]: I0416 00:24:48.638159 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:24:48.641193 containerd[1467]: time="2026-04-16T00:24:48.640259019Z" level=info msg="StopPodSandbox for \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\"" Apr 16 00:24:48.641193 containerd[1467]: time="2026-04-16T00:24:48.640564801Z" level=info msg="Ensure that sandbox 89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44 in task-service has been cleanup successfully" Apr 16 00:24:48.641888 kubelet[2577]: I0416 00:24:48.641828 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:24:48.644462 containerd[1467]: time="2026-04-16T00:24:48.644399798Z" level=info msg="StopPodSandbox for \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\"" Apr 16 00:24:48.689538 containerd[1467]: time="2026-04-16T00:24:48.689037020Z" level=info msg="Ensure that sandbox 1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509 in task-service has been cleanup successfully" Apr 16 00:24:48.768276 kubelet[2577]: I0416 00:24:48.768198 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cfjdx" podStartSLOduration=4.055520754 podStartE2EDuration="16.768176932s" 
podCreationTimestamp="2026-04-16 00:24:32 +0000 UTC" firstStartedPulling="2026-04-16 00:24:33.272461072 +0000 UTC m=+20.044561168" lastFinishedPulling="2026-04-16 00:24:45.98511725 +0000 UTC m=+32.757217346" observedRunningTime="2026-04-16 00:24:48.610308498 +0000 UTC m=+35.382408674" watchObservedRunningTime="2026-04-16 00:24:48.768176932 +0000 UTC m=+35.540276988" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.841 [INFO][3791] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.842 [INFO][3791] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" iface="eth0" netns="/var/run/netns/cni-1fd503bf-a08a-5c30-70b7-50f23b7abb87" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.842 [INFO][3791] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" iface="eth0" netns="/var/run/netns/cni-1fd503bf-a08a-5c30-70b7-50f23b7abb87" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.842 [INFO][3791] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" iface="eth0" netns="/var/run/netns/cni-1fd503bf-a08a-5c30-70b7-50f23b7abb87" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.842 [INFO][3791] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.842 [INFO][3791] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.945 [INFO][3855] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.945 [INFO][3855] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.945 [INFO][3855] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.988 [WARNING][3855] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.988 [INFO][3855] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:48.991 [INFO][3855] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.005919 containerd[1467]: 2026-04-16 00:24:49.002 [INFO][3791] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:24:49.014702 containerd[1467]: time="2026-04-16T00:24:49.014651991Z" level=info msg="TearDown network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\" successfully" Apr 16 00:24:49.016530 systemd[1]: run-netns-cni\x2d1fd503bf\x2da08a\x2d5c30\x2d70b7\x2d50f23b7abb87.mount: Deactivated successfully. 
Apr 16 00:24:49.018825 containerd[1467]: time="2026-04-16T00:24:49.018427180Z" level=info msg="StopPodSandbox for \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\" returns successfully" Apr 16 00:24:49.020856 containerd[1467]: time="2026-04-16T00:24:49.020825271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-qht6h,Uid:4679abf7-027e-48d1-9202-b1dbdd5b8949,Namespace:calico-system,Attempt:1,}" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.793 [INFO][3743] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.793 [INFO][3743] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" iface="eth0" netns="/var/run/netns/cni-97cdcb00-6c1b-c647-1ed0-e0e9a54e3339" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.793 [INFO][3743] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" iface="eth0" netns="/var/run/netns/cni-97cdcb00-6c1b-c647-1ed0-e0e9a54e3339" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.794 [INFO][3743] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" iface="eth0" netns="/var/run/netns/cni-97cdcb00-6c1b-c647-1ed0-e0e9a54e3339" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.794 [INFO][3743] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.794 [INFO][3743] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.963 [INFO][3850] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.964 [INFO][3850] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:48.992 [INFO][3850] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:49.054 [WARNING][3850] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:49.054 [INFO][3850] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:49.066 [INFO][3850] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.100065 containerd[1467]: 2026-04-16 00:24:49.092 [INFO][3743] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:24:49.106812 containerd[1467]: time="2026-04-16T00:24:49.106588709Z" level=info msg="TearDown network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\" successfully" Apr 16 00:24:49.108454 containerd[1467]: time="2026-04-16T00:24:49.106976737Z" level=info msg="StopPodSandbox for \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\" returns successfully" Apr 16 00:24:49.109312 systemd[1]: run-netns-cni\x2d97cdcb00\x2d6c1b\x2dc647\x2d1ed0\x2de0e9a54e3339.mount: Deactivated successfully. 
Apr 16 00:24:49.115177 containerd[1467]: time="2026-04-16T00:24:49.114730770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcb8465df-zbc8b,Uid:ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84,Namespace:calico-system,Attempt:1,}" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.773 [INFO][3769] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.774 [INFO][3769] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" iface="eth0" netns="/var/run/netns/cni-f4e14f13-5731-f4f7-5f62-94fda79074e7" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.774 [INFO][3769] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" iface="eth0" netns="/var/run/netns/cni-f4e14f13-5731-f4f7-5f62-94fda79074e7" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.774 [INFO][3769] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" iface="eth0" netns="/var/run/netns/cni-f4e14f13-5731-f4f7-5f62-94fda79074e7" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.774 [INFO][3769] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.774 [INFO][3769] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.980 [INFO][3841] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:48.980 [INFO][3841] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:49.068 [INFO][3841] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:49.119 [WARNING][3841] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:49.119 [INFO][3841] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:49.140 [INFO][3841] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.166712 containerd[1467]: 2026-04-16 00:24:49.162 [INFO][3769] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:24:49.168737 containerd[1467]: time="2026-04-16T00:24:49.168691779Z" level=info msg="TearDown network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\" successfully" Apr 16 00:24:49.168737 containerd[1467]: time="2026-04-16T00:24:49.168732022Z" level=info msg="StopPodSandbox for \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\" returns successfully" Apr 16 00:24:49.171284 containerd[1467]: time="2026-04-16T00:24:49.169826900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7nrz4,Uid:801cc257-2986-4d78-aabb-a2b3b76027fd,Namespace:kube-system,Attempt:1,}" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.036 [INFO][3813] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.037 [INFO][3813] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" iface="eth0" netns="/var/run/netns/cni-7f9b7e98-eb8f-5b5d-95cf-e086661ba946" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.039 [INFO][3813] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" iface="eth0" netns="/var/run/netns/cni-7f9b7e98-eb8f-5b5d-95cf-e086661ba946" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.039 [INFO][3813] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" iface="eth0" netns="/var/run/netns/cni-7f9b7e98-eb8f-5b5d-95cf-e086661ba946" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.039 [INFO][3813] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.040 [INFO][3813] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.140 [INFO][3891] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.140 [INFO][3891] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.140 [INFO][3891] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.219 [WARNING][3891] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.219 [INFO][3891] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.228 [INFO][3891] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.243333 containerd[1467]: 2026-04-16 00:24:49.238 [INFO][3813] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:24:49.245114 containerd[1467]: time="2026-04-16T00:24:49.245061827Z" level=info msg="TearDown network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\" successfully" Apr 16 00:24:49.245114 containerd[1467]: time="2026-04-16T00:24:49.245107510Z" level=info msg="StopPodSandbox for \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\" returns successfully" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.770 [INFO][3778] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.773 [INFO][3778] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" iface="eth0" netns="/var/run/netns/cni-8b48dfc5-1f7b-8547-0a7e-8b0eae616aed" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.775 [INFO][3778] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" iface="eth0" netns="/var/run/netns/cni-8b48dfc5-1f7b-8547-0a7e-8b0eae616aed" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.775 [INFO][3778] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" iface="eth0" netns="/var/run/netns/cni-8b48dfc5-1f7b-8547-0a7e-8b0eae616aed" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.775 [INFO][3778] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.775 [INFO][3778] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.989 [INFO][3840] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:48.990 [INFO][3840] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:49.228 [INFO][3840] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:49.254 [WARNING][3840] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:49.255 [INFO][3840] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:49.268 [INFO][3840] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.280966 containerd[1467]: 2026-04-16 00:24:49.276 [INFO][3778] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:24:49.285186 containerd[1467]: time="2026-04-16T00:24:49.285010597Z" level=info msg="TearDown network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\" successfully" Apr 16 00:24:49.285186 containerd[1467]: time="2026-04-16T00:24:49.285070481Z" level=info msg="StopPodSandbox for \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\" returns successfully" Apr 16 00:24:49.286744 containerd[1467]: time="2026-04-16T00:24:49.286710638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pjng2,Uid:5ade9d9c-9070-44d0-8989-12094cc3969e,Namespace:kube-system,Attempt:1,}" Apr 16 00:24:49.354539 kubelet[2577]: I0416 00:24:49.354495 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44kj4\" (UniqueName: \"kubernetes.io/projected/085a0cd1-f75e-4874-b3c4-63142da8b2f2-kube-api-access-44kj4\") pod \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " Apr 16 00:24:49.354539 kubelet[2577]: I0416 00:24:49.354552 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-backend-key-pair\") pod \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " Apr 16 00:24:49.354539 kubelet[2577]: I0416 00:24:49.354588 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-ca-bundle\") pod \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " Apr 16 00:24:49.354539 kubelet[2577]: I0416 00:24:49.354613 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: 
\"kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-nginx-config\") pod \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\" (UID: \"085a0cd1-f75e-4874-b3c4-63142da8b2f2\") " Apr 16 00:24:49.370327 kubelet[2577]: I0416 00:24:49.367450 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "085a0cd1-f75e-4874-b3c4-63142da8b2f2" (UID: "085a0cd1-f75e-4874-b3c4-63142da8b2f2"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 00:24:49.372480 kubelet[2577]: I0416 00:24:49.371657 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "085a0cd1-f75e-4874-b3c4-63142da8b2f2" (UID: "085a0cd1-f75e-4874-b3c4-63142da8b2f2"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 00:24:49.382033 kubelet[2577]: I0416 00:24:49.381983 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "085a0cd1-f75e-4874-b3c4-63142da8b2f2" (UID: "085a0cd1-f75e-4874-b3c4-63142da8b2f2"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 00:24:49.384132 kubelet[2577]: I0416 00:24:49.384062 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085a0cd1-f75e-4874-b3c4-63142da8b2f2-kube-api-access-44kj4" (OuterVolumeSpecName: "kube-api-access-44kj4") pod "085a0cd1-f75e-4874-b3c4-63142da8b2f2" (UID: "085a0cd1-f75e-4874-b3c4-63142da8b2f2"). InnerVolumeSpecName "kube-api-access-44kj4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 00:24:49.401829 systemd[1]: Removed slice kubepods-besteffort-pod085a0cd1_f75e_4874_b3c4_63142da8b2f2.slice - libcontainer container kubepods-besteffort-pod085a0cd1_f75e_4874_b3c4_63142da8b2f2.slice. Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:48.865 [INFO][3750] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:48.866 [INFO][3750] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" iface="eth0" netns="/var/run/netns/cni-baec5674-6791-c372-a403-6843c99144ec" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:48.867 [INFO][3750] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" iface="eth0" netns="/var/run/netns/cni-baec5674-6791-c372-a403-6843c99144ec" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:48.872 [INFO][3750] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" iface="eth0" netns="/var/run/netns/cni-baec5674-6791-c372-a403-6843c99144ec" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:48.872 [INFO][3750] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:48.872 [INFO][3750] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.071 [INFO][3860] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.086 [INFO][3860] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.268 [INFO][3860] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.305 [WARNING][3860] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.355 [INFO][3860] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.370 [INFO][3860] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.409581 containerd[1467]: 2026-04-16 00:24:49.391 [INFO][3750] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:24:49.421193 containerd[1467]: time="2026-04-16T00:24:49.419135045Z" level=info msg="TearDown network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\" successfully" Apr 16 00:24:49.421445 containerd[1467]: time="2026-04-16T00:24:49.421406887Z" level=info msg="StopPodSandbox for \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\" returns successfully" Apr 16 00:24:49.422796 containerd[1467]: time="2026-04-16T00:24:49.422606612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptvrb,Uid:2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec,Namespace:calico-system,Attempt:1,}" Apr 16 00:24:49.455004 kubelet[2577]: I0416 00:24:49.454954 2577 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-nginx-config\") on node \"ci-4081-3-6-n-56c15b786d\" DevicePath \"\"" Apr 16 00:24:49.455004 kubelet[2577]: I0416 00:24:49.454998 2577 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-44kj4\" (UniqueName: \"kubernetes.io/projected/085a0cd1-f75e-4874-b3c4-63142da8b2f2-kube-api-access-44kj4\") on node \"ci-4081-3-6-n-56c15b786d\" DevicePath \"\"" Apr 16 00:24:49.455004 kubelet[2577]: I0416 00:24:49.455013 2577 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-56c15b786d\" DevicePath \"\"" Apr 16 00:24:49.455596 kubelet[2577]: I0416 00:24:49.455027 2577 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/085a0cd1-f75e-4874-b3c4-63142da8b2f2-whisker-ca-bundle\") on node \"ci-4081-3-6-n-56c15b786d\" DevicePath \"\"" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:48.973 [INFO][3801] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:48.974 [INFO][3801] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" iface="eth0" netns="/var/run/netns/cni-f182fda5-b3b4-e683-36e9-9a66993b3e7e" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:48.975 [INFO][3801] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" iface="eth0" netns="/var/run/netns/cni-f182fda5-b3b4-e683-36e9-9a66993b3e7e" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:48.976 [INFO][3801] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" iface="eth0" netns="/var/run/netns/cni-f182fda5-b3b4-e683-36e9-9a66993b3e7e" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:48.976 [INFO][3801] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:48.976 [INFO][3801] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.089 [INFO][3876] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.089 [INFO][3876] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.373 [INFO][3876] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.407 [WARNING][3876] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.407 [INFO][3876] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.413 [INFO][3876] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.462081 containerd[1467]: 2026-04-16 00:24:49.433 [INFO][3801] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:24:49.463040 containerd[1467]: time="2026-04-16T00:24:49.462911568Z" level=info msg="TearDown network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\" successfully" Apr 16 00:24:49.463040 containerd[1467]: time="2026-04-16T00:24:49.462957091Z" level=info msg="StopPodSandbox for \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\" returns successfully" Apr 16 00:24:49.464436 containerd[1467]: time="2026-04-16T00:24:49.464026767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-cmgfm,Uid:eb8562da-f5ee-41ba-a284-e8ce170cf70d,Namespace:calico-system,Attempt:1,}" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.002 [INFO][3829] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.002 [INFO][3829] cni-plugin/dataplane_linux.go 559: Deleting workload's 
device in netns. ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" iface="eth0" netns="/var/run/netns/cni-af8db94a-ca41-4bd2-2952-8c3a2a55348c" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.004 [INFO][3829] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" iface="eth0" netns="/var/run/netns/cni-af8db94a-ca41-4bd2-2952-8c3a2a55348c" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.005 [INFO][3829] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" iface="eth0" netns="/var/run/netns/cni-af8db94a-ca41-4bd2-2952-8c3a2a55348c" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.005 [INFO][3829] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.005 [INFO][3829] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.137 [INFO][3883] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.155 [INFO][3883] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.414 [INFO][3883] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.447 [WARNING][3883] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.447 [INFO][3883] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.463 [INFO][3883] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.511878 containerd[1467]: 2026-04-16 00:24:49.478 [INFO][3829] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:24:49.545528 containerd[1467]: time="2026-04-16T00:24:49.544837772Z" level=info msg="TearDown network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\" successfully" Apr 16 00:24:49.545528 containerd[1467]: time="2026-04-16T00:24:49.544871414Z" level=info msg="StopPodSandbox for \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\" returns successfully" Apr 16 00:24:49.547077 containerd[1467]: time="2026-04-16T00:24:49.546893199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-xvg2d,Uid:071c8c0b-162d-4eea-a8c8-a1554ee321bf,Namespace:calico-system,Attempt:1,}" Apr 16 00:24:49.607914 systemd-networkd[1382]: calidb50fab48aa: Link UP Apr 16 00:24:49.620911 systemd-networkd[1382]: calidb50fab48aa: Gained carrier Apr 16 00:24:49.651755 kubelet[2577]: I0416 00:24:49.651582 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.223 [ERROR][3913] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.272 [INFO][3913] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0 calico-kube-controllers-5bcb8465df- calico-system ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84 910 0 2026-04-16 00:24:33 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5bcb8465df projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d 
calico-kube-controllers-5bcb8465df-zbc8b eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidb50fab48aa [] [] }} ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.272 [INFO][3913] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.433 [INFO][3948] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" HandleID="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.481 [INFO][3948] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" HandleID="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000307f20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"calico-kube-controllers-5bcb8465df-zbc8b", "timestamp":"2026-04-16 00:24:49.433713205 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400034e160)} Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.481 [INFO][3948] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.481 [INFO][3948] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.481 [INFO][3948] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.488 [INFO][3948] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.502 [INFO][3948] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.523 [INFO][3948] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.529 [INFO][3948] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.535 [INFO][3948] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.535 [INFO][3948] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.542 [INFO][3948] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44 Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.552 [INFO][3948] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.565 [INFO][3948] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.72.1/26] block=192.168.72.0/26 handle="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.568 [INFO][3948] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.1/26] handle="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.568 [INFO][3948] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 16 00:24:49.734734 containerd[1467]: 2026-04-16 00:24:49.568 [INFO][3948] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.1/26] IPv6=[] ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" HandleID="k8s-pod-network.830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.737758 containerd[1467]: 2026-04-16 00:24:49.586 [INFO][3913] cni-plugin/k8s.go 418: Populated endpoint ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0", GenerateName:"calico-kube-controllers-5bcb8465df-", Namespace:"calico-system", SelfLink:"", UID:"ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bcb8465df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"calico-kube-controllers-5bcb8465df-zbc8b", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidb50fab48aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:49.737758 containerd[1467]: 2026-04-16 00:24:49.586 [INFO][3913] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.1/32] ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.737758 containerd[1467]: 2026-04-16 00:24:49.586 [INFO][3913] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb50fab48aa ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.737758 containerd[1467]: 2026-04-16 00:24:49.666 [INFO][3913] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.737758 containerd[1467]: 2026-04-16 00:24:49.670 [INFO][3913] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" Pod="calico-kube-controllers-5bcb8465df-zbc8b" 
WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0", GenerateName:"calico-kube-controllers-5bcb8465df-", Namespace:"calico-system", SelfLink:"", UID:"ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bcb8465df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44", Pod:"calico-kube-controllers-5bcb8465df-zbc8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidb50fab48aa", MAC:"8a:10:74:81:20:77", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:49.737758 containerd[1467]: 2026-04-16 00:24:49.718 [INFO][3913] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44" Namespace="calico-system" 
Pod="calico-kube-controllers-5bcb8465df-zbc8b" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:24:49.800845 systemd[1]: Created slice kubepods-besteffort-pod9d1533e4_c092_4636_9d1e_9189de61a887.slice - libcontainer container kubepods-besteffort-pod9d1533e4_c092_4636_9d1e_9189de61a887.slice. Apr 16 00:24:49.830379 systemd-networkd[1382]: calidf2d11505cc: Link UP Apr 16 00:24:49.831534 systemd-networkd[1382]: calidf2d11505cc: Gained carrier Apr 16 00:24:49.860341 kubelet[2577]: I0416 00:24:49.860040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d1533e4-c092-4636-9d1e-9189de61a887-whisker-ca-bundle\") pod \"whisker-656455b6cd-4wsm8\" (UID: \"9d1533e4-c092-4636-9d1e-9189de61a887\") " pod="calico-system/whisker-656455b6cd-4wsm8" Apr 16 00:24:49.860341 kubelet[2577]: I0416 00:24:49.860098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d1533e4-c092-4636-9d1e-9189de61a887-whisker-backend-key-pair\") pod \"whisker-656455b6cd-4wsm8\" (UID: \"9d1533e4-c092-4636-9d1e-9189de61a887\") " pod="calico-system/whisker-656455b6cd-4wsm8" Apr 16 00:24:49.860341 kubelet[2577]: I0416 00:24:49.860123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjpv2\" (UniqueName: \"kubernetes.io/projected/9d1533e4-c092-4636-9d1e-9189de61a887-kube-api-access-tjpv2\") pod \"whisker-656455b6cd-4wsm8\" (UID: \"9d1533e4-c092-4636-9d1e-9189de61a887\") " pod="calico-system/whisker-656455b6cd-4wsm8" Apr 16 00:24:49.860341 kubelet[2577]: I0416 00:24:49.860143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: 
\"kubernetes.io/configmap/9d1533e4-c092-4636-9d1e-9189de61a887-nginx-config\") pod \"whisker-656455b6cd-4wsm8\" (UID: \"9d1533e4-c092-4636-9d1e-9189de61a887\") " pod="calico-system/whisker-656455b6cd-4wsm8" Apr 16 00:24:49.874329 containerd[1467]: time="2026-04-16T00:24:49.874022615Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:49.874329 containerd[1467]: time="2026-04-16T00:24:49.874078619Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:49.874329 containerd[1467]: time="2026-04-16T00:24:49.874090099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:49.874329 containerd[1467]: time="2026-04-16T00:24:49.874166145Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.238 [ERROR][3902] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.314 [INFO][3902] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0 calico-apiserver-688cbf68ff- calico-system 4679abf7-027e-48d1-9202-b1dbdd5b8949 911 0 2026-04-16 00:24:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:688cbf68ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d calico-apiserver-688cbf68ff-qht6h 
eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calidf2d11505cc [] [] }} ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.352 [INFO][3902] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.491 [INFO][3960] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" HandleID="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.522 [INFO][3960] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" HandleID="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000273dc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"calico-apiserver-688cbf68ff-qht6h", "timestamp":"2026-04-16 00:24:49.491566652 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400010c2c0)} Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.522 [INFO][3960] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.571 [INFO][3960] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.571 [INFO][3960] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.590 [INFO][3960] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.666 [INFO][3960] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.689 [INFO][3960] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.707 [INFO][3960] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.721 [INFO][3960] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.721 [INFO][3960] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.736 [INFO][3960] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42 Apr 16 00:24:49.898107 containerd[1467]: 
2026-04-16 00:24:49.756 [INFO][3960] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.807 [INFO][3960] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.72.2/26] block=192.168.72.0/26 handle="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.807 [INFO][3960] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.2/26] handle="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.807 [INFO][3960] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:49.898107 containerd[1467]: 2026-04-16 00:24:49.807 [INFO][3960] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.2/26] IPv6=[] ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" HandleID="k8s-pod-network.4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.898990 containerd[1467]: 2026-04-16 00:24:49.813 [INFO][3902] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0", GenerateName:"calico-apiserver-688cbf68ff-", 
Namespace:"calico-system", SelfLink:"", UID:"4679abf7-027e-48d1-9202-b1dbdd5b8949", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"calico-apiserver-688cbf68ff-qht6h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calidf2d11505cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:49.898990 containerd[1467]: 2026-04-16 00:24:49.814 [INFO][3902] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.2/32] ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.898990 containerd[1467]: 2026-04-16 00:24:49.815 [INFO][3902] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidf2d11505cc ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" 
WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.898990 containerd[1467]: 2026-04-16 00:24:49.836 [INFO][3902] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.898990 containerd[1467]: 2026-04-16 00:24:49.850 [INFO][3902] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"4679abf7-027e-48d1-9202-b1dbdd5b8949", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", 
ContainerID:"4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42", Pod:"calico-apiserver-688cbf68ff-qht6h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calidf2d11505cc", MAC:"96:ef:b5:1f:b6:1c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:49.898990 containerd[1467]: 2026-04-16 00:24:49.881 [INFO][3902] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-qht6h" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:24:49.971509 systemd[1]: Started cri-containerd-830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44.scope - libcontainer container 830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44. Apr 16 00:24:50.025055 systemd-networkd[1382]: califbd1c1f0916: Link UP Apr 16 00:24:50.027797 systemd-networkd[1382]: califbd1c1f0916: Gained carrier Apr 16 00:24:50.040623 systemd[1]: run-netns-cni\x2dbaec5674\x2d6791\x2dc372\x2da403\x2d6843c99144ec.mount: Deactivated successfully. Apr 16 00:24:50.040722 systemd[1]: run-netns-cni\x2daf8db94a\x2dca41\x2d4bd2\x2d2952\x2d8c3a2a55348c.mount: Deactivated successfully. Apr 16 00:24:50.040768 systemd[1]: run-netns-cni\x2df182fda5\x2db3b4\x2de683\x2d36e9\x2d9a66993b3e7e.mount: Deactivated successfully. Apr 16 00:24:50.040812 systemd[1]: run-netns-cni\x2d7f9b7e98\x2deb8f\x2d5b5d\x2d95cf\x2de086661ba946.mount: Deactivated successfully. Apr 16 00:24:50.040855 systemd[1]: run-netns-cni\x2df4e14f13\x2d5731\x2df4f7\x2d5f62\x2d94fda79074e7.mount: Deactivated successfully. 
Apr 16 00:24:50.040899 systemd[1]: run-netns-cni\x2d8b48dfc5\x2d1f7b\x2d8547\x2d0a7e\x2d8b0eae616aed.mount: Deactivated successfully. Apr 16 00:24:50.040950 systemd[1]: var-lib-kubelet-pods-085a0cd1\x2df75e\x2d4874\x2db3c4\x2d63142da8b2f2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d44kj4.mount: Deactivated successfully. Apr 16 00:24:50.041005 systemd[1]: var-lib-kubelet-pods-085a0cd1\x2df75e\x2d4874\x2db3c4\x2d63142da8b2f2-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.307 [ERROR][3923] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.433 [INFO][3923] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0 coredns-674b8bbfcf- kube-system 801cc257-2986-4d78-aabb-a2b3b76027fd 909 0 2026-04-16 00:24:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d coredns-674b8bbfcf-7nrz4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califbd1c1f0916 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.434 [INFO][3923] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.667 [INFO][3972] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" HandleID="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.723 [INFO][3972] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" HandleID="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003bf880), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"coredns-674b8bbfcf-7nrz4", "timestamp":"2026-04-16 00:24:49.667585128 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000ca580)} Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.723 [INFO][3972] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.807 [INFO][3972] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.807 [INFO][3972] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.848 [INFO][3972] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.893 [INFO][3972] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.923 [INFO][3972] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.931 [INFO][3972] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.941 [INFO][3972] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.942 [INFO][3972] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.958 [INFO][3972] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.980 [INFO][3972] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.998 [INFO][3972] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.72.3/26] block=192.168.72.0/26 handle="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.998 [INFO][3972] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.3/26] handle="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.999 [INFO][3972] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:50.207357 containerd[1467]: 2026-04-16 00:24:49.999 [INFO][3972] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.3/26] IPv6=[] ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" HandleID="k8s-pod-network.0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.210707 containerd[1467]: 2026-04-16 00:24:50.009 [INFO][3923] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"801cc257-2986-4d78-aabb-a2b3b76027fd", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"coredns-674b8bbfcf-7nrz4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califbd1c1f0916", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.210707 containerd[1467]: 2026-04-16 00:24:50.010 [INFO][3923] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.3/32] ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.210707 containerd[1467]: 2026-04-16 00:24:50.010 [INFO][3923] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califbd1c1f0916 ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.210707 containerd[1467]: 2026-04-16 00:24:50.029 [INFO][3923] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.210707 containerd[1467]: 2026-04-16 00:24:50.070 [INFO][3923] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"801cc257-2986-4d78-aabb-a2b3b76027fd", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e", Pod:"coredns-674b8bbfcf-7nrz4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califbd1c1f0916", 
MAC:"c6:be:67:19:c4:74", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.210707 containerd[1467]: 2026-04-16 00:24:50.091 [INFO][3923] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e" Namespace="kube-system" Pod="coredns-674b8bbfcf-7nrz4" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:24:50.216935 containerd[1467]: time="2026-04-16T00:24:50.216890662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-656455b6cd-4wsm8,Uid:9d1533e4-c092-4636-9d1e-9189de61a887,Namespace:calico-system,Attempt:0,}" Apr 16 00:24:50.262099 containerd[1467]: time="2026-04-16T00:24:50.261480128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcb8465df-zbc8b,Uid:ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84,Namespace:calico-system,Attempt:1,} returns sandbox id \"830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44\"" Apr 16 00:24:50.284792 containerd[1467]: time="2026-04-16T00:24:50.283764660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 16 00:24:50.311329 systemd-networkd[1382]: cali206a03830d6: Link UP Apr 16 00:24:50.313918 systemd-networkd[1382]: cali206a03830d6: Gained carrier Apr 16 00:24:50.332132 containerd[1467]: time="2026-04-16T00:24:50.330437472Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:50.332132 containerd[1467]: time="2026-04-16T00:24:50.330515318Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:50.332132 containerd[1467]: time="2026-04-16T00:24:50.330530119Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.332132 containerd[1467]: time="2026-04-16T00:24:50.330623366Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:49.775 [ERROR][3999] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:49.881 [INFO][3999] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0 coredns-674b8bbfcf- kube-system 5ade9d9c-9070-44d0-8989-12094cc3969e 908 0 2026-04-16 00:24:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d coredns-674b8bbfcf-pjng2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali206a03830d6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:49.881 [INFO][3999] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.144 [INFO][4125] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" HandleID="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.171 [INFO][4125] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" HandleID="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000414660), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"coredns-674b8bbfcf-pjng2", "timestamp":"2026-04-16 00:24:50.144750333 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002002c0)} Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.171 [INFO][4125] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.171 [INFO][4125] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.171 [INFO][4125] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.178 [INFO][4125] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.197 [INFO][4125] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.216 [INFO][4125] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.223 [INFO][4125] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.228 [INFO][4125] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.228 [INFO][4125] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.233 [INFO][4125] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961 Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.244 [INFO][4125] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4125] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.72.4/26] block=192.168.72.0/26 handle="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4125] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.4/26] handle="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4125] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:50.365558 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4125] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.4/26] IPv6=[] ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" HandleID="k8s-pod-network.3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.367307 containerd[1467]: 2026-04-16 00:24:50.283 [INFO][3999] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5ade9d9c-9070-44d0-8989-12094cc3969e", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"coredns-674b8bbfcf-pjng2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali206a03830d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.367307 containerd[1467]: 2026-04-16 00:24:50.283 [INFO][3999] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.4/32] ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.367307 containerd[1467]: 2026-04-16 00:24:50.283 [INFO][3999] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali206a03830d6 ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.367307 containerd[1467]: 2026-04-16 00:24:50.315 [INFO][3999] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.367307 containerd[1467]: 2026-04-16 00:24:50.323 [INFO][3999] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5ade9d9c-9070-44d0-8989-12094cc3969e", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961", Pod:"coredns-674b8bbfcf-pjng2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali206a03830d6", 
MAC:"a2:9f:27:b3:54:f8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.367307 containerd[1467]: 2026-04-16 00:24:50.345 [INFO][3999] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961" Namespace="kube-system" Pod="coredns-674b8bbfcf-pjng2" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:24:50.427296 systemd[1]: Started cri-containerd-4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42.scope - libcontainer container 4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42. Apr 16 00:24:50.437361 containerd[1467]: time="2026-04-16T00:24:50.437008471Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:50.439344 containerd[1467]: time="2026-04-16T00:24:50.437657316Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:50.439344 containerd[1467]: time="2026-04-16T00:24:50.437683598Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.439344 containerd[1467]: time="2026-04-16T00:24:50.437895853Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.456394 systemd-networkd[1382]: cali32e8cb1a426: Link UP Apr 16 00:24:50.458787 systemd-networkd[1382]: cali32e8cb1a426: Gained carrier Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:49.791 [ERROR][4051] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:49.872 [INFO][4051] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0 calico-apiserver-688cbf68ff- calico-system eb8562da-f5ee-41ba-a284-e8ce170cf70d 914 0 2026-04-16 00:24:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:688cbf68ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d calico-apiserver-688cbf68ff-cmgfm eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali32e8cb1a426 [] [] }} ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:49.872 [INFO][4051] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.223 [INFO][4121] ipam/ipam_plugin.go 235: Calico 
CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" HandleID="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4121] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" HandleID="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000373e70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"calico-apiserver-688cbf68ff-cmgfm", "timestamp":"2026-04-16 00:24:50.223103501 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002229a0)} Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4121] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4121] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.271 [INFO][4121] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.281 [INFO][4121] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.308 [INFO][4121] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.329 [INFO][4121] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.336 [INFO][4121] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.346 [INFO][4121] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.346 [INFO][4121] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.362 [INFO][4121] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2 Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.403 [INFO][4121] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.421 [INFO][4121] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.72.5/26] block=192.168.72.0/26 handle="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.421 [INFO][4121] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.5/26] handle="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.421 [INFO][4121] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:50.517686 containerd[1467]: 2026-04-16 00:24:50.421 [INFO][4121] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.5/26] IPv6=[] ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" HandleID="k8s-pod-network.8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.518849 containerd[1467]: 2026-04-16 00:24:50.434 [INFO][4051] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"eb8562da-f5ee-41ba-a284-e8ce170cf70d", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"calico-apiserver-688cbf68ff-cmgfm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali32e8cb1a426", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.518849 containerd[1467]: 2026-04-16 00:24:50.438 [INFO][4051] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.5/32] ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.518849 containerd[1467]: 2026-04-16 00:24:50.439 [INFO][4051] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali32e8cb1a426 ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.518849 containerd[1467]: 2026-04-16 00:24:50.458 [INFO][4051] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" 
WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.518849 containerd[1467]: 2026-04-16 00:24:50.463 [INFO][4051] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"eb8562da-f5ee-41ba-a284-e8ce170cf70d", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2", Pod:"calico-apiserver-688cbf68ff-cmgfm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali32e8cb1a426", MAC:"16:be:72:f9:fe:29", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.518849 containerd[1467]: 2026-04-16 00:24:50.503 [INFO][4051] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2" Namespace="calico-system" Pod="calico-apiserver-688cbf68ff-cmgfm" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:24:50.518849 containerd[1467]: time="2026-04-16T00:24:50.517312496Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:50.518849 containerd[1467]: time="2026-04-16T00:24:50.517376060Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:50.518849 containerd[1467]: time="2026-04-16T00:24:50.517390781Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.518849 containerd[1467]: time="2026-04-16T00:24:50.517488988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.531936 systemd[1]: Started cri-containerd-0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e.scope - libcontainer container 0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e. Apr 16 00:24:50.599693 systemd-networkd[1382]: calif64780672a1: Link UP Apr 16 00:24:50.602686 systemd-networkd[1382]: calif64780672a1: Gained carrier Apr 16 00:24:50.618282 systemd[1]: Started cri-containerd-3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961.scope - libcontainer container 3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961. 
Apr 16 00:24:50.649355 containerd[1467]: time="2026-04-16T00:24:50.647858185Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:50.649355 containerd[1467]: time="2026-04-16T00:24:50.647938151Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:50.649355 containerd[1467]: time="2026-04-16T00:24:50.647953032Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.649355 containerd[1467]: time="2026-04-16T00:24:50.648048038Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:49.730 [ERROR][3987] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:49.847 [INFO][3987] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0 csi-node-driver- calico-system 2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec 912 0 2026-04-16 00:24:32 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d csi-node-driver-ptvrb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif64780672a1 [] [] }} ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" 
Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:49.851 [INFO][3987] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.152 [INFO][4119] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" HandleID="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.182 [INFO][4119] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" HandleID="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000267f30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"csi-node-driver-ptvrb", "timestamp":"2026-04-16 00:24:50.152112652 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000ce6e0)} Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.182 [INFO][4119] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.421 [INFO][4119] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.422 [INFO][4119] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.430 [INFO][4119] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.452 [INFO][4119] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.489 [INFO][4119] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.501 [INFO][4119] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.514 [INFO][4119] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.514 [INFO][4119] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.522 [INFO][4119] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.537 [INFO][4119] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" 
host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.562 [INFO][4119] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.72.6/26] block=192.168.72.0/26 handle="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.562 [INFO][4119] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.6/26] handle="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.562 [INFO][4119] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:50.686971 containerd[1467]: 2026-04-16 00:24:50.562 [INFO][4119] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.6/26] IPv6=[] ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" HandleID="k8s-pod-network.51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.688144 containerd[1467]: 2026-04-16 00:24:50.580 [INFO][3987] cni-plugin/k8s.go 418: Populated endpoint ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"csi-node-driver-ptvrb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif64780672a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.688144 containerd[1467]: 2026-04-16 00:24:50.580 [INFO][3987] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.6/32] ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.688144 containerd[1467]: 2026-04-16 00:24:50.581 [INFO][3987] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif64780672a1 ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.688144 containerd[1467]: 2026-04-16 00:24:50.607 [INFO][3987] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" 
Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.688144 containerd[1467]: 2026-04-16 00:24:50.610 [INFO][3987] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc", Pod:"csi-node-driver-ptvrb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif64780672a1", MAC:"3e:c8:7a:7f:43:af", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.688144 containerd[1467]: 2026-04-16 00:24:50.662 [INFO][3987] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc" Namespace="calico-system" Pod="csi-node-driver-ptvrb" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:24:50.732126 containerd[1467]: time="2026-04-16T00:24:50.729406458Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:50.732126 containerd[1467]: time="2026-04-16T00:24:50.729458502Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:50.732126 containerd[1467]: time="2026-04-16T00:24:50.729469222Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.732126 containerd[1467]: time="2026-04-16T00:24:50.729548708Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.763454 systemd[1]: Started cri-containerd-8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2.scope - libcontainer container 8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2. Apr 16 00:24:50.769416 systemd-networkd[1382]: calie3dd520123b: Link UP Apr 16 00:24:50.769666 systemd-networkd[1382]: calie3dd520123b: Gained carrier Apr 16 00:24:50.789101 systemd[1]: Started cri-containerd-51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc.scope - libcontainer container 51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc. 
Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:49.864 [ERROR][4064] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:49.895 [INFO][4064] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0 goldmane-5b85766d88- calico-system 071c8c0b-162d-4eea-a8c8-a1554ee321bf 915 0 2026-04-16 00:24:31 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d goldmane-5b85766d88-xvg2d eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie3dd520123b [] [] }} ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:49.895 [INFO][4064] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.168 [INFO][4132] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" HandleID="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.794981 
containerd[1467]: 2026-04-16 00:24:50.215 [INFO][4132] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" HandleID="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004b0a90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"goldmane-5b85766d88-xvg2d", "timestamp":"2026-04-16 00:24:50.168699343 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004406e0)} Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.215 [INFO][4132] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.563 [INFO][4132] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.563 [INFO][4132] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.573 [INFO][4132] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.596 [INFO][4132] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.613 [INFO][4132] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.622 [INFO][4132] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.634 [INFO][4132] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.636 [INFO][4132] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.643 [INFO][4132] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.672 [INFO][4132] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.719 [INFO][4132] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.72.7/26] block=192.168.72.0/26 handle="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.719 [INFO][4132] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.7/26] handle="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.719 [INFO][4132] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:50.794981 containerd[1467]: 2026-04-16 00:24:50.719 [INFO][4132] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.7/26] IPv6=[] ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" HandleID="k8s-pod-network.77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.795835 containerd[1467]: 2026-04-16 00:24:50.739 [INFO][4064] cni-plugin/k8s.go 418: Populated endpoint ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"071c8c0b-162d-4eea-a8c8-a1554ee321bf", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"goldmane-5b85766d88-xvg2d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie3dd520123b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.795835 containerd[1467]: 2026-04-16 00:24:50.739 [INFO][4064] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.7/32] ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.795835 containerd[1467]: 2026-04-16 00:24:50.739 [INFO][4064] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie3dd520123b ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.795835 containerd[1467]: 2026-04-16 00:24:50.768 [INFO][4064] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.795835 containerd[1467]: 2026-04-16 00:24:50.775 [INFO][4064] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"071c8c0b-162d-4eea-a8c8-a1554ee321bf", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b", Pod:"goldmane-5b85766d88-xvg2d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie3dd520123b", MAC:"9a:63:35:0f:67:d3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:50.795835 containerd[1467]: 2026-04-16 00:24:50.791 [INFO][4064] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b" Namespace="calico-system" Pod="goldmane-5b85766d88-xvg2d" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:24:50.863876 containerd[1467]: time="2026-04-16T00:24:50.863781858Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:50.864116 containerd[1467]: time="2026-04-16T00:24:50.863964830Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:50.864116 containerd[1467]: time="2026-04-16T00:24:50.864003913Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.864440 containerd[1467]: time="2026-04-16T00:24:50.864396581Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:50.898824 containerd[1467]: time="2026-04-16T00:24:50.898784687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-qht6h,Uid:4679abf7-027e-48d1-9202-b1dbdd5b8949,Namespace:calico-system,Attempt:1,} returns sandbox id \"4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42\"" Apr 16 00:24:50.905482 containerd[1467]: time="2026-04-16T00:24:50.905153736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-7nrz4,Uid:801cc257-2986-4d78-aabb-a2b3b76027fd,Namespace:kube-system,Attempt:1,} returns sandbox id \"0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e\"" Apr 16 00:24:50.937101 containerd[1467]: time="2026-04-16T00:24:50.937038345Z" level=info msg="CreateContainer within sandbox \"0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 16 00:24:50.969616 
systemd[1]: Started cri-containerd-77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b.scope - libcontainer container 77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b. Apr 16 00:24:50.977988 containerd[1467]: time="2026-04-16T00:24:50.977659451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-pjng2,Uid:5ade9d9c-9070-44d0-8989-12094cc3969e,Namespace:kube-system,Attempt:1,} returns sandbox id \"3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961\"" Apr 16 00:24:50.983895 systemd-networkd[1382]: cali5a27aa82ef7: Link UP Apr 16 00:24:50.984142 systemd-networkd[1382]: cali5a27aa82ef7: Gained carrier Apr 16 00:24:50.997687 containerd[1467]: time="2026-04-16T00:24:50.997446127Z" level=info msg="CreateContainer within sandbox \"3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.530 [ERROR][4215] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.584 [INFO][4215] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0 whisker-656455b6cd- calico-system 9d1533e4-c092-4636-9d1e-9189de61a887 936 0 2026-04-16 00:24:49 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:656455b6cd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-56c15b786d whisker-656455b6cd-4wsm8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali5a27aa82ef7 [] [] }} ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" 
Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.584 [INFO][4215] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.802 [INFO][4347] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" HandleID="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.848 [INFO][4347] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" HandleID="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f2d00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-56c15b786d", "pod":"whisker-656455b6cd-4wsm8", "timestamp":"2026-04-16 00:24:50.802958887 +0000 UTC"}, Hostname:"ci-4081-3-6-n-56c15b786d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000192000)} Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.848 [INFO][4347] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.848 [INFO][4347] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.848 [INFO][4347] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-56c15b786d' Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.880 [INFO][4347] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.902 [INFO][4347] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.921 [INFO][4347] ipam/ipam.go 526: Trying affinity for 192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.927 [INFO][4347] ipam/ipam.go 160: Attempting to load block cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.934 [INFO][4347] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.934 [INFO][4347] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.938 [INFO][4347] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19 Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.950 [INFO][4347] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" 
host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.970 [INFO][4347] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.72.8/26] block=192.168.72.0/26 handle="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.971 [INFO][4347] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.72.8/26] handle="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" host="ci-4081-3-6-n-56c15b786d" Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.971 [INFO][4347] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:24:51.021286 containerd[1467]: 2026-04-16 00:24:50.971 [INFO][4347] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.72.8/26] IPv6=[] ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" HandleID="k8s-pod-network.ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.021840 containerd[1467]: 2026-04-16 00:24:50.976 [INFO][4215] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0", GenerateName:"whisker-656455b6cd-", Namespace:"calico-system", SelfLink:"", UID:"9d1533e4-c092-4636-9d1e-9189de61a887", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"656455b6cd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"", Pod:"whisker-656455b6cd-4wsm8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5a27aa82ef7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:51.021840 containerd[1467]: 2026-04-16 00:24:50.977 [INFO][4215] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.8/32] ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.021840 containerd[1467]: 2026-04-16 00:24:50.977 [INFO][4215] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a27aa82ef7 ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.021840 containerd[1467]: 2026-04-16 00:24:50.980 [INFO][4215] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" 
WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.021840 containerd[1467]: 2026-04-16 00:24:50.980 [INFO][4215] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0", GenerateName:"whisker-656455b6cd-", Namespace:"calico-system", SelfLink:"", UID:"9d1533e4-c092-4636-9d1e-9189de61a887", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"656455b6cd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19", Pod:"whisker-656455b6cd-4wsm8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5a27aa82ef7", MAC:"da:42:21:05:0b:a4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:24:51.021840 
containerd[1467]: 2026-04-16 00:24:51.002 [INFO][4215] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19" Namespace="calico-system" Pod="whisker-656455b6cd-4wsm8" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--656455b6cd--4wsm8-eth0" Apr 16 00:24:51.040848 containerd[1467]: time="2026-04-16T00:24:51.040543137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptvrb,Uid:2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec,Namespace:calico-system,Attempt:1,} returns sandbox id \"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc\"" Apr 16 00:24:51.063519 containerd[1467]: time="2026-04-16T00:24:51.063201399Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:24:51.063519 containerd[1467]: time="2026-04-16T00:24:51.063291885Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:24:51.063519 containerd[1467]: time="2026-04-16T00:24:51.063312087Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:51.063519 containerd[1467]: time="2026-04-16T00:24:51.063440736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:24:51.102663 containerd[1467]: time="2026-04-16T00:24:51.101981786Z" level=info msg="CreateContainer within sandbox \"0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e5dff9cd9fc662d1c6f4faa01491cf28a8e42bb43127a0760ac8113c364ae99b\"" Apr 16 00:24:51.103657 systemd-networkd[1382]: califbd1c1f0916: Gained IPv6LL Apr 16 00:24:51.104841 systemd-networkd[1382]: calidf2d11505cc: Gained IPv6LL Apr 16 00:24:51.110022 containerd[1467]: time="2026-04-16T00:24:51.109604798Z" level=info msg="StartContainer for \"e5dff9cd9fc662d1c6f4faa01491cf28a8e42bb43127a0760ac8113c364ae99b\"" Apr 16 00:24:51.113925 systemd[1]: Started cri-containerd-ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19.scope - libcontainer container ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19. Apr 16 00:24:51.115116 containerd[1467]: time="2026-04-16T00:24:51.114995374Z" level=info msg="CreateContainer within sandbox \"3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1324c2a3103cb931374a95a17661d0886ff185dc3547ebe283b45333482a568b\"" Apr 16 00:24:51.117110 containerd[1467]: time="2026-04-16T00:24:51.116566484Z" level=info msg="StartContainer for \"1324c2a3103cb931374a95a17661d0886ff185dc3547ebe283b45333482a568b\"" Apr 16 00:24:51.188133 kubelet[2577]: I0416 00:24:51.184973 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:24:51.223893 containerd[1467]: time="2026-04-16T00:24:51.221875395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-688cbf68ff-cmgfm,Uid:eb8562da-f5ee-41ba-a284-e8ce170cf70d,Namespace:calico-system,Attempt:1,} returns sandbox id \"8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2\"" Apr 16 00:24:51.236621 containerd[1467]: 
time="2026-04-16T00:24:51.235356776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-656455b6cd-4wsm8,Uid:9d1533e4-c092-4636-9d1e-9189de61a887,Namespace:calico-system,Attempt:0,} returns sandbox id \"ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19\"" Apr 16 00:24:51.282073 systemd[1]: Started cri-containerd-1324c2a3103cb931374a95a17661d0886ff185dc3547ebe283b45333482a568b.scope - libcontainer container 1324c2a3103cb931374a95a17661d0886ff185dc3547ebe283b45333482a568b. Apr 16 00:24:51.293574 systemd[1]: Started cri-containerd-e5dff9cd9fc662d1c6f4faa01491cf28a8e42bb43127a0760ac8113c364ae99b.scope - libcontainer container e5dff9cd9fc662d1c6f4faa01491cf28a8e42bb43127a0760ac8113c364ae99b. Apr 16 00:24:51.335362 containerd[1467]: time="2026-04-16T00:24:51.335261110Z" level=info msg="StartContainer for \"e5dff9cd9fc662d1c6f4faa01491cf28a8e42bb43127a0760ac8113c364ae99b\" returns successfully" Apr 16 00:24:51.364678 kubelet[2577]: I0416 00:24:51.364538 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085a0cd1-f75e-4874-b3c4-63142da8b2f2" path="/var/lib/kubelet/pods/085a0cd1-f75e-4874-b3c4-63142da8b2f2/volumes" Apr 16 00:24:51.367641 containerd[1467]: time="2026-04-16T00:24:51.367134175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-xvg2d,Uid:071c8c0b-162d-4eea-a8c8-a1554ee321bf,Namespace:calico-system,Attempt:1,} returns sandbox id \"77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b\"" Apr 16 00:24:51.405589 containerd[1467]: time="2026-04-16T00:24:51.405538576Z" level=info msg="StartContainer for \"1324c2a3103cb931374a95a17661d0886ff185dc3547ebe283b45333482a568b\" returns successfully" Apr 16 00:24:51.487691 systemd-networkd[1382]: calidb50fab48aa: Gained IPv6LL Apr 16 00:24:51.738683 kubelet[2577]: I0416 00:24:51.737878 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-7nrz4" 
podStartSLOduration=32.737855253 podStartE2EDuration="32.737855253s" podCreationTimestamp="2026-04-16 00:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:24:51.719976245 +0000 UTC m=+38.492076341" watchObservedRunningTime="2026-04-16 00:24:51.737855253 +0000 UTC m=+38.509955389" Apr 16 00:24:51.807902 systemd-networkd[1382]: cali32e8cb1a426: Gained IPv6LL Apr 16 00:24:51.999678 systemd-networkd[1382]: calif64780672a1: Gained IPv6LL Apr 16 00:24:52.191535 systemd-networkd[1382]: cali206a03830d6: Gained IPv6LL Apr 16 00:24:52.305312 kernel: calico-node[4659]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Apr 16 00:24:52.639576 systemd-networkd[1382]: calie3dd520123b: Gained IPv6LL Apr 16 00:24:52.706325 systemd-networkd[1382]: vxlan.calico: Link UP Apr 16 00:24:52.706336 systemd-networkd[1382]: vxlan.calico: Gained carrier Apr 16 00:24:52.769526 kubelet[2577]: I0416 00:24:52.768868 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-pjng2" podStartSLOduration=33.768843968 podStartE2EDuration="33.768843968s" podCreationTimestamp="2026-04-16 00:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:24:51.737679921 +0000 UTC m=+38.509779977" watchObservedRunningTime="2026-04-16 00:24:52.768843968 +0000 UTC m=+39.540944064" Apr 16 00:24:52.831620 systemd-networkd[1382]: cali5a27aa82ef7: Gained IPv6LL Apr 16 00:24:53.791921 systemd-networkd[1382]: vxlan.calico: Gained IPv6LL Apr 16 00:24:54.229049 containerd[1467]: time="2026-04-16T00:24:54.228998500Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:54.230456 containerd[1467]: time="2026-04-16T00:24:54.230396835Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Apr 16 00:24:54.230786 containerd[1467]: time="2026-04-16T00:24:54.230733377Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:54.239602 containerd[1467]: time="2026-04-16T00:24:54.239251475Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:54.240137 containerd[1467]: time="2026-04-16T00:24:54.240086092Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.956279069s" Apr 16 00:24:54.240137 containerd[1467]: time="2026-04-16T00:24:54.240120614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Apr 16 00:24:54.242861 containerd[1467]: time="2026-04-16T00:24:54.242545179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 16 00:24:54.267961 containerd[1467]: time="2026-04-16T00:24:54.267914340Z" level=info msg="CreateContainer within sandbox \"830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 16 00:24:54.287459 containerd[1467]: time="2026-04-16T00:24:54.287382741Z" level=info msg="CreateContainer within sandbox 
\"830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e6206b62e51fdfddbeadb99e69613b68c8edefc09c695b70d1442a48dbf66e37\"" Apr 16 00:24:54.291307 containerd[1467]: time="2026-04-16T00:24:54.289336074Z" level=info msg="StartContainer for \"e6206b62e51fdfddbeadb99e69613b68c8edefc09c695b70d1442a48dbf66e37\"" Apr 16 00:24:54.327015 systemd[1]: Started cri-containerd-e6206b62e51fdfddbeadb99e69613b68c8edefc09c695b70d1442a48dbf66e37.scope - libcontainer container e6206b62e51fdfddbeadb99e69613b68c8edefc09c695b70d1442a48dbf66e37. Apr 16 00:24:54.370391 containerd[1467]: time="2026-04-16T00:24:54.370342850Z" level=info msg="StartContainer for \"e6206b62e51fdfddbeadb99e69613b68c8edefc09c695b70d1442a48dbf66e37\" returns successfully" Apr 16 00:24:54.841950 kubelet[2577]: I0416 00:24:54.841854 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5bcb8465df-zbc8b" podStartSLOduration=17.881153185 podStartE2EDuration="21.841825879s" podCreationTimestamp="2026-04-16 00:24:33 +0000 UTC" firstStartedPulling="2026-04-16 00:24:50.281124154 +0000 UTC m=+37.053224210" lastFinishedPulling="2026-04-16 00:24:54.241796808 +0000 UTC m=+41.013896904" observedRunningTime="2026-04-16 00:24:54.769302118 +0000 UTC m=+41.541402214" watchObservedRunningTime="2026-04-16 00:24:54.841825879 +0000 UTC m=+41.613926015" Apr 16 00:24:56.503303 containerd[1467]: time="2026-04-16T00:24:56.502491503Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:56.504926 containerd[1467]: time="2026-04-16T00:24:56.504889023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Apr 16 00:24:56.506837 containerd[1467]: time="2026-04-16T00:24:56.506805431Z" level=info msg="ImageCreate event 
name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:56.510496 containerd[1467]: time="2026-04-16T00:24:56.510437753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:56.514299 containerd[1467]: time="2026-04-16T00:24:56.512391243Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 2.269799381s" Apr 16 00:24:56.514299 containerd[1467]: time="2026-04-16T00:24:56.512430326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 16 00:24:56.519883 containerd[1467]: time="2026-04-16T00:24:56.519645968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 16 00:24:56.536843 containerd[1467]: time="2026-04-16T00:24:56.536801192Z" level=info msg="CreateContainer within sandbox \"4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 16 00:24:56.558676 containerd[1467]: time="2026-04-16T00:24:56.558628329Z" level=info msg="CreateContainer within sandbox \"4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ab6ebf8308f7d1f29f93943dedb7b2c2a4a816f589b80bd1a37c5ad1150b89d3\"" Apr 16 00:24:56.560336 containerd[1467]: time="2026-04-16T00:24:56.559909775Z" level=info msg="StartContainer for 
\"ab6ebf8308f7d1f29f93943dedb7b2c2a4a816f589b80bd1a37c5ad1150b89d3\"" Apr 16 00:24:56.602514 systemd[1]: Started cri-containerd-ab6ebf8308f7d1f29f93943dedb7b2c2a4a816f589b80bd1a37c5ad1150b89d3.scope - libcontainer container ab6ebf8308f7d1f29f93943dedb7b2c2a4a816f589b80bd1a37c5ad1150b89d3. Apr 16 00:24:56.640177 containerd[1467]: time="2026-04-16T00:24:56.640118288Z" level=info msg="StartContainer for \"ab6ebf8308f7d1f29f93943dedb7b2c2a4a816f589b80bd1a37c5ad1150b89d3\" returns successfully" Apr 16 00:24:57.943652 kubelet[2577]: I0416 00:24:57.941253 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-688cbf68ff-qht6h" podStartSLOduration=21.323650315 podStartE2EDuration="26.941236646s" podCreationTimestamp="2026-04-16 00:24:31 +0000 UTC" firstStartedPulling="2026-04-16 00:24:50.901574444 +0000 UTC m=+37.673674540" lastFinishedPulling="2026-04-16 00:24:56.519160775 +0000 UTC m=+43.291260871" observedRunningTime="2026-04-16 00:24:56.783360487 +0000 UTC m=+43.555460583" watchObservedRunningTime="2026-04-16 00:24:57.941236646 +0000 UTC m=+44.713336742" Apr 16 00:24:58.081546 containerd[1467]: time="2026-04-16T00:24:58.081477105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:58.083945 containerd[1467]: time="2026-04-16T00:24:58.083584615Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Apr 16 00:24:58.086863 containerd[1467]: time="2026-04-16T00:24:58.084984581Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:58.090365 containerd[1467]: time="2026-04-16T00:24:58.089499873Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:58.092051 containerd[1467]: time="2026-04-16T00:24:58.091993653Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.572303643s" Apr 16 00:24:58.092051 containerd[1467]: time="2026-04-16T00:24:58.092042892Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Apr 16 00:24:58.095043 containerd[1467]: time="2026-04-16T00:24:58.095006661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 16 00:24:58.103938 containerd[1467]: time="2026-04-16T00:24:58.102859352Z" level=info msg="CreateContainer within sandbox \"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 16 00:24:58.125239 containerd[1467]: time="2026-04-16T00:24:58.125189056Z" level=info msg="CreateContainer within sandbox \"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f3b113b30642c3339705b47e79a302873bca72af2207ecce7f0f4dd582dff581\"" Apr 16 00:24:58.126147 containerd[1467]: time="2026-04-16T00:24:58.126097754Z" level=info msg="StartContainer for \"f3b113b30642c3339705b47e79a302873bca72af2207ecce7f0f4dd582dff581\"" Apr 16 00:24:58.174502 systemd[1]: Started cri-containerd-f3b113b30642c3339705b47e79a302873bca72af2207ecce7f0f4dd582dff581.scope - libcontainer container f3b113b30642c3339705b47e79a302873bca72af2207ecce7f0f4dd582dff581. 
Apr 16 00:24:58.214961 containerd[1467]: time="2026-04-16T00:24:58.213715651Z" level=info msg="StartContainer for \"f3b113b30642c3339705b47e79a302873bca72af2207ecce7f0f4dd582dff581\" returns successfully" Apr 16 00:24:58.505513 containerd[1467]: time="2026-04-16T00:24:58.504506671Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:24:58.505801 containerd[1467]: time="2026-04-16T00:24:58.505670603Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 16 00:24:58.508614 containerd[1467]: time="2026-04-16T00:24:58.508537174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 412.492178ms" Apr 16 00:24:58.508614 containerd[1467]: time="2026-04-16T00:24:58.508605412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 16 00:24:58.510480 containerd[1467]: time="2026-04-16T00:24:58.510443088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 16 00:24:58.515834 containerd[1467]: time="2026-04-16T00:24:58.515625924Z" level=info msg="CreateContainer within sandbox \"8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 16 00:24:58.547506 containerd[1467]: time="2026-04-16T00:24:58.547445760Z" level=info msg="CreateContainer within sandbox \"8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns 
container id \"d36e8109b883c218a8ef65289c2884151521e09cb1a1bfb7c0179d390b57c4ca\"" Apr 16 00:24:58.548262 containerd[1467]: time="2026-04-16T00:24:58.548231141Z" level=info msg="StartContainer for \"d36e8109b883c218a8ef65289c2884151521e09cb1a1bfb7c0179d390b57c4ca\"" Apr 16 00:24:58.607531 systemd[1]: Started cri-containerd-d36e8109b883c218a8ef65289c2884151521e09cb1a1bfb7c0179d390b57c4ca.scope - libcontainer container d36e8109b883c218a8ef65289c2884151521e09cb1a1bfb7c0179d390b57c4ca. Apr 16 00:24:58.645977 containerd[1467]: time="2026-04-16T00:24:58.645904797Z" level=info msg="StartContainer for \"d36e8109b883c218a8ef65289c2884151521e09cb1a1bfb7c0179d390b57c4ca\" returns successfully" Apr 16 00:24:59.777614 kubelet[2577]: I0416 00:24:59.777478 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:25:00.178079 containerd[1467]: time="2026-04-16T00:25:00.177990036Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:00.179460 containerd[1467]: time="2026-04-16T00:25:00.179333090Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Apr 16 00:25:00.182242 containerd[1467]: time="2026-04-16T00:25:00.181036577Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:00.184142 containerd[1467]: time="2026-04-16T00:25:00.184098917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:00.184843 containerd[1467]: time="2026-04-16T00:25:00.184802424Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id 
\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.674315216s" Apr 16 00:25:00.184843 containerd[1467]: time="2026-04-16T00:25:00.184841143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Apr 16 00:25:00.187461 containerd[1467]: time="2026-04-16T00:25:00.187418173Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 16 00:25:00.191122 containerd[1467]: time="2026-04-16T00:25:00.191079301Z" level=info msg="CreateContainer within sandbox \"ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 16 00:25:00.211956 containerd[1467]: time="2026-04-16T00:25:00.210577841Z" level=info msg="CreateContainer within sandbox \"ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"173344829fb5ef92d8d2e4ea7c3b7ad7de854e38e296f6c183f7b041046d6d25\"" Apr 16 00:25:00.213227 containerd[1467]: time="2026-04-16T00:25:00.213069633Z" level=info msg="StartContainer for \"173344829fb5ef92d8d2e4ea7c3b7ad7de854e38e296f6c183f7b041046d6d25\"" Apr 16 00:25:00.215411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1005506434.mount: Deactivated successfully. Apr 16 00:25:00.267481 systemd[1]: Started cri-containerd-173344829fb5ef92d8d2e4ea7c3b7ad7de854e38e296f6c183f7b041046d6d25.scope - libcontainer container 173344829fb5ef92d8d2e4ea7c3b7ad7de854e38e296f6c183f7b041046d6d25. 
Apr 16 00:25:00.308593 containerd[1467]: time="2026-04-16T00:25:00.307719229Z" level=info msg="StartContainer for \"173344829fb5ef92d8d2e4ea7c3b7ad7de854e38e296f6c183f7b041046d6d25\" returns successfully" Apr 16 00:25:02.797892 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2051177413.mount: Deactivated successfully. Apr 16 00:25:03.033658 kubelet[2577]: I0416 00:25:03.031939 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:25:03.248812 containerd[1467]: time="2026-04-16T00:25:03.248006853Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:03.250907 containerd[1467]: time="2026-04-16T00:25:03.250854536Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Apr 16 00:25:03.252677 containerd[1467]: time="2026-04-16T00:25:03.252638872Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:03.259557 containerd[1467]: time="2026-04-16T00:25:03.259499142Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:03.261114 containerd[1467]: time="2026-04-16T00:25:03.261043761Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 3.073358953s" Apr 16 00:25:03.261344 containerd[1467]: time="2026-04-16T00:25:03.261324477Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Apr 16 00:25:03.264482 containerd[1467]: time="2026-04-16T00:25:03.264329918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 16 00:25:03.269627 containerd[1467]: time="2026-04-16T00:25:03.269539409Z" level=info msg="CreateContainer within sandbox \"77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 16 00:25:03.300605 containerd[1467]: time="2026-04-16T00:25:03.300553679Z" level=info msg="CreateContainer within sandbox \"77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56\"" Apr 16 00:25:03.302086 containerd[1467]: time="2026-04-16T00:25:03.302012980Z" level=info msg="StartContainer for \"7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56\"" Apr 16 00:25:03.310174 kubelet[2577]: I0416 00:25:03.308361 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-688cbf68ff-cmgfm" podStartSLOduration=25.035442478 podStartE2EDuration="32.308344936s" podCreationTimestamp="2026-04-16 00:24:31 +0000 UTC" firstStartedPulling="2026-04-16 00:24:51.236920645 +0000 UTC m=+38.009020741" lastFinishedPulling="2026-04-16 00:24:58.509823143 +0000 UTC m=+45.281923199" observedRunningTime="2026-04-16 00:24:58.784452311 +0000 UTC m=+45.556552407" watchObservedRunningTime="2026-04-16 00:25:03.308344936 +0000 UTC m=+50.080445032" Apr 16 00:25:03.353093 systemd[1]: Started cri-containerd-7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56.scope - libcontainer container 7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56. 
Apr 16 00:25:03.412206 containerd[1467]: time="2026-04-16T00:25:03.412055766Z" level=info msg="StartContainer for \"7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56\" returns successfully" Apr 16 00:25:03.824299 kubelet[2577]: I0416 00:25:03.820359 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-xvg2d" podStartSLOduration=20.927620606 podStartE2EDuration="32.82034277s" podCreationTimestamp="2026-04-16 00:24:31 +0000 UTC" firstStartedPulling="2026-04-16 00:24:51.370483129 +0000 UTC m=+38.142583185" lastFinishedPulling="2026-04-16 00:25:03.263205253 +0000 UTC m=+50.035305349" observedRunningTime="2026-04-16 00:25:03.819286424 +0000 UTC m=+50.591386640" watchObservedRunningTime="2026-04-16 00:25:03.82034277 +0000 UTC m=+50.592442906" Apr 16 00:25:03.830107 systemd[1]: run-containerd-runc-k8s.io-7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56-runc.9vy3OS.mount: Deactivated successfully. Apr 16 00:25:05.168812 containerd[1467]: time="2026-04-16T00:25:05.168484121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:05.170867 containerd[1467]: time="2026-04-16T00:25:05.170649301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Apr 16 00:25:05.173296 containerd[1467]: time="2026-04-16T00:25:05.172047088Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:05.175304 containerd[1467]: time="2026-04-16T00:25:05.175238778Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 
00:25:05.176945 containerd[1467]: time="2026-04-16T00:25:05.176877282Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.912178889s" Apr 16 00:25:05.176945 containerd[1467]: time="2026-04-16T00:25:05.176941842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Apr 16 00:25:05.179582 containerd[1467]: time="2026-04-16T00:25:05.179539657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 16 00:25:05.185860 containerd[1467]: time="2026-04-16T00:25:05.185816639Z" level=info msg="CreateContainer within sandbox \"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 16 00:25:05.210627 containerd[1467]: time="2026-04-16T00:25:05.210576207Z" level=info msg="CreateContainer within sandbox \"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"747bb2f2f2148fb927f6d3e31c8da89d2bdb5ce412d786df9e4adce4adaed43d\"" Apr 16 00:25:05.211406 containerd[1467]: time="2026-04-16T00:25:05.211333800Z" level=info msg="StartContainer for \"747bb2f2f2148fb927f6d3e31c8da89d2bdb5ce412d786df9e4adce4adaed43d\"" Apr 16 00:25:05.251527 systemd[1]: Started cri-containerd-747bb2f2f2148fb927f6d3e31c8da89d2bdb5ce412d786df9e4adce4adaed43d.scope - libcontainer container 747bb2f2f2148fb927f6d3e31c8da89d2bdb5ce412d786df9e4adce4adaed43d. 
Apr 16 00:25:05.284967 containerd[1467]: time="2026-04-16T00:25:05.284882312Z" level=info msg="StartContainer for \"747bb2f2f2148fb927f6d3e31c8da89d2bdb5ce412d786df9e4adce4adaed43d\" returns successfully" Apr 16 00:25:05.477697 kubelet[2577]: I0416 00:25:05.477569 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 16 00:25:05.477697 kubelet[2577]: I0416 00:25:05.477608 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 16 00:25:07.338565 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1032817527.mount: Deactivated successfully. Apr 16 00:25:07.357500 containerd[1467]: time="2026-04-16T00:25:07.357442652Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:07.360312 containerd[1467]: time="2026-04-16T00:25:07.360088997Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Apr 16 00:25:07.362235 containerd[1467]: time="2026-04-16T00:25:07.362154585Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:07.365172 containerd[1467]: time="2026-04-16T00:25:07.365086248Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:25:07.366004 containerd[1467]: time="2026-04-16T00:25:07.365960883Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id 
\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.186375506s" Apr 16 00:25:07.366004 containerd[1467]: time="2026-04-16T00:25:07.366005003Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Apr 16 00:25:07.372684 containerd[1467]: time="2026-04-16T00:25:07.372637885Z" level=info msg="CreateContainer within sandbox \"ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 16 00:25:07.394624 containerd[1467]: time="2026-04-16T00:25:07.394423320Z" level=info msg="CreateContainer within sandbox \"ac4583594e847d0ebc688f4979698640115185eb587bed79fb8699871957ab19\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"caedc1d70796c2a04e09d7b8beadd3594bdb1756469ad41c1cc596aa678d8ffc\"" Apr 16 00:25:07.395373 containerd[1467]: time="2026-04-16T00:25:07.395346035Z" level=info msg="StartContainer for \"caedc1d70796c2a04e09d7b8beadd3594bdb1756469ad41c1cc596aa678d8ffc\"" Apr 16 00:25:07.431505 systemd[1]: Started cri-containerd-caedc1d70796c2a04e09d7b8beadd3594bdb1756469ad41c1cc596aa678d8ffc.scope - libcontainer container caedc1d70796c2a04e09d7b8beadd3594bdb1756469ad41c1cc596aa678d8ffc. 
Apr 16 00:25:07.471998 containerd[1467]: time="2026-04-16T00:25:07.471903076Z" level=info msg="StartContainer for \"caedc1d70796c2a04e09d7b8beadd3594bdb1756469ad41c1cc596aa678d8ffc\" returns successfully" Apr 16 00:25:07.830485 kubelet[2577]: I0416 00:25:07.829704 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-ptvrb" podStartSLOduration=21.693307791 podStartE2EDuration="35.829643024s" podCreationTimestamp="2026-04-16 00:24:32 +0000 UTC" firstStartedPulling="2026-04-16 00:24:51.042454071 +0000 UTC m=+37.814554167" lastFinishedPulling="2026-04-16 00:25:05.178789304 +0000 UTC m=+51.950889400" observedRunningTime="2026-04-16 00:25:05.821409172 +0000 UTC m=+52.593509268" watchObservedRunningTime="2026-04-16 00:25:07.829643024 +0000 UTC m=+54.601743160" Apr 16 00:25:07.830485 kubelet[2577]: I0416 00:25:07.829886 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-656455b6cd-4wsm8" podStartSLOduration=2.706444404 podStartE2EDuration="18.829880143s" podCreationTimestamp="2026-04-16 00:24:49 +0000 UTC" firstStartedPulling="2026-04-16 00:24:51.243954176 +0000 UTC m=+38.016054272" lastFinishedPulling="2026-04-16 00:25:07.367389955 +0000 UTC m=+54.139490011" observedRunningTime="2026-04-16 00:25:07.827261558 +0000 UTC m=+54.599361734" watchObservedRunningTime="2026-04-16 00:25:07.829880143 +0000 UTC m=+54.601980199" Apr 16 00:25:13.378982 containerd[1467]: time="2026-04-16T00:25:13.378930020Z" level=info msg="StopPodSandbox for \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\"" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.424 [WARNING][5294] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"071c8c0b-162d-4eea-a8c8-a1554ee321bf", ResourceVersion:"1060", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b", Pod:"goldmane-5b85766d88-xvg2d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie3dd520123b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.424 [INFO][5294] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.424 [INFO][5294] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" iface="eth0" netns="" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.424 [INFO][5294] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.424 [INFO][5294] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.473 [INFO][5301] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.473 [INFO][5301] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.474 [INFO][5301] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.491 [WARNING][5301] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.491 [INFO][5301] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.493 [INFO][5301] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:13.498361 containerd[1467]: 2026-04-16 00:25:13.495 [INFO][5294] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.498361 containerd[1467]: time="2026-04-16T00:25:13.498295839Z" level=info msg="TearDown network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\" successfully" Apr 16 00:25:13.498361 containerd[1467]: time="2026-04-16T00:25:13.498326799Z" level=info msg="StopPodSandbox for \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\" returns successfully" Apr 16 00:25:13.501624 containerd[1467]: time="2026-04-16T00:25:13.501589812Z" level=info msg="RemovePodSandbox for \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\"" Apr 16 00:25:13.505615 containerd[1467]: time="2026-04-16T00:25:13.505551547Z" level=info msg="Forcibly stopping sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\"" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.554 [WARNING][5316] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"071c8c0b-162d-4eea-a8c8-a1554ee321bf", ResourceVersion:"1060", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"77ea66554bb66f9035033c4481e43709cb21dd07f149a536332b3a7fa63b573b", Pod:"goldmane-5b85766d88-xvg2d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie3dd520123b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.554 [INFO][5316] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.554 [INFO][5316] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" iface="eth0" netns="" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.554 [INFO][5316] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.555 [INFO][5316] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.581 [INFO][5323] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.581 [INFO][5323] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.581 [INFO][5323] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.592 [WARNING][5323] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.592 [INFO][5323] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" HandleID="k8s-pod-network.1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Workload="ci--4081--3--6--n--56c15b786d-k8s-goldmane--5b85766d88--xvg2d-eth0" Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.594 [INFO][5323] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:13.599330 containerd[1467]: 2026-04-16 00:25:13.596 [INFO][5316] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509" Apr 16 00:25:13.599330 containerd[1467]: time="2026-04-16T00:25:13.598541585Z" level=info msg="TearDown network for sandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\" successfully" Apr 16 00:25:13.603789 containerd[1467]: time="2026-04-16T00:25:13.603700045Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:13.604006 containerd[1467]: time="2026-04-16T00:25:13.603975246Z" level=info msg="RemovePodSandbox \"1b52e70af167dabe611fce9728c48b8de8808bad471abefb58c7bab7b8888509\" returns successfully" Apr 16 00:25:13.604959 containerd[1467]: time="2026-04-16T00:25:13.604915929Z" level=info msg="StopPodSandbox for \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\"" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.668 [WARNING][5337] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"801cc257-2986-4d78-aabb-a2b3b76027fd", ResourceVersion:"990", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e", Pod:"coredns-674b8bbfcf-7nrz4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califbd1c1f0916", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.669 [INFO][5337] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.669 [INFO][5337] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" iface="eth0" netns="" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.669 [INFO][5337] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.669 [INFO][5337] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.691 [INFO][5345] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.692 [INFO][5345] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.692 [INFO][5345] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.705 [WARNING][5345] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.705 [INFO][5345] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.707 [INFO][5345] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:13.711632 containerd[1467]: 2026-04-16 00:25:13.709 [INFO][5337] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.712664 containerd[1467]: time="2026-04-16T00:25:13.711657500Z" level=info msg="TearDown network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\" successfully" Apr 16 00:25:13.712664 containerd[1467]: time="2026-04-16T00:25:13.711689100Z" level=info msg="StopPodSandbox for \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\" returns successfully" Apr 16 00:25:13.712664 containerd[1467]: time="2026-04-16T00:25:13.712527703Z" level=info msg="RemovePodSandbox for \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\"" Apr 16 00:25:13.712664 containerd[1467]: time="2026-04-16T00:25:13.712567304Z" level=info msg="Forcibly stopping sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\"" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.764 [WARNING][5359] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"801cc257-2986-4d78-aabb-a2b3b76027fd", ResourceVersion:"990", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"0f6343ad40f0f1c7d199b8c097d8d43780c8192dd7d0f5e061d0dded9c91dd6e", Pod:"coredns-674b8bbfcf-7nrz4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califbd1c1f0916", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 
00:25:13.765 [INFO][5359] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.765 [INFO][5359] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" iface="eth0" netns="" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.765 [INFO][5359] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.765 [INFO][5359] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.788 [INFO][5366] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.788 [INFO][5366] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.788 [INFO][5366] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.802 [WARNING][5366] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.802 [INFO][5366] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" HandleID="k8s-pod-network.096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--7nrz4-eth0" Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.804 [INFO][5366] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:13.808566 containerd[1467]: 2026-04-16 00:25:13.806 [INFO][5359] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076" Apr 16 00:25:13.809086 containerd[1467]: time="2026-04-16T00:25:13.808616313Z" level=info msg="TearDown network for sandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\" successfully" Apr 16 00:25:13.814344 containerd[1467]: time="2026-04-16T00:25:13.814256095Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:13.814773 containerd[1467]: time="2026-04-16T00:25:13.814388015Z" level=info msg="RemovePodSandbox \"096e40c412b85626a92510093b3ab4e13ac57215254ca11dda05433c61a12076\" returns successfully" Apr 16 00:25:13.815335 containerd[1467]: time="2026-04-16T00:25:13.814971977Z" level=info msg="StopPodSandbox for \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\"" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.869 [WARNING][5380] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"4679abf7-027e-48d1-9202-b1dbdd5b8949", ResourceVersion:"1020", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42", Pod:"calico-apiserver-688cbf68ff-qht6h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calidf2d11505cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.869 [INFO][5380] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.869 [INFO][5380] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" iface="eth0" netns="" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.869 [INFO][5380] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.869 [INFO][5380] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.892 [INFO][5387] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.892 [INFO][5387] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.892 [INFO][5387] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.906 [WARNING][5387] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.906 [INFO][5387] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.908 [INFO][5387] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:13.914863 containerd[1467]: 2026-04-16 00:25:13.911 [INFO][5380] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:13.916057 containerd[1467]: time="2026-04-16T00:25:13.915483164Z" level=info msg="TearDown network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\" successfully" Apr 16 00:25:13.916057 containerd[1467]: time="2026-04-16T00:25:13.915517524Z" level=info msg="StopPodSandbox for \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\" returns successfully" Apr 16 00:25:13.916743 containerd[1467]: time="2026-04-16T00:25:13.916428248Z" level=info msg="RemovePodSandbox for \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\"" Apr 16 00:25:13.916743 containerd[1467]: time="2026-04-16T00:25:13.916461368Z" level=info msg="Forcibly stopping sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\"" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.971 [WARNING][5401] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"4679abf7-027e-48d1-9202-b1dbdd5b8949", ResourceVersion:"1020", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"4869377a3a0c52783ea845c727e9d4301c903797a5d603b8021875c0af752b42", Pod:"calico-apiserver-688cbf68ff-qht6h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calidf2d11505cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.972 [INFO][5401] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.972 [INFO][5401] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" iface="eth0" netns="" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.972 [INFO][5401] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.972 [INFO][5401] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.994 [INFO][5408] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.994 [INFO][5408] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:13.994 [INFO][5408] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:14.005 [WARNING][5408] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:14.005 [INFO][5408] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" HandleID="k8s-pod-network.2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--qht6h-eth0" Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:14.008 [INFO][5408] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.012585 containerd[1467]: 2026-04-16 00:25:14.010 [INFO][5401] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4" Apr 16 00:25:14.013120 containerd[1467]: time="2026-04-16T00:25:14.012569234Z" level=info msg="TearDown network for sandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\" successfully" Apr 16 00:25:14.019099 containerd[1467]: time="2026-04-16T00:25:14.019046668Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:14.019224 containerd[1467]: time="2026-04-16T00:25:14.019140908Z" level=info msg="RemovePodSandbox \"2d910cefa52d82977176b897ff2151288ac53de3bee816f554f88458cc7018e4\" returns successfully" Apr 16 00:25:14.019749 containerd[1467]: time="2026-04-16T00:25:14.019714951Z" level=info msg="StopPodSandbox for \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\"" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.061 [WARNING][5422] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec", ResourceVersion:"1074", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc", Pod:"csi-node-driver-ptvrb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif64780672a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.062 [INFO][5422] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.062 [INFO][5422] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" iface="eth0" netns="" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.062 [INFO][5422] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.062 [INFO][5422] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.086 [INFO][5429] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.087 [INFO][5429] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.087 [INFO][5429] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.099 [WARNING][5429] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.099 [INFO][5429] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.101 [INFO][5429] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.106931 containerd[1467]: 2026-04-16 00:25:14.104 [INFO][5422] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.108349 containerd[1467]: time="2026-04-16T00:25:14.106979051Z" level=info msg="TearDown network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\" successfully" Apr 16 00:25:14.108349 containerd[1467]: time="2026-04-16T00:25:14.107005332Z" level=info msg="StopPodSandbox for \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\" returns successfully" Apr 16 00:25:14.108349 containerd[1467]: time="2026-04-16T00:25:14.107575775Z" level=info msg="RemovePodSandbox for \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\"" Apr 16 00:25:14.108349 containerd[1467]: time="2026-04-16T00:25:14.107606815Z" level=info msg="Forcibly stopping sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\"" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.154 [WARNING][5444] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a1e3207-cc48-4c2e-99f2-9f1e71ba31ec", ResourceVersion:"1074", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"51ab9d0840f35f6b488c0de3715ed97354b12d88c25a36c42bb7f04d13c1cacc", Pod:"csi-node-driver-ptvrb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif64780672a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.155 [INFO][5444] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.155 [INFO][5444] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" iface="eth0" netns="" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.155 [INFO][5444] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.155 [INFO][5444] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.179 [INFO][5451] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.179 [INFO][5451] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.179 [INFO][5451] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.192 [WARNING][5451] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.192 [INFO][5451] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" HandleID="k8s-pod-network.04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Workload="ci--4081--3--6--n--56c15b786d-k8s-csi--node--driver--ptvrb-eth0" Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.195 [INFO][5451] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.200678 containerd[1467]: 2026-04-16 00:25:14.197 [INFO][5444] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce" Apr 16 00:25:14.200678 containerd[1467]: time="2026-04-16T00:25:14.200429104Z" level=info msg="TearDown network for sandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\" successfully" Apr 16 00:25:14.208499 containerd[1467]: time="2026-04-16T00:25:14.207926264Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:14.208499 containerd[1467]: time="2026-04-16T00:25:14.208036864Z" level=info msg="RemovePodSandbox \"04a6121cdd5c4d6e56c20263e8997364c7d03f943137d4e59166a7f34429a2ce\" returns successfully" Apr 16 00:25:14.208853 containerd[1467]: time="2026-04-16T00:25:14.208694988Z" level=info msg="StopPodSandbox for \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\"" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.256 [WARNING][5465] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.256 [INFO][5465] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.257 [INFO][5465] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" iface="eth0" netns="" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.257 [INFO][5465] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.257 [INFO][5465] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.277 [INFO][5472] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.277 [INFO][5472] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.278 [INFO][5472] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.288 [WARNING][5472] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.289 [INFO][5472] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.291 [INFO][5472] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.295054 containerd[1467]: 2026-04-16 00:25:14.293 [INFO][5465] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.295899 containerd[1467]: time="2026-04-16T00:25:14.295372085Z" level=info msg="TearDown network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\" successfully" Apr 16 00:25:14.295899 containerd[1467]: time="2026-04-16T00:25:14.295403485Z" level=info msg="StopPodSandbox for \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\" returns successfully" Apr 16 00:25:14.296843 containerd[1467]: time="2026-04-16T00:25:14.296495851Z" level=info msg="RemovePodSandbox for \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\"" Apr 16 00:25:14.296843 containerd[1467]: time="2026-04-16T00:25:14.296533731Z" level=info msg="Forcibly stopping sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\"" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.337 [WARNING][5486] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" WorkloadEndpoint="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.337 [INFO][5486] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.337 [INFO][5486] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" iface="eth0" netns="" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.337 [INFO][5486] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.337 [INFO][5486] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.360 [INFO][5493] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.360 [INFO][5493] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.360 [INFO][5493] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.371 [WARNING][5493] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.371 [INFO][5493] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" HandleID="k8s-pod-network.89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Workload="ci--4081--3--6--n--56c15b786d-k8s-whisker--5fc887d9f7--rvjtn-eth0" Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.373 [INFO][5493] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.377873 containerd[1467]: 2026-04-16 00:25:14.375 [INFO][5486] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44" Apr 16 00:25:14.379125 containerd[1467]: time="2026-04-16T00:25:14.378386403Z" level=info msg="TearDown network for sandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\" successfully" Apr 16 00:25:14.382750 containerd[1467]: time="2026-04-16T00:25:14.382572305Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:14.382750 containerd[1467]: time="2026-04-16T00:25:14.382653025Z" level=info msg="RemovePodSandbox \"89567a3ee1a37512be4ac9f38647664f2a8268faa47aec0ba6f3de333f4f2b44\" returns successfully" Apr 16 00:25:14.383608 containerd[1467]: time="2026-04-16T00:25:14.383305549Z" level=info msg="StopPodSandbox for \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\"" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.437 [WARNING][5507] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"eb8562da-f5ee-41ba-a284-e8ce170cf70d", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2", Pod:"calico-apiserver-688cbf68ff-cmgfm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali32e8cb1a426", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.444 [INFO][5507] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.444 [INFO][5507] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" iface="eth0" netns="" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.444 [INFO][5507] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.444 [INFO][5507] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.467 [INFO][5514] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.467 [INFO][5514] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.467 [INFO][5514] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.478 [WARNING][5514] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.479 [INFO][5514] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.480 [INFO][5514] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.484926 containerd[1467]: 2026-04-16 00:25:14.482 [INFO][5507] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.485495 containerd[1467]: time="2026-04-16T00:25:14.484937205Z" level=info msg="TearDown network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\" successfully" Apr 16 00:25:14.485495 containerd[1467]: time="2026-04-16T00:25:14.484964525Z" level=info msg="StopPodSandbox for \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\" returns successfully" Apr 16 00:25:14.486014 containerd[1467]: time="2026-04-16T00:25:14.485698409Z" level=info msg="RemovePodSandbox for \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\"" Apr 16 00:25:14.486014 containerd[1467]: time="2026-04-16T00:25:14.485730209Z" level=info msg="Forcibly stopping sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\"" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.528 [WARNING][5528] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0", GenerateName:"calico-apiserver-688cbf68ff-", Namespace:"calico-system", SelfLink:"", UID:"eb8562da-f5ee-41ba-a284-e8ce170cf70d", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"688cbf68ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"8a307e4b5a4fc8bb2f1d704d33e7af20f547ee0f2594a84051fc7f3b3da4c8a2", Pod:"calico-apiserver-688cbf68ff-cmgfm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali32e8cb1a426", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.528 [INFO][5528] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.528 [INFO][5528] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" iface="eth0" netns="" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.528 [INFO][5528] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.528 [INFO][5528] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.550 [INFO][5535] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.550 [INFO][5535] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.550 [INFO][5535] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.561 [WARNING][5535] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.561 [INFO][5535] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" HandleID="k8s-pod-network.9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--apiserver--688cbf68ff--cmgfm-eth0" Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.563 [INFO][5535] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.567306 containerd[1467]: 2026-04-16 00:25:14.565 [INFO][5528] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f" Apr 16 00:25:14.567306 containerd[1467]: time="2026-04-16T00:25:14.567171598Z" level=info msg="TearDown network for sandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\" successfully" Apr 16 00:25:14.572628 containerd[1467]: time="2026-04-16T00:25:14.572564027Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:14.572931 containerd[1467]: time="2026-04-16T00:25:14.572818988Z" level=info msg="RemovePodSandbox \"9c40facdd29877a47ecf1103d4a5e10eb0151a6dd5a2e9a4ed6fb433f285052f\" returns successfully" Apr 16 00:25:14.573662 containerd[1467]: time="2026-04-16T00:25:14.573622672Z" level=info msg="StopPodSandbox for \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\"" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.629 [WARNING][5549] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5ade9d9c-9070-44d0-8989-12094cc3969e", ResourceVersion:"982", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961", Pod:"coredns-674b8bbfcf-pjng2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali206a03830d6", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.629 [INFO][5549] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.629 [INFO][5549] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" iface="eth0" netns="" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.629 [INFO][5549] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.629 [INFO][5549] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.652 [INFO][5556] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.652 [INFO][5556] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.652 [INFO][5556] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.662 [WARNING][5556] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.662 [INFO][5556] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.665 [INFO][5556] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.668909 containerd[1467]: 2026-04-16 00:25:14.667 [INFO][5549] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.669842 containerd[1467]: time="2026-04-16T00:25:14.669555858Z" level=info msg="TearDown network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\" successfully" Apr 16 00:25:14.669842 containerd[1467]: time="2026-04-16T00:25:14.669606578Z" level=info msg="StopPodSandbox for \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\" returns successfully" Apr 16 00:25:14.670166 containerd[1467]: time="2026-04-16T00:25:14.670129301Z" level=info msg="RemovePodSandbox for \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\"" Apr 16 00:25:14.670234 containerd[1467]: time="2026-04-16T00:25:14.670168941Z" level=info msg="Forcibly stopping sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\"" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.714 [WARNING][5570] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5ade9d9c-9070-44d0-8989-12094cc3969e", ResourceVersion:"982", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"3b233599f74090ec9728e7279c894b303992f9008ebb478a6040de64688d1961", Pod:"coredns-674b8bbfcf-pjng2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali206a03830d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 
00:25:14.714 [INFO][5570] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.714 [INFO][5570] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" iface="eth0" netns="" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.714 [INFO][5570] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.714 [INFO][5570] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.735 [INFO][5577] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.735 [INFO][5577] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.735 [INFO][5577] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.747 [WARNING][5577] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.747 [INFO][5577] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" HandleID="k8s-pod-network.5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Workload="ci--4081--3--6--n--56c15b786d-k8s-coredns--674b8bbfcf--pjng2-eth0" Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.750 [INFO][5577] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.755724 containerd[1467]: 2026-04-16 00:25:14.752 [INFO][5570] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914" Apr 16 00:25:14.757289 containerd[1467]: time="2026-04-16T00:25:14.756686678Z" level=info msg="TearDown network for sandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\" successfully" Apr 16 00:25:14.761575 containerd[1467]: time="2026-04-16T00:25:14.761533583Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:14.761726 containerd[1467]: time="2026-04-16T00:25:14.761610824Z" level=info msg="RemovePodSandbox \"5d4e55315280393af40f5943ec831baf086c8a6c095426d2f4b0f717d5b35914\" returns successfully" Apr 16 00:25:14.762054 containerd[1467]: time="2026-04-16T00:25:14.762025866Z" level=info msg="StopPodSandbox for \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\"" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.805 [WARNING][5591] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0", GenerateName:"calico-kube-controllers-5bcb8465df-", Namespace:"calico-system", SelfLink:"", UID:"ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84", ResourceVersion:"1006", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bcb8465df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44", Pod:"calico-kube-controllers-5bcb8465df-zbc8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidb50fab48aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.805 [INFO][5591] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.805 [INFO][5591] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" iface="eth0" netns="" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.805 [INFO][5591] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.806 [INFO][5591] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.833 [INFO][5599] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.833 [INFO][5599] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.833 [INFO][5599] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.847 [WARNING][5599] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.847 [INFO][5599] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.850 [INFO][5599] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.856358 containerd[1467]: 2026-04-16 00:25:14.854 [INFO][5591] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.857131 containerd[1467]: time="2026-04-16T00:25:14.856394283Z" level=info msg="TearDown network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\" successfully" Apr 16 00:25:14.857131 containerd[1467]: time="2026-04-16T00:25:14.856420044Z" level=info msg="StopPodSandbox for \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\" returns successfully" Apr 16 00:25:14.857131 containerd[1467]: time="2026-04-16T00:25:14.857057727Z" level=info msg="RemovePodSandbox for \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\"" Apr 16 00:25:14.857131 containerd[1467]: time="2026-04-16T00:25:14.857087367Z" level=info msg="Forcibly stopping sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\"" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.904 [WARNING][5613] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0", GenerateName:"calico-kube-controllers-5bcb8465df-", Namespace:"calico-system", SelfLink:"", UID:"ee00d6a3-52e5-4c2d-97b5-3b5102dc1e84", ResourceVersion:"1006", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 24, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bcb8465df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-56c15b786d", ContainerID:"830882920c0d95c41ccca0f90c44eac5adc4ecec2dec16c8af18d138b4909a44", Pod:"calico-kube-controllers-5bcb8465df-zbc8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidb50fab48aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.905 [INFO][5613] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.905 [INFO][5613] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" iface="eth0" netns="" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.905 [INFO][5613] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.905 [INFO][5613] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.934 [INFO][5620] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.934 [INFO][5620] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.934 [INFO][5620] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.948 [WARNING][5620] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.948 [INFO][5620] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" HandleID="k8s-pod-network.b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Workload="ci--4081--3--6--n--56c15b786d-k8s-calico--kube--controllers--5bcb8465df--zbc8b-eth0" Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.951 [INFO][5620] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:25:14.958152 containerd[1467]: 2026-04-16 00:25:14.956 [INFO][5613] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d" Apr 16 00:25:14.958737 containerd[1467]: time="2026-04-16T00:25:14.958216500Z" level=info msg="TearDown network for sandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\" successfully" Apr 16 00:25:14.962010 containerd[1467]: time="2026-04-16T00:25:14.961963080Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:25:14.962090 containerd[1467]: time="2026-04-16T00:25:14.962041401Z" level=info msg="RemovePodSandbox \"b342f93fbb8d8518c9a7963c6418614c23470c9f767f89030313093d25718b9d\" returns successfully" Apr 16 00:25:26.854923 kubelet[2577]: I0416 00:25:26.854800 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:26:03.300001 systemd[1]: run-containerd-runc-k8s.io-7aadc8b71b40cd61af65c3ccec17997df765d949cf0f504f818bdc1ad30479b0-runc.Tws2As.mount: Deactivated successfully. Apr 16 00:26:24.767098 systemd[1]: run-containerd-runc-k8s.io-e6206b62e51fdfddbeadb99e69613b68c8edefc09c695b70d1442a48dbf66e37-runc.OFh0c5.mount: Deactivated successfully. Apr 16 00:26:38.138612 systemd[1]: Started sshd@7-46.224.6.157:22-4.175.71.9:45590.service - OpenSSH per-connection server daemon (4.175.71.9:45590). Apr 16 00:26:38.281853 sshd[5985]: Accepted publickey for core from 4.175.71.9 port 45590 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:26:38.285357 sshd[5985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:26:38.292924 systemd-logind[1451]: New session 8 of user core. Apr 16 00:26:38.297518 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 16 00:26:38.500891 sshd[5985]: pam_unix(sshd:session): session closed for user core Apr 16 00:26:38.506895 systemd-logind[1451]: Session 8 logged out. Waiting for processes to exit. Apr 16 00:26:38.507237 systemd[1]: sshd@7-46.224.6.157:22-4.175.71.9:45590.service: Deactivated successfully. Apr 16 00:26:38.510214 systemd[1]: session-8.scope: Deactivated successfully. Apr 16 00:26:38.515228 systemd-logind[1451]: Removed session 8. Apr 16 00:26:43.540562 systemd[1]: Started sshd@8-46.224.6.157:22-4.175.71.9:45596.service - OpenSSH per-connection server daemon (4.175.71.9:45596). 
Apr 16 00:26:43.659324 sshd[5999]: Accepted publickey for core from 4.175.71.9 port 45596 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:26:43.662326 sshd[5999]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:26:43.668759 systemd-logind[1451]: New session 9 of user core. Apr 16 00:26:43.678608 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 16 00:26:43.873179 sshd[5999]: pam_unix(sshd:session): session closed for user core Apr 16 00:26:43.878968 systemd[1]: sshd@8-46.224.6.157:22-4.175.71.9:45596.service: Deactivated successfully. Apr 16 00:26:43.879120 systemd-logind[1451]: Session 9 logged out. Waiting for processes to exit. Apr 16 00:26:43.882310 systemd[1]: session-9.scope: Deactivated successfully. Apr 16 00:26:43.883356 systemd-logind[1451]: Removed session 9. Apr 16 00:26:48.920603 systemd[1]: Started sshd@9-46.224.6.157:22-4.175.71.9:57310.service - OpenSSH per-connection server daemon (4.175.71.9:57310). Apr 16 00:26:49.045843 sshd[6013]: Accepted publickey for core from 4.175.71.9 port 57310 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:26:49.048399 sshd[6013]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:26:49.054233 systemd-logind[1451]: New session 10 of user core. Apr 16 00:26:49.059539 systemd[1]: Started session-10.scope - Session 10 of User core. Apr 16 00:26:49.251425 sshd[6013]: pam_unix(sshd:session): session closed for user core Apr 16 00:26:49.256861 systemd[1]: sshd@9-46.224.6.157:22-4.175.71.9:57310.service: Deactivated successfully. Apr 16 00:26:49.260624 systemd[1]: session-10.scope: Deactivated successfully. Apr 16 00:26:49.263565 systemd-logind[1451]: Session 10 logged out. Waiting for processes to exit. Apr 16 00:26:49.264664 systemd-logind[1451]: Removed session 10. 
Apr 16 00:26:54.279586 systemd[1]: Started sshd@10-46.224.6.157:22-4.175.71.9:57326.service - OpenSSH per-connection server daemon (4.175.71.9:57326). Apr 16 00:26:54.410341 sshd[6046]: Accepted publickey for core from 4.175.71.9 port 57326 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:26:54.412321 sshd[6046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:26:54.419321 systemd-logind[1451]: New session 11 of user core. Apr 16 00:26:54.427681 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 16 00:26:54.613201 sshd[6046]: pam_unix(sshd:session): session closed for user core Apr 16 00:26:54.619597 systemd[1]: sshd@10-46.224.6.157:22-4.175.71.9:57326.service: Deactivated successfully. Apr 16 00:26:54.621773 systemd[1]: session-11.scope: Deactivated successfully. Apr 16 00:26:54.624466 systemd-logind[1451]: Session 11 logged out. Waiting for processes to exit. Apr 16 00:26:54.637875 systemd-logind[1451]: Removed session 11. Apr 16 00:26:54.642741 systemd[1]: Started sshd@11-46.224.6.157:22-4.175.71.9:57328.service - OpenSSH per-connection server daemon (4.175.71.9:57328). Apr 16 00:26:54.770361 sshd[6059]: Accepted publickey for core from 4.175.71.9 port 57328 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:26:54.774041 sshd[6059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:26:54.782711 systemd-logind[1451]: New session 12 of user core. Apr 16 00:26:54.789662 systemd[1]: Started session-12.scope - Session 12 of User core. Apr 16 00:26:55.037104 sshd[6059]: pam_unix(sshd:session): session closed for user core Apr 16 00:26:55.044689 systemd[1]: sshd@11-46.224.6.157:22-4.175.71.9:57328.service: Deactivated successfully. Apr 16 00:26:55.048177 systemd[1]: session-12.scope: Deactivated successfully. Apr 16 00:26:55.049201 systemd-logind[1451]: Session 12 logged out. Waiting for processes to exit. 
Apr 16 00:26:55.069012 systemd[1]: Started sshd@12-46.224.6.157:22-4.175.71.9:57340.service - OpenSSH per-connection server daemon (4.175.71.9:57340).
Apr 16 00:26:55.071079 systemd-logind[1451]: Removed session 12.
Apr 16 00:26:55.182199 sshd[6091]: Accepted publickey for core from 4.175.71.9 port 57340 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:26:55.186050 sshd[6091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:26:55.194085 systemd-logind[1451]: New session 13 of user core.
Apr 16 00:26:55.197546 systemd[1]: Started session-13.scope - Session 13 of User core.
Apr 16 00:26:55.390658 sshd[6091]: pam_unix(sshd:session): session closed for user core
Apr 16 00:26:55.399098 systemd[1]: sshd@12-46.224.6.157:22-4.175.71.9:57340.service: Deactivated successfully.
Apr 16 00:26:55.405898 systemd[1]: session-13.scope: Deactivated successfully.
Apr 16 00:26:55.408040 systemd-logind[1451]: Session 13 logged out. Waiting for processes to exit.
Apr 16 00:26:55.410460 systemd-logind[1451]: Removed session 13.
Apr 16 00:27:00.426618 systemd[1]: Started sshd@13-46.224.6.157:22-4.175.71.9:60808.service - OpenSSH per-connection server daemon (4.175.71.9:60808).
Apr 16 00:27:00.548775 sshd[6104]: Accepted publickey for core from 4.175.71.9 port 60808 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:00.551561 sshd[6104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:00.559898 systemd-logind[1451]: New session 14 of user core.
Apr 16 00:27:00.564730 systemd[1]: Started session-14.scope - Session 14 of User core.
Apr 16 00:27:00.776243 sshd[6104]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:00.782226 systemd[1]: sshd@13-46.224.6.157:22-4.175.71.9:60808.service: Deactivated successfully.
Apr 16 00:27:00.787155 systemd[1]: session-14.scope: Deactivated successfully.
Apr 16 00:27:00.789139 systemd-logind[1451]: Session 14 logged out. Waiting for processes to exit.
Apr 16 00:27:00.806311 systemd-logind[1451]: Removed session 14.
Apr 16 00:27:00.815958 systemd[1]: Started sshd@14-46.224.6.157:22-4.175.71.9:60820.service - OpenSSH per-connection server daemon (4.175.71.9:60820).
Apr 16 00:27:00.941567 sshd[6117]: Accepted publickey for core from 4.175.71.9 port 60820 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:00.944075 sshd[6117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:00.953131 systemd-logind[1451]: New session 15 of user core.
Apr 16 00:27:00.961718 systemd[1]: Started session-15.scope - Session 15 of User core.
Apr 16 00:27:01.356033 sshd[6117]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:01.364145 systemd[1]: sshd@14-46.224.6.157:22-4.175.71.9:60820.service: Deactivated successfully.
Apr 16 00:27:01.370210 systemd[1]: session-15.scope: Deactivated successfully.
Apr 16 00:27:01.382622 systemd-logind[1451]: Session 15 logged out. Waiting for processes to exit.
Apr 16 00:27:01.387702 systemd[1]: Started sshd@15-46.224.6.157:22-4.175.71.9:60824.service - OpenSSH per-connection server daemon (4.175.71.9:60824).
Apr 16 00:27:01.390260 systemd-logind[1451]: Removed session 15.
Apr 16 00:27:01.529137 sshd[6127]: Accepted publickey for core from 4.175.71.9 port 60824 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:01.531822 sshd[6127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:01.539399 systemd-logind[1451]: New session 16 of user core.
Apr 16 00:27:01.544545 systemd[1]: Started session-16.scope - Session 16 of User core.
Apr 16 00:27:02.502220 sshd[6127]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:02.508740 systemd[1]: sshd@15-46.224.6.157:22-4.175.71.9:60824.service: Deactivated successfully.
Apr 16 00:27:02.514540 systemd[1]: session-16.scope: Deactivated successfully.
Apr 16 00:27:02.515890 systemd-logind[1451]: Session 16 logged out. Waiting for processes to exit.
Apr 16 00:27:02.546883 systemd[1]: Started sshd@16-46.224.6.157:22-4.175.71.9:60836.service - OpenSSH per-connection server daemon (4.175.71.9:60836).
Apr 16 00:27:02.547833 systemd-logind[1451]: Removed session 16.
Apr 16 00:27:02.678376 sshd[6150]: Accepted publickey for core from 4.175.71.9 port 60836 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:02.680075 sshd[6150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:02.690109 systemd-logind[1451]: New session 17 of user core.
Apr 16 00:27:02.696826 systemd[1]: Started session-17.scope - Session 17 of User core.
Apr 16 00:27:03.025621 sshd[6150]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:03.032479 systemd[1]: sshd@16-46.224.6.157:22-4.175.71.9:60836.service: Deactivated successfully.
Apr 16 00:27:03.037040 systemd[1]: session-17.scope: Deactivated successfully.
Apr 16 00:27:03.040535 systemd-logind[1451]: Session 17 logged out. Waiting for processes to exit.
Apr 16 00:27:03.066151 systemd[1]: Started sshd@17-46.224.6.157:22-4.175.71.9:60840.service - OpenSSH per-connection server daemon (4.175.71.9:60840).
Apr 16 00:27:03.068326 systemd-logind[1451]: Removed session 17.
Apr 16 00:27:03.189918 sshd[6163]: Accepted publickey for core from 4.175.71.9 port 60840 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:03.192093 sshd[6163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:03.199521 systemd-logind[1451]: New session 18 of user core.
Apr 16 00:27:03.205534 systemd[1]: Started session-18.scope - Session 18 of User core.
Apr 16 00:27:03.420741 sshd[6163]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:03.427878 systemd[1]: sshd@17-46.224.6.157:22-4.175.71.9:60840.service: Deactivated successfully.
Apr 16 00:27:03.433498 systemd[1]: session-18.scope: Deactivated successfully.
Apr 16 00:27:03.435606 systemd-logind[1451]: Session 18 logged out. Waiting for processes to exit.
Apr 16 00:27:03.436848 systemd-logind[1451]: Removed session 18.
Apr 16 00:27:07.934854 systemd[1]: Started sshd@18-46.224.6.157:22-119.196.155.38:58344.service - OpenSSH per-connection server daemon (119.196.155.38:58344).
Apr 16 00:27:07.958675 sshd[6218]: Connection closed by 119.196.155.38 port 58344
Apr 16 00:27:07.961207 systemd[1]: sshd@18-46.224.6.157:22-119.196.155.38:58344.service: Deactivated successfully.
Apr 16 00:27:08.461295 systemd[1]: Started sshd@19-46.224.6.157:22-4.175.71.9:50196.service - OpenSSH per-connection server daemon (4.175.71.9:50196).
Apr 16 00:27:08.576449 sshd[6222]: Accepted publickey for core from 4.175.71.9 port 50196 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:08.578913 sshd[6222]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:08.587337 systemd-logind[1451]: New session 19 of user core.
Apr 16 00:27:08.592027 systemd[1]: Started session-19.scope - Session 19 of User core.
Apr 16 00:27:08.803635 sshd[6222]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:08.810167 systemd[1]: sshd@19-46.224.6.157:22-4.175.71.9:50196.service: Deactivated successfully.
Apr 16 00:27:08.814685 systemd[1]: session-19.scope: Deactivated successfully.
Apr 16 00:27:08.816573 systemd-logind[1451]: Session 19 logged out. Waiting for processes to exit.
Apr 16 00:27:08.818104 systemd-logind[1451]: Removed session 19.
Apr 16 00:27:13.839736 systemd[1]: Started sshd@20-46.224.6.157:22-4.175.71.9:50208.service - OpenSSH per-connection server daemon (4.175.71.9:50208).
Apr 16 00:27:13.976334 sshd[6236]: Accepted publickey for core from 4.175.71.9 port 50208 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:27:13.978562 sshd[6236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:27:13.985414 systemd-logind[1451]: New session 20 of user core.
Apr 16 00:27:13.992623 systemd[1]: Started session-20.scope - Session 20 of User core.
Apr 16 00:27:14.162465 sshd[6236]: pam_unix(sshd:session): session closed for user core
Apr 16 00:27:14.167726 systemd[1]: sshd@20-46.224.6.157:22-4.175.71.9:50208.service: Deactivated successfully.
Apr 16 00:27:14.171937 systemd[1]: session-20.scope: Deactivated successfully.
Apr 16 00:27:14.174053 systemd-logind[1451]: Session 20 logged out. Waiting for processes to exit.
Apr 16 00:27:14.175016 systemd-logind[1451]: Removed session 20.
Apr 16 00:27:34.819818 systemd[1]: run-containerd-runc-k8s.io-7c4076c6cc5976804a71894cb0b8c4bc2e26e1a9008027ab1f0465c5c1699e56-runc.HSkiDm.mount: Deactivated successfully.
Apr 16 00:27:44.431071 systemd[1]: cri-containerd-70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943.scope: Deactivated successfully.
Apr 16 00:27:44.431477 systemd[1]: cri-containerd-70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943.scope: Consumed 3.939s CPU time, 16.4M memory peak, 0B memory swap peak.
Apr 16 00:27:44.464216 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943-rootfs.mount: Deactivated successfully.
Apr 16 00:27:44.473209 containerd[1467]: time="2026-04-16T00:27:44.473018350Z" level=info msg="shim disconnected" id=70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943 namespace=k8s.io
Apr 16 00:27:44.473209 containerd[1467]: time="2026-04-16T00:27:44.473077670Z" level=warning msg="cleaning up after shim disconnected" id=70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943 namespace=k8s.io
Apr 16 00:27:44.473209 containerd[1467]: time="2026-04-16T00:27:44.473086310Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:27:44.872062 kubelet[2577]: E0416 00:27:44.871992 2577 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:59306->10.0.0.2:2379: read: connection timed out"
Apr 16 00:27:45.322565 kubelet[2577]: I0416 00:27:45.321688 2577 scope.go:117] "RemoveContainer" containerID="70de6f25d15b02ac9de4bf83fddcdec7b96c6072270f6a2c07832e5bcab9f943"
Apr 16 00:27:45.325344 containerd[1467]: time="2026-04-16T00:27:45.325255856Z" level=info msg="CreateContainer within sandbox \"7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Apr 16 00:27:45.346486 containerd[1467]: time="2026-04-16T00:27:45.346154878Z" level=info msg="CreateContainer within sandbox \"7b6208f400f870fab0b5309573b55f9a5c827170711c9c71322555316cfc48c2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9bf23fe03c836524c3c1b4d2eaca8e16ee381cfa45d651f3bb4922b92ffd0e4d\""
Apr 16 00:27:45.347218 containerd[1467]: time="2026-04-16T00:27:45.346715637Z" level=info msg="StartContainer for \"9bf23fe03c836524c3c1b4d2eaca8e16ee381cfa45d651f3bb4922b92ffd0e4d\""
Apr 16 00:27:45.388646 systemd[1]: Started cri-containerd-9bf23fe03c836524c3c1b4d2eaca8e16ee381cfa45d651f3bb4922b92ffd0e4d.scope - libcontainer container 9bf23fe03c836524c3c1b4d2eaca8e16ee381cfa45d651f3bb4922b92ffd0e4d.
Apr 16 00:27:45.429255 containerd[1467]: time="2026-04-16T00:27:45.429210890Z" level=info msg="StartContainer for \"9bf23fe03c836524c3c1b4d2eaca8e16ee381cfa45d651f3bb4922b92ffd0e4d\" returns successfully"
Apr 16 00:27:45.514554 systemd[1]: cri-containerd-01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c.scope: Deactivated successfully.
Apr 16 00:27:45.515380 systemd[1]: cri-containerd-01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c.scope: Consumed 15.042s CPU time.
Apr 16 00:27:45.544799 containerd[1467]: time="2026-04-16T00:27:45.544720054Z" level=info msg="shim disconnected" id=01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c namespace=k8s.io
Apr 16 00:27:45.544799 containerd[1467]: time="2026-04-16T00:27:45.544791533Z" level=warning msg="cleaning up after shim disconnected" id=01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c namespace=k8s.io
Apr 16 00:27:45.544799 containerd[1467]: time="2026-04-16T00:27:45.544803333Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:27:45.553732 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c-rootfs.mount: Deactivated successfully.
Apr 16 00:27:46.325137 kubelet[2577]: I0416 00:27:46.325105 2577 scope.go:117] "RemoveContainer" containerID="01d8a9e20a53bc26f87e1390bc72db945727853abe31dc3a729d1f2b3403033c"
Apr 16 00:27:46.328298 containerd[1467]: time="2026-04-16T00:27:46.328220841Z" level=info msg="CreateContainer within sandbox \"4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Apr 16 00:27:46.346294 containerd[1467]: time="2026-04-16T00:27:46.344454687Z" level=info msg="CreateContainer within sandbox \"4caa3e2c59fdba7976294613ea5c0b07a32653d91bd6b7e2a40203aa062474ab\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"0ec14bc53110d0c919ce11e3c74cffaa5ed741f43a2dc4084ade0a005c14ad51\""
Apr 16 00:27:46.346294 containerd[1467]: time="2026-04-16T00:27:46.345001726Z" level=info msg="StartContainer for \"0ec14bc53110d0c919ce11e3c74cffaa5ed741f43a2dc4084ade0a005c14ad51\""
Apr 16 00:27:46.385668 systemd[1]: Started cri-containerd-0ec14bc53110d0c919ce11e3c74cffaa5ed741f43a2dc4084ade0a005c14ad51.scope - libcontainer container 0ec14bc53110d0c919ce11e3c74cffaa5ed741f43a2dc4084ade0a005c14ad51.
Apr 16 00:27:46.423656 containerd[1467]: time="2026-04-16T00:27:46.423613082Z" level=info msg="StartContainer for \"0ec14bc53110d0c919ce11e3c74cffaa5ed741f43a2dc4084ade0a005c14ad51\" returns successfully"