Apr 16 00:18:18.906501 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Apr 16 00:18:18.906525 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Wed Apr 15 22:32:48 -00 2026
Apr 16 00:18:18.906535 kernel: KASLR enabled
Apr 16 00:18:18.906541 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Apr 16 00:18:18.906547 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390c1018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Apr 16 00:18:18.906553 kernel: random: crng init done
Apr 16 00:18:18.906560 kernel: ACPI: Early table checksum verification disabled
Apr 16 00:18:18.906566 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Apr 16 00:18:18.906572 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Apr 16 00:18:18.906579 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.906586 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.906592 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.906597 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.906997 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.907006 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.907017 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.907024 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.907030 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 00:18:18.907036 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Apr 16 00:18:18.907043 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Apr 16 00:18:18.907049 kernel: NUMA: Failed to initialise from firmware
Apr 16 00:18:18.907056 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Apr 16 00:18:18.907062 kernel: NUMA: NODE_DATA [mem 0x13966e800-0x139673fff]
Apr 16 00:18:18.907068 kernel: Zone ranges:
Apr 16 00:18:18.907075 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Apr 16 00:18:18.907083 kernel: DMA32 empty
Apr 16 00:18:18.907090 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Apr 16 00:18:18.907096 kernel: Movable zone start for each node
Apr 16 00:18:18.907102 kernel: Early memory node ranges
Apr 16 00:18:18.907109 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff]
Apr 16 00:18:18.907115 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Apr 16 00:18:18.907122 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Apr 16 00:18:18.907128 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Apr 16 00:18:18.907134 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Apr 16 00:18:18.907141 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Apr 16 00:18:18.907148 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Apr 16 00:18:18.907154 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Apr 16 00:18:18.907162 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Apr 16 00:18:18.907169 kernel: psci: probing for conduit method from ACPI.
Apr 16 00:18:18.907175 kernel: psci: PSCIv1.1 detected in firmware.
Apr 16 00:18:18.907185 kernel: psci: Using standard PSCI v0.2 function IDs
Apr 16 00:18:18.907191 kernel: psci: Trusted OS migration not required
Apr 16 00:18:18.907198 kernel: psci: SMC Calling Convention v1.1
Apr 16 00:18:18.907207 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Apr 16 00:18:18.907214 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Apr 16 00:18:18.907220 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Apr 16 00:18:18.907227 kernel: pcpu-alloc: [0] 0 [0] 1
Apr 16 00:18:18.907234 kernel: Detected PIPT I-cache on CPU0
Apr 16 00:18:18.907241 kernel: CPU features: detected: GIC system register CPU interface
Apr 16 00:18:18.907248 kernel: CPU features: detected: Hardware dirty bit management
Apr 16 00:18:18.907255 kernel: CPU features: detected: Spectre-v4
Apr 16 00:18:18.907262 kernel: CPU features: detected: Spectre-BHB
Apr 16 00:18:18.907269 kernel: CPU features: kernel page table isolation forced ON by KASLR
Apr 16 00:18:18.907277 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Apr 16 00:18:18.907358 kernel: CPU features: detected: ARM erratum 1418040
Apr 16 00:18:18.907367 kernel: CPU features: detected: SSBS not fully self-synchronizing
Apr 16 00:18:18.907374 kernel: alternatives: applying boot alternatives
Apr 16 00:18:18.907382 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=0adf63447ce845e6a0056fdc0e76e619192ad10bb115f878c5a0d78c1b8c220d
Apr 16 00:18:18.907389 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Apr 16 00:18:18.907396 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Apr 16 00:18:18.907403 kernel: Fallback order for Node 0: 0
Apr 16 00:18:18.907410 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Apr 16 00:18:18.907417 kernel: Policy zone: Normal
Apr 16 00:18:18.907423 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 16 00:18:18.907432 kernel: software IO TLB: area num 2.
Apr 16 00:18:18.907439 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Apr 16 00:18:18.907447 kernel: Memory: 3882812K/4096000K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 213188K reserved, 0K cma-reserved)
Apr 16 00:18:18.907453 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Apr 16 00:18:18.907460 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 16 00:18:18.907482 kernel: rcu: RCU event tracing is enabled.
Apr 16 00:18:18.907489 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Apr 16 00:18:18.907496 kernel: Trampoline variant of Tasks RCU enabled.
Apr 16 00:18:18.907503 kernel: Tracing variant of Tasks RCU enabled.
Apr 16 00:18:18.907510 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 16 00:18:18.907517 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Apr 16 00:18:18.907524 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Apr 16 00:18:18.907533 kernel: GICv3: 256 SPIs implemented
Apr 16 00:18:18.907540 kernel: GICv3: 0 Extended SPIs implemented
Apr 16 00:18:18.907546 kernel: Root IRQ handler: gic_handle_irq
Apr 16 00:18:18.907553 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Apr 16 00:18:18.907560 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Apr 16 00:18:18.907567 kernel: ITS [mem 0x08080000-0x0809ffff]
Apr 16 00:18:18.907574 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Apr 16 00:18:18.907581 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Apr 16 00:18:18.907587 kernel: GICv3: using LPI property table @0x00000001000e0000
Apr 16 00:18:18.907594 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Apr 16 00:18:18.907626 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 16 00:18:18.907646 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 00:18:18.907653 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Apr 16 00:18:18.907660 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Apr 16 00:18:18.907667 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Apr 16 00:18:18.907674 kernel: Console: colour dummy device 80x25
Apr 16 00:18:18.907681 kernel: ACPI: Core revision 20230628
Apr 16 00:18:18.907688 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Apr 16 00:18:18.907696 kernel: pid_max: default: 32768 minimum: 301
Apr 16 00:18:18.907703 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Apr 16 00:18:18.907710 kernel: landlock: Up and running.
Apr 16 00:18:18.907719 kernel: SELinux: Initializing.
Apr 16 00:18:18.907726 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 16 00:18:18.907733 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 16 00:18:18.907741 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 16 00:18:18.907784 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 16 00:18:18.907793 kernel: rcu: Hierarchical SRCU implementation.
Apr 16 00:18:18.907801 kernel: rcu: Max phase no-delay instances is 400.
Apr 16 00:18:18.907808 kernel: Platform MSI: ITS@0x8080000 domain created
Apr 16 00:18:18.907815 kernel: PCI/MSI: ITS@0x8080000 domain created
Apr 16 00:18:18.907825 kernel: Remapping and enabling EFI services.
Apr 16 00:18:18.907832 kernel: smp: Bringing up secondary CPUs ...
Apr 16 00:18:18.907839 kernel: Detected PIPT I-cache on CPU1
Apr 16 00:18:18.907846 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Apr 16 00:18:18.907854 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Apr 16 00:18:18.907861 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 00:18:18.907868 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Apr 16 00:18:18.907875 kernel: smp: Brought up 1 node, 2 CPUs
Apr 16 00:18:18.907882 kernel: SMP: Total of 2 processors activated.
Apr 16 00:18:18.907889 kernel: CPU features: detected: 32-bit EL0 Support
Apr 16 00:18:18.907898 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Apr 16 00:18:18.907906 kernel: CPU features: detected: Common not Private translations
Apr 16 00:18:18.907918 kernel: CPU features: detected: CRC32 instructions
Apr 16 00:18:18.907928 kernel: CPU features: detected: Enhanced Virtualization Traps
Apr 16 00:18:18.907935 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Apr 16 00:18:18.907946 kernel: CPU features: detected: LSE atomic instructions
Apr 16 00:18:18.907954 kernel: CPU features: detected: Privileged Access Never
Apr 16 00:18:18.907962 kernel: CPU features: detected: RAS Extension Support
Apr 16 00:18:18.907973 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Apr 16 00:18:18.907981 kernel: CPU: All CPU(s) started at EL1
Apr 16 00:18:18.907989 kernel: alternatives: applying system-wide alternatives
Apr 16 00:18:18.908000 kernel: devtmpfs: initialized
Apr 16 00:18:18.908010 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 16 00:18:18.908018 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Apr 16 00:18:18.908026 kernel: pinctrl core: initialized pinctrl subsystem
Apr 16 00:18:18.908034 kernel: SMBIOS 3.0.0 present.
Apr 16 00:18:18.908045 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Apr 16 00:18:18.908053 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 16 00:18:18.908061 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Apr 16 00:18:18.908069 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Apr 16 00:18:18.908080 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Apr 16 00:18:18.908088 kernel: audit: initializing netlink subsys (disabled)
Apr 16 00:18:18.908098 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Apr 16 00:18:18.908105 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 16 00:18:18.908114 kernel: cpuidle: using governor menu
Apr 16 00:18:18.908124 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Apr 16 00:18:18.908132 kernel: ASID allocator initialised with 32768 entries
Apr 16 00:18:18.908139 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 16 00:18:18.908147 kernel: Serial: AMBA PL011 UART driver
Apr 16 00:18:18.908154 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Apr 16 00:18:18.908162 kernel: Modules: 0 pages in range for non-PLT usage
Apr 16 00:18:18.908169 kernel: Modules: 509008 pages in range for PLT usage
Apr 16 00:18:18.908177 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 16 00:18:18.908184 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Apr 16 00:18:18.908194 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Apr 16 00:18:18.908201 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Apr 16 00:18:18.908208 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 16 00:18:18.908216 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Apr 16 00:18:18.908223 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Apr 16 00:18:18.908231 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Apr 16 00:18:18.908238 kernel: ACPI: Added _OSI(Module Device)
Apr 16 00:18:18.908246 kernel: ACPI: Added _OSI(Processor Device)
Apr 16 00:18:18.908253 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 16 00:18:18.908260 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 16 00:18:18.908269 kernel: ACPI: Interpreter enabled
Apr 16 00:18:18.908276 kernel: ACPI: Using GIC for interrupt routing
Apr 16 00:18:18.908284 kernel: ACPI: MCFG table detected, 1 entries
Apr 16 00:18:18.908291 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Apr 16 00:18:18.908299 kernel: printk: console [ttyAMA0] enabled
Apr 16 00:18:18.908307 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 16 00:18:18.908492 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Apr 16 00:18:18.908571 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Apr 16 00:18:18.908719 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Apr 16 00:18:18.908806 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Apr 16 00:18:18.908873 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Apr 16 00:18:18.908883 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Apr 16 00:18:18.908895 kernel: PCI host bridge to bus 0000:00
Apr 16 00:18:18.908978 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Apr 16 00:18:18.909040 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Apr 16 00:18:18.909105 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Apr 16 00:18:18.909165 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 16 00:18:18.909248 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Apr 16 00:18:18.909334 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Apr 16 00:18:18.909412 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Apr 16 00:18:18.909479 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Apr 16 00:18:18.909558 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.909680 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Apr 16 00:18:18.909823 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.909900 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Apr 16 00:18:18.909974 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.910042 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Apr 16 00:18:18.910121 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.910195 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Apr 16 00:18:18.910277 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.910355 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Apr 16 00:18:18.910441 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.910516 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Apr 16 00:18:18.911150 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.911273 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Apr 16 00:18:18.911352 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.911420 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Apr 16 00:18:18.911495 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Apr 16 00:18:18.911562 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Apr 16 00:18:18.912306 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Apr 16 00:18:18.912401 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Apr 16 00:18:18.912482 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Apr 16 00:18:18.912557 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Apr 16 00:18:18.913962 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 16 00:18:18.914062 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 16 00:18:18.914150 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Apr 16 00:18:18.914230 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Apr 16 00:18:18.914310 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Apr 16 00:18:18.914379 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Apr 16 00:18:18.914448 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Apr 16 00:18:18.914527 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Apr 16 00:18:18.914597 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Apr 16 00:18:18.914721 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Apr 16 00:18:18.914819 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
Apr 16 00:18:18.914890 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Apr 16 00:18:18.914967 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Apr 16 00:18:18.915036 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Apr 16 00:18:18.915105 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 16 00:18:18.915188 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Apr 16 00:18:18.915259 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Apr 16 00:18:18.915327 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Apr 16 00:18:18.915395 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 16 00:18:18.915467 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Apr 16 00:18:18.915534 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Apr 16 00:18:18.917716 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Apr 16 00:18:18.917884 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Apr 16 00:18:18.917966 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Apr 16 00:18:18.918035 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Apr 16 00:18:18.918108 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Apr 16 00:18:18.918177 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Apr 16 00:18:18.918243 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Apr 16 00:18:18.918314 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Apr 16 00:18:18.918383 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Apr 16 00:18:18.918455 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Apr 16 00:18:18.918527 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Apr 16 00:18:18.918595 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Apr 16 00:18:18.919787 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Apr 16 00:18:18.919877 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Apr 16 00:18:18.919947 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Apr 16 00:18:18.920014 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Apr 16 00:18:18.920095 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Apr 16 00:18:18.920164 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Apr 16 00:18:18.920232 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Apr 16 00:18:18.920304 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Apr 16 00:18:18.920372 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Apr 16 00:18:18.920441 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Apr 16 00:18:18.920517 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Apr 16 00:18:18.920585 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Apr 16 00:18:18.922804 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Apr 16 00:18:18.922920 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Apr 16 00:18:18.923013 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 16 00:18:18.923101 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Apr 16 00:18:18.923181 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 16 00:18:18.923254 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Apr 16 00:18:18.923326 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 16 00:18:18.923410 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Apr 16 00:18:18.923479 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 16 00:18:18.923551 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Apr 16 00:18:18.923636 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 16 00:18:18.923712 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Apr 16 00:18:18.923798 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 16 00:18:18.923872 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Apr 16 00:18:18.923945 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 16 00:18:18.924016 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Apr 16 00:18:18.924083 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 16 00:18:18.924152 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Apr 16 00:18:18.924218 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 16 00:18:18.924291 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Apr 16 00:18:18.925691 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Apr 16 00:18:18.925839 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Apr 16 00:18:18.925915 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Apr 16 00:18:18.925983 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Apr 16 00:18:18.926050 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Apr 16 00:18:18.926120 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Apr 16 00:18:18.926186 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Apr 16 00:18:18.926260 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Apr 16 00:18:18.926350 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Apr 16 00:18:18.926429 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Apr 16 00:18:18.926497 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Apr 16 00:18:18.926567 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Apr 16 00:18:18.927544 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Apr 16 00:18:18.927709 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Apr 16 00:18:18.927828 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Apr 16 00:18:18.927903 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Apr 16 00:18:18.927971 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Apr 16 00:18:18.928048 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Apr 16 00:18:18.928113 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Apr 16 00:18:18.928187 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Apr 16 00:18:18.928264 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Apr 16 00:18:18.928333 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 16 00:18:18.928400 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Apr 16 00:18:18.928477 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Apr 16 00:18:18.928562 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Apr 16 00:18:18.928893 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Apr 16 00:18:18.928970 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 16 00:18:18.929048 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Apr 16 00:18:18.929127 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Apr 16 00:18:18.929198 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Apr 16 00:18:18.929279 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Apr 16 00:18:18.929355 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 16 00:18:18.929433 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 16 00:18:18.929502 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Apr 16 00:18:18.930228 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Apr 16 00:18:18.930334 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Apr 16 00:18:18.930402 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Apr 16 00:18:18.930475 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 16 00:18:18.930556 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Apr 16 00:18:18.930670 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Apr 16 00:18:18.930742 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Apr 16 00:18:18.930830 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Apr 16 00:18:18.930898 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 16 00:18:18.930976 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Apr 16 00:18:18.931053 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
Apr 16 00:18:18.931123 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Apr 16 00:18:18.931190 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Apr 16 00:18:18.931257 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Apr 16 00:18:18.931323 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 16 00:18:18.931401 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Apr 16 00:18:18.931473 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Apr 16 00:18:18.931590 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Apr 16 00:18:18.931680 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Apr 16 00:18:18.931767 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Apr 16 00:18:18.931838 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 16 00:18:18.931917 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Apr 16 00:18:18.931987 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Apr 16 00:18:18.932056 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Apr 16 00:18:18.932125 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Apr 16 00:18:18.932229 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Apr 16 00:18:18.932306 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Apr 16 00:18:18.932374 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 16 00:18:18.932443 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Apr 16 00:18:18.932509 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Apr 16 00:18:18.932575 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Apr 16 00:18:18.932743 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 16 00:18:18.932877 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Apr 16 00:18:18.932944 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Apr 16 00:18:18.933016 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Apr 16 00:18:18.933081 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 16 00:18:18.933148 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Apr 16 00:18:18.933208 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Apr 16 00:18:18.933265 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Apr 16 00:18:18.933338 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Apr 16 00:18:18.933399 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Apr 16 00:18:18.933463 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 16 00:18:18.933531 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Apr 16 00:18:18.933592 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Apr 16 00:18:18.933673 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 16 00:18:18.933745 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Apr 16 00:18:18.933828 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Apr 16 00:18:18.933889 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 16 00:18:18.933974 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Apr 16 00:18:18.934035 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Apr 16 00:18:18.934109 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 16 00:18:18.934178 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Apr 16 00:18:18.934240 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Apr 16 00:18:18.934300 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 16 00:18:18.934372 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Apr 16 00:18:18.934434 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Apr 16 00:18:18.934499 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 16 00:18:18.934568 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Apr 16 00:18:18.934684 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Apr 16 00:18:18.934774 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 16 00:18:18.934849 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Apr 16 00:18:18.934911 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Apr 16 00:18:18.934972 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 16 00:18:18.935043 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Apr 16 00:18:18.935105 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Apr 16 00:18:18.935171 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 16 00:18:18.935181 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Apr 16 00:18:18.935189 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Apr 16 00:18:18.935197 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Apr 16 00:18:18.935205 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Apr 16 00:18:18.935213 kernel: iommu: Default domain type: Translated
Apr 16 00:18:18.935221 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Apr 16 00:18:18.935229 kernel: efivars: Registered efivars operations
Apr 16 00:18:18.935237 kernel: vgaarb: loaded
Apr 16 00:18:18.935246 kernel: clocksource: Switched to clocksource arch_sys_counter
Apr 16 00:18:18.935254 kernel: VFS: Disk quotas dquot_6.6.0
Apr 16 00:18:18.935262 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 16 00:18:18.935270 kernel: pnp: PnP ACPI init
Apr 16 00:18:18.935345 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Apr 16 00:18:18.935357 kernel: pnp: PnP ACPI: found 1 devices
Apr 16 00:18:18.935365 kernel: NET: Registered PF_INET protocol family
Apr 16 00:18:18.935373 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Apr 16 00:18:18.935381 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Apr 16 00:18:18.935391 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 16 00:18:18.935399 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 16 00:18:18.935408
kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Apr 16 00:18:18.935416 kernel: TCP: Hash tables configured (established 32768 bind 32768) Apr 16 00:18:18.935424 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Apr 16 00:18:18.935432 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Apr 16 00:18:18.935440 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Apr 16 00:18:18.935517 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Apr 16 00:18:18.935530 kernel: PCI: CLS 0 bytes, default 64 Apr 16 00:18:18.935538 kernel: kvm [1]: HYP mode not available Apr 16 00:18:18.935546 kernel: Initialise system trusted keyrings Apr 16 00:18:18.935556 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Apr 16 00:18:18.935565 kernel: Key type asymmetric registered Apr 16 00:18:18.935572 kernel: Asymmetric key parser 'x509' registered Apr 16 00:18:18.935580 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Apr 16 00:18:18.935588 kernel: io scheduler mq-deadline registered Apr 16 00:18:18.935596 kernel: io scheduler kyber registered Apr 16 00:18:18.935700 kernel: io scheduler bfq registered Apr 16 00:18:18.935710 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Apr 16 00:18:18.935847 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Apr 16 00:18:18.935925 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Apr 16 00:18:18.935993 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.936062 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Apr 16 00:18:18.936129 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Apr 16 00:18:18.936202 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.936274 kernel: pcieport 0000:00:02.2: 
PME: Signaling with IRQ 52 Apr 16 00:18:18.936341 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Apr 16 00:18:18.936409 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.936480 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Apr 16 00:18:18.936548 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Apr 16 00:18:18.936722 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.936825 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Apr 16 00:18:18.936894 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Apr 16 00:18:18.936960 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.937031 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Apr 16 00:18:18.937098 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Apr 16 00:18:18.937169 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.937239 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Apr 16 00:18:18.937305 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Apr 16 00:18:18.937370 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.937439 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Apr 16 00:18:18.937522 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Apr 16 00:18:18.937594 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.937616 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 
Apr 16 00:18:18.937705 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Apr 16 00:18:18.937816 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Apr 16 00:18:18.937886 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 00:18:18.937897 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Apr 16 00:18:18.937905 kernel: ACPI: button: Power Button [PWRB] Apr 16 00:18:18.937917 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Apr 16 00:18:18.937990 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Apr 16 00:18:18.938065 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Apr 16 00:18:18.938077 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Apr 16 00:18:18.938085 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Apr 16 00:18:18.938153 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Apr 16 00:18:18.938164 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Apr 16 00:18:18.938172 kernel: thunder_xcv, ver 1.0 Apr 16 00:18:18.938182 kernel: thunder_bgx, ver 1.0 Apr 16 00:18:18.938190 kernel: nicpf, ver 1.0 Apr 16 00:18:18.938198 kernel: nicvf, ver 1.0 Apr 16 00:18:18.938286 kernel: rtc-efi rtc-efi.0: registered as rtc0 Apr 16 00:18:18.938351 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-04-16T00:18:18 UTC (1776298698) Apr 16 00:18:18.938361 kernel: hid: raw HID events driver (C) Jiri Kosina Apr 16 00:18:18.938369 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Apr 16 00:18:18.938377 kernel: watchdog: Delayed init of the lockup detector failed: -19 Apr 16 00:18:18.938388 kernel: watchdog: Hard watchdog permanently disabled Apr 16 00:18:18.938396 kernel: NET: Registered PF_INET6 protocol family Apr 16 00:18:18.938403 kernel: Segment Routing with IPv6 Apr 16 00:18:18.938411 kernel: In-situ OAM 
(IOAM) with IPv6 Apr 16 00:18:18.938419 kernel: NET: Registered PF_PACKET protocol family Apr 16 00:18:18.938428 kernel: Key type dns_resolver registered Apr 16 00:18:18.938436 kernel: registered taskstats version 1 Apr 16 00:18:18.938444 kernel: Loading compiled-in X.509 certificates Apr 16 00:18:18.938452 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 42c6438655eac241afd498b973a7e22ad5b14a7d' Apr 16 00:18:18.938460 kernel: Key type .fscrypt registered Apr 16 00:18:18.938469 kernel: Key type fscrypt-provisioning registered Apr 16 00:18:18.938477 kernel: ima: No TPM chip found, activating TPM-bypass! Apr 16 00:18:18.938485 kernel: ima: Allocated hash algorithm: sha1 Apr 16 00:18:18.938495 kernel: ima: No architecture policies found Apr 16 00:18:18.938503 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Apr 16 00:18:18.938511 kernel: clk: Disabling unused clocks Apr 16 00:18:18.938519 kernel: Freeing unused kernel memory: 39424K Apr 16 00:18:18.938526 kernel: Run /init as init process Apr 16 00:18:18.938534 kernel: with arguments: Apr 16 00:18:18.938544 kernel: /init Apr 16 00:18:18.938551 kernel: with environment: Apr 16 00:18:18.938559 kernel: HOME=/ Apr 16 00:18:18.938566 kernel: TERM=linux Apr 16 00:18:18.938577 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Apr 16 00:18:18.938587 systemd[1]: Detected virtualization kvm. Apr 16 00:18:18.938595 systemd[1]: Detected architecture arm64. Apr 16 00:18:18.938655 systemd[1]: Running in initrd. Apr 16 00:18:18.938664 systemd[1]: No hostname configured, using default hostname. Apr 16 00:18:18.938672 systemd[1]: Hostname set to . 
Apr 16 00:18:18.938681 systemd[1]: Initializing machine ID from VM UUID. Apr 16 00:18:18.938690 systemd[1]: Queued start job for default target initrd.target. Apr 16 00:18:18.938698 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 16 00:18:18.938706 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 16 00:18:18.938715 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Apr 16 00:18:18.938724 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 16 00:18:18.938734 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Apr 16 00:18:18.938743 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Apr 16 00:18:18.938765 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Apr 16 00:18:18.938774 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Apr 16 00:18:18.938783 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 16 00:18:18.938791 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 16 00:18:18.938803 systemd[1]: Reached target paths.target - Path Units. Apr 16 00:18:18.938811 systemd[1]: Reached target slices.target - Slice Units. Apr 16 00:18:18.938820 systemd[1]: Reached target swap.target - Swaps. Apr 16 00:18:18.938828 systemd[1]: Reached target timers.target - Timer Units. Apr 16 00:18:18.938837 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Apr 16 00:18:18.938846 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 16 00:18:18.938854 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). 
Apr 16 00:18:18.938862 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Apr 16 00:18:18.938871 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 16 00:18:18.938881 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 16 00:18:18.938889 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Apr 16 00:18:18.938898 systemd[1]: Reached target sockets.target - Socket Units. Apr 16 00:18:18.938906 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Apr 16 00:18:18.938914 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 16 00:18:18.938923 systemd[1]: Finished network-cleanup.service - Network Cleanup. Apr 16 00:18:18.938931 systemd[1]: Starting systemd-fsck-usr.service... Apr 16 00:18:18.938939 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 16 00:18:18.938948 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 16 00:18:18.938958 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 16 00:18:18.938967 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Apr 16 00:18:18.939006 systemd-journald[237]: Collecting audit messages is disabled. Apr 16 00:18:18.939030 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Apr 16 00:18:18.939039 systemd[1]: Finished systemd-fsck-usr.service. Apr 16 00:18:18.939048 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Apr 16 00:18:18.939056 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 16 00:18:18.939065 kernel: Bridge firewalling registered Apr 16 00:18:18.939075 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Apr 16 00:18:18.939085 systemd-journald[237]: Journal started Apr 16 00:18:18.939105 systemd-journald[237]: Runtime Journal (/run/log/journal/278f4662e4754920b99574c6ec17f1d5) is 8.0M, max 76.6M, 68.6M free. Apr 16 00:18:18.908383 systemd-modules-load[238]: Inserted module 'overlay' Apr 16 00:18:18.930563 systemd-modules-load[238]: Inserted module 'br_netfilter' Apr 16 00:18:18.946763 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 16 00:18:18.946814 systemd[1]: Started systemd-journald.service - Journal Service. Apr 16 00:18:18.954930 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 16 00:18:18.957349 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Apr 16 00:18:18.960861 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Apr 16 00:18:18.962814 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 16 00:18:18.970525 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 16 00:18:18.988221 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 16 00:18:18.991905 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Apr 16 00:18:19.000911 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Apr 16 00:18:19.001732 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 16 00:18:19.005426 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 16 00:18:19.007776 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Apr 16 00:18:19.031227 systemd-resolved[272]: Positive Trust Anchors: Apr 16 00:18:19.031248 systemd-resolved[272]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Apr 16 00:18:19.031284 systemd-resolved[272]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Apr 16 00:18:19.036365 systemd-resolved[272]: Defaulting to hostname 'linux'. Apr 16 00:18:19.040073 dracut-cmdline[276]: dracut-dracut-053 Apr 16 00:18:19.037528 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Apr 16 00:18:19.039677 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Apr 16 00:18:19.043908 dracut-cmdline[276]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=0adf63447ce845e6a0056fdc0e76e619192ad10bb115f878c5a0d78c1b8c220d Apr 16 00:18:19.134656 kernel: SCSI subsystem initialized Apr 16 00:18:19.139643 kernel: Loading iSCSI transport class v2.0-870. Apr 16 00:18:19.147714 kernel: iscsi: registered transport (tcp) Apr 16 00:18:19.162681 kernel: iscsi: registered transport (qla4xxx) Apr 16 00:18:19.162783 kernel: QLogic iSCSI HBA Driver Apr 16 00:18:19.210766 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Apr 16 00:18:19.216894 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Apr 16 00:18:19.242003 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Apr 16 00:18:19.242092 kernel: device-mapper: uevent: version 1.0.3 Apr 16 00:18:19.242118 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Apr 16 00:18:19.293676 kernel: raid6: neonx8 gen() 15657 MB/s Apr 16 00:18:19.310691 kernel: raid6: neonx4 gen() 15562 MB/s Apr 16 00:18:19.327668 kernel: raid6: neonx2 gen() 13139 MB/s Apr 16 00:18:19.344681 kernel: raid6: neonx1 gen() 10394 MB/s Apr 16 00:18:19.361678 kernel: raid6: int64x8 gen() 6927 MB/s Apr 16 00:18:19.378678 kernel: raid6: int64x4 gen() 7309 MB/s Apr 16 00:18:19.395676 kernel: raid6: int64x2 gen() 6101 MB/s Apr 16 00:18:19.412688 kernel: raid6: int64x1 gen() 5031 MB/s Apr 16 00:18:19.412799 kernel: raid6: using algorithm neonx8 gen() 15657 MB/s Apr 16 00:18:19.429697 kernel: raid6: .... xor() 11916 MB/s, rmw enabled Apr 16 00:18:19.429808 kernel: raid6: using neon recovery algorithm Apr 16 00:18:19.434684 kernel: xor: measuring software checksum speed Apr 16 00:18:19.434826 kernel: 8regs : 19702 MB/sec Apr 16 00:18:19.434863 kernel: 32regs : 16209 MB/sec Apr 16 00:18:19.435992 kernel: arm64_neon : 26919 MB/sec Apr 16 00:18:19.436032 kernel: xor: using function: arm64_neon (26919 MB/sec) Apr 16 00:18:19.486668 kernel: Btrfs loaded, zoned=no, fsverity=no Apr 16 00:18:19.504362 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Apr 16 00:18:19.511901 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 16 00:18:19.527728 systemd-udevd[457]: Using default interface naming scheme 'v255'. Apr 16 00:18:19.531589 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 16 00:18:19.542901 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Apr 16 00:18:19.559043 dracut-pre-trigger[465]: rd.md=0: removing MD RAID activation Apr 16 00:18:19.599622 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 16 00:18:19.606819 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 16 00:18:19.666640 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 16 00:18:19.675947 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Apr 16 00:18:19.710392 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 16 00:18:19.712440 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 16 00:18:19.713954 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 16 00:18:19.715638 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 16 00:18:19.725374 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 16 00:18:19.746096 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Apr 16 00:18:19.784340 kernel: scsi host0: Virtio SCSI HBA Apr 16 00:18:19.791668 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Apr 16 00:18:19.791793 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Apr 16 00:18:19.808139 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 16 00:18:19.808892 kernel: ACPI: bus type USB registered Apr 16 00:18:19.808294 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 16 00:18:19.810397 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 16 00:18:19.813507 kernel: usbcore: registered new interface driver usbfs Apr 16 00:18:19.811420 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Apr 16 00:18:19.816505 kernel: usbcore: registered new interface driver hub Apr 16 00:18:19.816539 kernel: usbcore: registered new device driver usb Apr 16 00:18:19.811585 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 16 00:18:19.812704 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 16 00:18:19.823077 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 16 00:18:19.843657 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 16 00:18:19.852978 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 16 00:18:19.860452 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Apr 16 00:18:19.860640 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Apr 16 00:18:19.860737 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Apr 16 00:18:19.860855 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Apr 16 00:18:19.860958 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Apr 16 00:18:19.861044 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Apr 16 00:18:19.861196 kernel: hub 1-0:1.0: USB hub found Apr 16 00:18:19.863645 kernel: hub 1-0:1.0: 4 ports detected Apr 16 00:18:19.864009 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Apr 16 00:18:19.864140 kernel: hub 2-0:1.0: USB hub found Apr 16 00:18:19.864276 kernel: hub 2-0:1.0: 4 ports detected Apr 16 00:18:19.877855 kernel: sr 0:0:0:0: Power-on or device reset occurred Apr 16 00:18:19.878080 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Apr 16 00:18:19.879081 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Apr 16 00:18:19.883648 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Apr 16 00:18:19.891118 kernel: sd 0:0:0:1: Power-on or device reset occurred Apr 16 00:18:19.890815 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 16 00:18:19.893627 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Apr 16 00:18:19.893823 kernel: sd 0:0:0:1: [sda] Write Protect is off Apr 16 00:18:19.893942 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Apr 16 00:18:19.894029 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Apr 16 00:18:19.898911 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Apr 16 00:18:19.898999 kernel: GPT:17805311 != 80003071 Apr 16 00:18:19.899013 kernel: GPT:Alternate GPT header not at the end of the disk. Apr 16 00:18:19.899032 kernel: GPT:17805311 != 80003071 Apr 16 00:18:19.899041 kernel: GPT: Use GNU Parted to correct GPT errors. Apr 16 00:18:19.899674 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 16 00:18:19.900677 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Apr 16 00:18:19.952454 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (510) Apr 16 00:18:19.954632 kernel: BTRFS: device fsid a6240e59-bdb5-4432-bae9-6f06a7303c55 devid 1 transid 37 /dev/sda3 scanned by (udev-worker) (517) Apr 16 00:18:19.959145 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Apr 16 00:18:19.968392 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. 
Apr 16 00:18:19.976235 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Apr 16 00:18:19.981159 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Apr 16 00:18:19.981941 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Apr 16 00:18:19.994926 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 16 00:18:20.007686 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 16 00:18:20.007786 disk-uuid[575]: Primary Header is updated. Apr 16 00:18:20.007786 disk-uuid[575]: Secondary Entries is updated. Apr 16 00:18:20.007786 disk-uuid[575]: Secondary Header is updated. Apr 16 00:18:20.105752 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Apr 16 00:18:20.240595 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Apr 16 00:18:20.240686 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Apr 16 00:18:20.241757 kernel: usbcore: registered new interface driver usbhid Apr 16 00:18:20.241801 kernel: usbhid: USB HID core driver Apr 16 00:18:20.347635 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Apr 16 00:18:20.477685 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Apr 16 00:18:20.531653 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Apr 16 00:18:21.027202 disk-uuid[576]: The operation has completed successfully. Apr 16 00:18:21.028265 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 16 00:18:21.089898 systemd[1]: disk-uuid.service: Deactivated successfully. 
Apr 16 00:18:21.090028 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 16 00:18:21.100893 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Apr 16 00:18:21.117921 sh[593]: Success Apr 16 00:18:21.132747 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Apr 16 00:18:21.198139 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 16 00:18:21.199771 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 16 00:18:21.206788 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 16 00:18:21.222062 kernel: BTRFS info (device dm-0): first mount of filesystem a6240e59-bdb5-4432-bae9-6f06a7303c55 Apr 16 00:18:21.222142 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Apr 16 00:18:21.222175 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Apr 16 00:18:21.223628 kernel: BTRFS info (device dm-0): disabling log replay at mount time Apr 16 00:18:21.223663 kernel: BTRFS info (device dm-0): using free space tree Apr 16 00:18:21.230649 kernel: BTRFS info (device dm-0): enabling ssd optimizations Apr 16 00:18:21.233236 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 16 00:18:21.235987 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Apr 16 00:18:21.243827 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Apr 16 00:18:21.246846 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Apr 16 00:18:21.264464 kernel: BTRFS info (device sda6): first mount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab Apr 16 00:18:21.264531 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Apr 16 00:18:21.264557 kernel: BTRFS info (device sda6): using free space tree Apr 16 00:18:21.272629 kernel: BTRFS info (device sda6): enabling ssd optimizations Apr 16 00:18:21.272700 kernel: BTRFS info (device sda6): auto enabling async discard Apr 16 00:18:21.286114 systemd[1]: mnt-oem.mount: Deactivated successfully. Apr 16 00:18:21.287011 kernel: BTRFS info (device sda6): last unmount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab Apr 16 00:18:21.294105 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 16 00:18:21.302848 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 16 00:18:21.414476 ignition[681]: Ignition 2.19.0 Apr 16 00:18:21.415524 ignition[681]: Stage: fetch-offline Apr 16 00:18:21.415588 ignition[681]: no configs at "/usr/lib/ignition/base.d" Apr 16 00:18:21.415599 ignition[681]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Apr 16 00:18:21.416222 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 16 00:18:21.415825 ignition[681]: parsed url from cmdline: "" Apr 16 00:18:21.415829 ignition[681]: no config URL provided Apr 16 00:18:21.415835 ignition[681]: reading system config file "/usr/lib/ignition/user.ign" Apr 16 00:18:21.415843 ignition[681]: no config at "/usr/lib/ignition/user.ign" Apr 16 00:18:21.415849 ignition[681]: failed to fetch config: resource requires networking Apr 16 00:18:21.419918 ignition[681]: Ignition finished successfully Apr 16 00:18:21.426205 systemd[1]: Starting systemd-networkd.service - Network Configuration... Apr 16 00:18:21.428622 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Apr 16 00:18:21.448245 systemd-networkd[781]: lo: Link UP Apr 16 00:18:21.448258 systemd-networkd[781]: lo: Gained carrier Apr 16 00:18:21.449934 systemd-networkd[781]: Enumeration completed Apr 16 00:18:21.450528 systemd-networkd[781]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 16 00:18:21.450531 systemd-networkd[781]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Apr 16 00:18:21.451456 systemd-networkd[781]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 16 00:18:21.451460 systemd-networkd[781]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Apr 16 00:18:21.452293 systemd-networkd[781]: eth0: Link UP Apr 16 00:18:21.452296 systemd-networkd[781]: eth0: Gained carrier Apr 16 00:18:21.452304 systemd-networkd[781]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 16 00:18:21.453069 systemd[1]: Started systemd-networkd.service - Network Configuration. Apr 16 00:18:21.454515 systemd[1]: Reached target network.target - Network. Apr 16 00:18:21.459210 systemd-networkd[781]: eth1: Link UP Apr 16 00:18:21.459215 systemd-networkd[781]: eth1: Gained carrier Apr 16 00:18:21.459225 systemd-networkd[781]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 16 00:18:21.463941 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Apr 16 00:18:21.478550 ignition[784]: Ignition 2.19.0
Apr 16 00:18:21.478561 ignition[784]: Stage: fetch
Apr 16 00:18:21.478807 ignition[784]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:18:21.478819 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:18:21.478926 ignition[784]: parsed url from cmdline: ""
Apr 16 00:18:21.478929 ignition[784]: no config URL provided
Apr 16 00:18:21.478934 ignition[784]: reading system config file "/usr/lib/ignition/user.ign"
Apr 16 00:18:21.478943 ignition[784]: no config at "/usr/lib/ignition/user.ign"
Apr 16 00:18:21.478965 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Apr 16 00:18:21.479557 ignition[784]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Apr 16 00:18:21.499792 systemd-networkd[781]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 16 00:18:21.507686 systemd-networkd[781]: eth0: DHCPv4 address 78.46.194.74/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 16 00:18:21.680379 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Apr 16 00:18:21.688004 ignition[784]: GET result: OK
Apr 16 00:18:21.688161 ignition[784]: parsing config with SHA512: 93e03e01324bb4b463e51569413b904da5d99725755573324d8eb72d0d9f0c7d9b1675aac17b97fd2df6ef562d491454c8cbc2d7b9b1bfb1caa287a36a75a714
Apr 16 00:18:21.696281 unknown[784]: fetched base config from "system"
Apr 16 00:18:21.696299 unknown[784]: fetched base config from "system"
Apr 16 00:18:21.696778 ignition[784]: fetch: fetch complete
Apr 16 00:18:21.696305 unknown[784]: fetched user config from "hetzner"
Apr 16 00:18:21.696784 ignition[784]: fetch: fetch passed
Apr 16 00:18:21.696838 ignition[784]: Ignition finished successfully
Apr 16 00:18:21.700758 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Apr 16 00:18:21.707943 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Apr 16 00:18:21.723275 ignition[791]: Ignition 2.19.0
Apr 16 00:18:21.723287 ignition[791]: Stage: kargs
Apr 16 00:18:21.723485 ignition[791]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:18:21.723495 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:18:21.724477 ignition[791]: kargs: kargs passed
Apr 16 00:18:21.728268 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Apr 16 00:18:21.724532 ignition[791]: Ignition finished successfully
Apr 16 00:18:21.735945 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Apr 16 00:18:21.751006 ignition[797]: Ignition 2.19.0
Apr 16 00:18:21.751018 ignition[797]: Stage: disks
Apr 16 00:18:21.751211 ignition[797]: no configs at "/usr/lib/ignition/base.d"
Apr 16 00:18:21.751222 ignition[797]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:18:21.752350 ignition[797]: disks: disks passed
Apr 16 00:18:21.752410 ignition[797]: Ignition finished successfully
Apr 16 00:18:21.754635 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Apr 16 00:18:21.755933 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Apr 16 00:18:21.758703 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Apr 16 00:18:21.760108 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 16 00:18:21.761630 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 16 00:18:21.763321 systemd[1]: Reached target basic.target - Basic System.
Apr 16 00:18:21.770890 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Apr 16 00:18:21.786860 systemd-fsck[806]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Apr 16 00:18:21.793713 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Apr 16 00:18:21.800834 systemd[1]: Mounting sysroot.mount - /sysroot...
Apr 16 00:18:21.853643 kernel: EXT4-fs (sda9): mounted filesystem a7d1b52a-2d60-4e63-87fc-077f5b665cf4 r/w with ordered data mode. Quota mode: none.
Apr 16 00:18:21.854792 systemd[1]: Mounted sysroot.mount - /sysroot.
Apr 16 00:18:21.856578 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Apr 16 00:18:21.864833 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 16 00:18:21.869575 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Apr 16 00:18:21.871829 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Apr 16 00:18:21.872656 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Apr 16 00:18:21.872689 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 16 00:18:21.883847 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (814)
Apr 16 00:18:21.885794 kernel: BTRFS info (device sda6): first mount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:18:21.885831 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 16 00:18:21.885843 kernel: BTRFS info (device sda6): using free space tree
Apr 16 00:18:21.891174 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Apr 16 00:18:21.896787 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Apr 16 00:18:21.900379 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 16 00:18:21.900404 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 16 00:18:21.910139 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 16 00:18:21.949542 initrd-setup-root[842]: cut: /sysroot/etc/passwd: No such file or directory
Apr 16 00:18:21.952594 coreos-metadata[816]: Apr 16 00:18:21.952 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Apr 16 00:18:21.954808 coreos-metadata[816]: Apr 16 00:18:21.954 INFO Fetch successful
Apr 16 00:18:21.956115 initrd-setup-root[849]: cut: /sysroot/etc/group: No such file or directory
Apr 16 00:18:21.957172 coreos-metadata[816]: Apr 16 00:18:21.957 INFO wrote hostname ci-4081-3-6-n-42941c021f to /sysroot/etc/hostname
Apr 16 00:18:21.958416 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 16 00:18:21.964064 initrd-setup-root[857]: cut: /sysroot/etc/shadow: No such file or directory
Apr 16 00:18:21.968680 initrd-setup-root[864]: cut: /sysroot/etc/gshadow: No such file or directory
Apr 16 00:18:22.076669 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Apr 16 00:18:22.083755 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Apr 16 00:18:22.085808 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Apr 16 00:18:22.099645 kernel: BTRFS info (device sda6): last unmount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:18:22.120953 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Apr 16 00:18:22.124331 ignition[932]: INFO : Ignition 2.19.0
Apr 16 00:18:22.124331 ignition[932]: INFO : Stage: mount
Apr 16 00:18:22.125675 ignition[932]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 00:18:22.125675 ignition[932]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:18:22.125675 ignition[932]: INFO : mount: mount passed
Apr 16 00:18:22.128619 ignition[932]: INFO : Ignition finished successfully
Apr 16 00:18:22.128911 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Apr 16 00:18:22.134787 systemd[1]: Starting ignition-files.service - Ignition (files)...
Apr 16 00:18:22.222750 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Apr 16 00:18:22.228937 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 16 00:18:22.244070 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (943)
Apr 16 00:18:22.244146 kernel: BTRFS info (device sda6): first mount of filesystem d00c5e58-4065-42ad-81de-759701ad0aab
Apr 16 00:18:22.245191 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 16 00:18:22.245242 kernel: BTRFS info (device sda6): using free space tree
Apr 16 00:18:22.249637 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 16 00:18:22.249710 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 16 00:18:22.252110 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 16 00:18:22.278098 ignition[961]: INFO : Ignition 2.19.0
Apr 16 00:18:22.278098 ignition[961]: INFO : Stage: files
Apr 16 00:18:22.279399 ignition[961]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 00:18:22.279399 ignition[961]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:18:22.279399 ignition[961]: DEBUG : files: compiled without relabeling support, skipping
Apr 16 00:18:22.282872 ignition[961]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Apr 16 00:18:22.282872 ignition[961]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Apr 16 00:18:22.285713 ignition[961]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Apr 16 00:18:22.285713 ignition[961]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Apr 16 00:18:22.285713 ignition[961]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Apr 16 00:18:22.284574 unknown[961]: wrote ssh authorized keys file for user: core
Apr 16 00:18:22.290217 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 16 00:18:22.290217 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Apr 16 00:18:22.373580 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Apr 16 00:18:22.518416 systemd-networkd[781]: eth0: Gained IPv6LL
Apr 16 00:18:22.533683 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 16 00:18:22.533683 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 16 00:18:22.538523 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1
Apr 16 00:18:22.643877 systemd-networkd[781]: eth1: Gained IPv6LL
Apr 16 00:18:22.850294 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Apr 16 00:18:23.473089 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 16 00:18:23.473089 ignition[961]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Apr 16 00:18:23.478110 ignition[961]: INFO : files: files passed
Apr 16 00:18:23.478110 ignition[961]: INFO : Ignition finished successfully
Apr 16 00:18:23.478509 systemd[1]: Finished ignition-files.service - Ignition (files).
Apr 16 00:18:23.484841 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Apr 16 00:18:23.502065 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 16 00:18:23.511194 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 16 00:18:23.511319 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 16 00:18:23.521004 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 00:18:23.521004 initrd-setup-root-after-ignition[989]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 00:18:23.525249 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 00:18:23.527633 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 16 00:18:23.528815 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 16 00:18:23.534923 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 16 00:18:23.577179 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 16 00:18:23.578063 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 16 00:18:23.580257 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 16 00:18:23.583750 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 16 00:18:23.584454 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 16 00:18:23.593999 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 16 00:18:23.613382 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 16 00:18:23.620803 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 16 00:18:23.636000 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 16 00:18:23.637505 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 00:18:23.639143 systemd[1]: Stopped target timers.target - Timer Units.
Apr 16 00:18:23.640072 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 16 00:18:23.640245 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 16 00:18:23.642120 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 16 00:18:23.643922 systemd[1]: Stopped target basic.target - Basic System.
Apr 16 00:18:23.645164 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 16 00:18:23.646543 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 16 00:18:23.647696 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 16 00:18:23.648859 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 16 00:18:23.649852 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 16 00:18:23.651048 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 16 00:18:23.652214 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 16 00:18:23.653279 systemd[1]: Stopped target swap.target - Swaps.
Apr 16 00:18:23.654292 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 16 00:18:23.654467 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 16 00:18:23.655922 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 16 00:18:23.657115 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 00:18:23.658227 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 16 00:18:23.658341 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 00:18:23.659429 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 16 00:18:23.659624 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 16 00:18:23.661375 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 16 00:18:23.661541 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 16 00:18:23.662780 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 16 00:18:23.662937 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 16 00:18:23.664083 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Apr 16 00:18:23.664242 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 16 00:18:23.676445 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 16 00:18:23.677397 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 16 00:18:23.677577 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 00:18:23.681895 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 16 00:18:23.682598 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 16 00:18:23.683231 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 16 00:18:23.684899 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 16 00:18:23.685837 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 16 00:18:23.698037 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 16 00:18:23.698273 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 16 00:18:23.702704 ignition[1013]: INFO : Ignition 2.19.0
Apr 16 00:18:23.702704 ignition[1013]: INFO : Stage: umount
Apr 16 00:18:23.705336 ignition[1013]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 00:18:23.705336 ignition[1013]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 16 00:18:23.705336 ignition[1013]: INFO : umount: umount passed
Apr 16 00:18:23.705336 ignition[1013]: INFO : Ignition finished successfully
Apr 16 00:18:23.708411 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 16 00:18:23.709755 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 16 00:18:23.713350 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 16 00:18:23.713405 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 16 00:18:23.715372 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 16 00:18:23.716450 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 16 00:18:23.717364 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 16 00:18:23.717411 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 16 00:18:23.718433 systemd[1]: Stopped target network.target - Network.
Apr 16 00:18:23.719410 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 16 00:18:23.719463 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 16 00:18:23.721369 systemd[1]: Stopped target paths.target - Path Units.
Apr 16 00:18:23.723166 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 16 00:18:23.726736 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 00:18:23.728881 systemd[1]: Stopped target slices.target - Slice Units.
Apr 16 00:18:23.730414 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 16 00:18:23.731543 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 16 00:18:23.731613 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 16 00:18:23.732580 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 16 00:18:23.732649 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 16 00:18:23.733630 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 16 00:18:23.733730 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 16 00:18:23.734629 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 16 00:18:23.734673 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 16 00:18:23.735782 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 16 00:18:23.736750 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 16 00:18:23.737410 systemd-networkd[781]: eth1: DHCPv6 lease lost
Apr 16 00:18:23.738121 systemd-networkd[781]: eth0: DHCPv6 lease lost
Apr 16 00:18:23.740486 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 16 00:18:23.741191 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 16 00:18:23.741306 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 16 00:18:23.744175 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 16 00:18:23.744265 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 16 00:18:23.745410 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 16 00:18:23.745496 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 16 00:18:23.749320 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 16 00:18:23.749381 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 00:18:23.750516 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 16 00:18:23.750572 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 16 00:18:23.755838 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 16 00:18:23.756344 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 16 00:18:23.756418 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 16 00:18:23.759774 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 16 00:18:23.759840 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 16 00:18:23.760470 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 16 00:18:23.760514 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 16 00:18:23.761375 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 16 00:18:23.761421 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 00:18:23.762363 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 00:18:23.783576 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 16 00:18:23.783849 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 00:18:23.786184 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 16 00:18:23.786311 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 16 00:18:23.787829 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 16 00:18:23.787904 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 16 00:18:23.789163 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 16 00:18:23.789198 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 00:18:23.790257 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 16 00:18:23.790307 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 16 00:18:23.791763 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 16 00:18:23.791808 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 16 00:18:23.793421 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 16 00:18:23.793473 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 16 00:18:23.804359 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 16 00:18:23.805846 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 16 00:18:23.805956 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 16 00:18:23.811993 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 16 00:18:23.812065 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:18:23.814169 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 16 00:18:23.814277 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 16 00:18:23.816014 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 16 00:18:23.820801 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 16 00:18:23.834102 systemd[1]: Switching root.
Apr 16 00:18:23.866770 systemd-journald[237]: Journal stopped
Apr 16 00:18:24.738030 systemd-journald[237]: Received SIGTERM from PID 1 (systemd).
Apr 16 00:18:24.738093 kernel: SELinux: policy capability network_peer_controls=1
Apr 16 00:18:24.738109 kernel: SELinux: policy capability open_perms=1
Apr 16 00:18:24.738123 kernel: SELinux: policy capability extended_socket_class=1
Apr 16 00:18:24.738133 kernel: SELinux: policy capability always_check_network=0
Apr 16 00:18:24.738143 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 16 00:18:24.738153 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 16 00:18:24.738166 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 16 00:18:24.738176 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 16 00:18:24.738185 kernel: audit: type=1403 audit(1776298704.014:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 16 00:18:24.738197 systemd[1]: Successfully loaded SELinux policy in 35.624ms.
Apr 16 00:18:24.738235 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.916ms.
Apr 16 00:18:24.738250 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 16 00:18:24.738270 systemd[1]: Detected virtualization kvm.
Apr 16 00:18:24.738282 systemd[1]: Detected architecture arm64.
Apr 16 00:18:24.738293 systemd[1]: Detected first boot.
Apr 16 00:18:24.738306 systemd[1]: Hostname set to .
Apr 16 00:18:24.738316 systemd[1]: Initializing machine ID from VM UUID.
Apr 16 00:18:24.738327 zram_generator::config[1055]: No configuration found.
Apr 16 00:18:24.738337 systemd[1]: Populated /etc with preset unit settings.
Apr 16 00:18:24.738348 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Apr 16 00:18:24.738358 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Apr 16 00:18:24.738369 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Apr 16 00:18:24.738380 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 16 00:18:24.738392 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 16 00:18:24.738403 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 16 00:18:24.738414 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 16 00:18:24.738425 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 16 00:18:24.738435 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 16 00:18:24.738446 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 16 00:18:24.738457 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 16 00:18:24.738467 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 00:18:24.738479 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 00:18:24.738492 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 16 00:18:24.738507 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 16 00:18:24.738519 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 16 00:18:24.738530 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 16 00:18:24.738540 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Apr 16 00:18:24.738551 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 00:18:24.738562 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Apr 16 00:18:24.738574 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Apr 16 00:18:24.738585 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Apr 16 00:18:24.738596 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 16 00:18:24.741081 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 00:18:24.741115 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 16 00:18:24.741126 systemd[1]: Reached target slices.target - Slice Units.
Apr 16 00:18:24.741137 systemd[1]: Reached target swap.target - Swaps.
Apr 16 00:18:24.741148 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 16 00:18:24.741163 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 16 00:18:24.741207 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 00:18:24.741220 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 16 00:18:24.741232 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 00:18:24.741242 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 16 00:18:24.741253 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 16 00:18:24.741360 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 16 00:18:24.741374 systemd[1]: Mounting media.mount - External Media Directory...
Apr 16 00:18:24.741386 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 16 00:18:24.741401 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 16 00:18:24.741412 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 16 00:18:24.741424 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 16 00:18:24.741435 systemd[1]: Reached target machines.target - Containers.
Apr 16 00:18:24.741446 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 16 00:18:24.741457 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:18:24.741469 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 16 00:18:24.741480 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 16 00:18:24.742125 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 00:18:24.742168 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 16 00:18:24.742187 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 00:18:24.742201 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 16 00:18:24.742212 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 00:18:24.742223 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 16 00:18:24.742241 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Apr 16 00:18:24.742252 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Apr 16 00:18:24.742262 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Apr 16 00:18:24.742273 systemd[1]: Stopped systemd-fsck-usr.service.
Apr 16 00:18:24.742283 kernel: fuse: init (API version 7.39)
Apr 16 00:18:24.742295 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 16 00:18:24.742306 kernel: ACPI: bus type drm_connector registered
Apr 16 00:18:24.742316 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 16 00:18:24.742327 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 16 00:18:24.742340 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 16 00:18:24.742351 kernel: loop: module loaded
Apr 16 00:18:24.742361 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 16 00:18:24.742372 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 16 00:18:24.742383 systemd[1]: Stopped verity-setup.service.
Apr 16 00:18:24.742394 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 16 00:18:24.742442 systemd-journald[1122]: Collecting audit messages is disabled.
Apr 16 00:18:24.742467 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 16 00:18:24.742479 systemd[1]: Mounted media.mount - External Media Directory.
Apr 16 00:18:24.742490 systemd-journald[1122]: Journal started
Apr 16 00:18:24.742513 systemd-journald[1122]: Runtime Journal (/run/log/journal/278f4662e4754920b99574c6ec17f1d5) is 8.0M, max 76.6M, 68.6M free.
Apr 16 00:18:24.744648 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 16 00:18:24.497313 systemd[1]: Queued start job for default target multi-user.target.
Apr 16 00:18:24.523541 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Apr 16 00:18:24.524013 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 16 00:18:24.748156 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 16 00:18:24.748349 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 16 00:18:24.750744 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 16 00:18:24.751913 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 00:18:24.753748 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 16 00:18:24.753904 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 16 00:18:24.756034 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 00:18:24.756186 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 00:18:24.757829 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 16 00:18:24.757971 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 16 00:18:24.759046 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 00:18:24.759181 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 00:18:24.762404 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 16 00:18:24.763524 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 16 00:18:24.763697 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 16 00:18:24.764838 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 00:18:24.764973 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 00:18:24.765938 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 16 00:18:24.768437 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 16 00:18:24.771322 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 16 00:18:24.784153 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 16 00:18:24.791888 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 16 00:18:24.796107 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 16 00:18:24.796788 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 16 00:18:24.796822 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 16 00:18:24.798386 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Apr 16 00:18:24.809963 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 16 00:18:24.816580 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 16 00:18:24.817314 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:18:24.829823 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 16 00:18:24.835130 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 16 00:18:24.837789 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 16 00:18:24.838968 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 16 00:18:24.840843 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 16 00:18:24.846864 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 16 00:18:24.851808 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 16 00:18:24.854246 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 16 00:18:24.859202 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 16 00:18:24.860669 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 16 00:18:24.861936 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 16 00:18:24.886347 systemd-journald[1122]: Time spent on flushing to /var/log/journal/278f4662e4754920b99574c6ec17f1d5 is 70.133ms for 1122 entries.
Apr 16 00:18:24.886347 systemd-journald[1122]: System Journal (/var/log/journal/278f4662e4754920b99574c6ec17f1d5) is 8.0M, max 584.8M, 576.8M free.
Apr 16 00:18:24.975807 systemd-journald[1122]: Received client request to flush runtime journal.
Apr 16 00:18:24.975861 kernel: loop0: detected capacity change from 0 to 197488
Apr 16 00:18:24.975892 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 16 00:18:24.899403 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 16 00:18:24.900495 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 16 00:18:24.910026 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Apr 16 00:18:24.942993 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 16 00:18:24.954862 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Apr 16 00:18:24.958749 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 16 00:18:24.961680 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 16 00:18:24.963481 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Apr 16 00:18:24.982096 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 16 00:18:24.983638 kernel: loop1: detected capacity change from 0 to 114432
Apr 16 00:18:24.987526 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 16 00:18:24.997001 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 16 00:18:25.002052 udevadm[1182]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Apr 16 00:18:25.044622 kernel: loop2: detected capacity change from 0 to 8
Apr 16 00:18:25.051634 systemd-tmpfiles[1188]: ACLs are not supported, ignoring.
Apr 16 00:18:25.051658 systemd-tmpfiles[1188]: ACLs are not supported, ignoring.
Apr 16 00:18:25.061284 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 16 00:18:25.061744 kernel: loop3: detected capacity change from 0 to 114328
Apr 16 00:18:25.110665 kernel: loop4: detected capacity change from 0 to 197488
Apr 16 00:18:25.128633 kernel: loop5: detected capacity change from 0 to 114432
Apr 16 00:18:25.155200 kernel: loop6: detected capacity change from 0 to 8
Apr 16 00:18:25.160671 kernel: loop7: detected capacity change from 0 to 114328
Apr 16 00:18:25.177956 (sd-merge)[1194]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Apr 16 00:18:25.178692 (sd-merge)[1194]: Merged extensions into '/usr'.
Apr 16 00:18:25.187729 systemd[1]: Reloading requested from client PID 1169 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 16 00:18:25.187749 systemd[1]: Reloading...
Apr 16 00:18:25.315673 zram_generator::config[1221]: No configuration found.
Apr 16 00:18:25.350658 ldconfig[1164]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 16 00:18:25.450514 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:18:25.509914 systemd[1]: Reloading finished in 321 ms.
Apr 16 00:18:25.538438 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 16 00:18:25.543666 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 16 00:18:25.552962 systemd[1]: Starting ensure-sysext.service...
Apr 16 00:18:25.559805 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 16 00:18:25.573784 systemd[1]: Reloading requested from client PID 1257 ('systemctl') (unit ensure-sysext.service)...
Apr 16 00:18:25.573804 systemd[1]: Reloading...
Apr 16 00:18:25.596943 systemd-tmpfiles[1258]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 16 00:18:25.597204 systemd-tmpfiles[1258]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 16 00:18:25.597881 systemd-tmpfiles[1258]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 16 00:18:25.598095 systemd-tmpfiles[1258]: ACLs are not supported, ignoring.
Apr 16 00:18:25.598142 systemd-tmpfiles[1258]: ACLs are not supported, ignoring.
Apr 16 00:18:25.601372 systemd-tmpfiles[1258]: Detected autofs mount point /boot during canonicalization of boot.
Apr 16 00:18:25.601385 systemd-tmpfiles[1258]: Skipping /boot
Apr 16 00:18:25.618135 systemd-tmpfiles[1258]: Detected autofs mount point /boot during canonicalization of boot.
Apr 16 00:18:25.618148 systemd-tmpfiles[1258]: Skipping /boot
Apr 16 00:18:25.655642 zram_generator::config[1281]: No configuration found.
Apr 16 00:18:25.770515 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:18:25.825881 systemd[1]: Reloading finished in 251 ms.
Apr 16 00:18:25.848971 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 16 00:18:25.872670 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 00:18:25.887191 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 16 00:18:25.894015 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 16 00:18:25.903208 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 16 00:18:25.908220 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 16 00:18:25.913222 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 00:18:25.929362 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 16 00:18:25.935501 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:18:25.951046 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 00:18:25.959371 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 00:18:25.964995 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 00:18:25.965953 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:18:25.967904 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 16 00:18:25.977377 systemd-udevd[1334]: Using default interface naming scheme 'v255'.
Apr 16 00:18:25.983496 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 16 00:18:25.988418 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:18:25.990748 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:18:25.997823 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 16 00:18:26.002406 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:18:26.007955 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 16 00:18:26.008846 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:18:26.009461 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 00:18:26.010896 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 00:18:26.017118 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 00:18:26.032487 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 16 00:18:26.035253 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 16 00:18:26.052410 systemd[1]: Finished ensure-sysext.service.
Apr 16 00:18:26.055206 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 16 00:18:26.065096 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Apr 16 00:18:26.065843 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 16 00:18:26.075198 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 00:18:26.076725 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 00:18:26.078983 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 16 00:18:26.083677 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 16 00:18:26.084769 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 00:18:26.085343 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 00:18:26.086269 augenrules[1378]: No rules
Apr 16 00:18:26.087482 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 16 00:18:26.090491 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 16 00:18:26.097509 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 16 00:18:26.098526 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 16 00:18:26.156278 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Apr 16 00:18:26.156375 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 16 00:18:26.275199 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Apr 16 00:18:26.279273 systemd[1]: Reached target time-set.target - System Time Set.
Apr 16 00:18:26.317625 kernel: mousedev: PS/2 mouse device common for all mice
Apr 16 00:18:26.322366 systemd-resolved[1333]: Positive Trust Anchors:
Apr 16 00:18:26.323665 systemd-resolved[1333]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 16 00:18:26.323822 systemd-resolved[1333]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 16 00:18:26.325638 systemd-networkd[1354]: lo: Link UP
Apr 16 00:18:26.325650 systemd-networkd[1354]: lo: Gained carrier
Apr 16 00:18:26.327398 systemd-networkd[1354]: Enumeration completed
Apr 16 00:18:26.327511 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 16 00:18:26.333318 systemd-resolved[1333]: Using system hostname 'ci-4081-3-6-n-42941c021f'.
Apr 16 00:18:26.336877 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 16 00:18:26.338627 systemd-networkd[1354]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:18:26.338638 systemd-networkd[1354]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 00:18:26.340240 systemd-networkd[1354]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:18:26.340258 systemd-networkd[1354]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 00:18:26.340807 systemd-networkd[1354]: eth0: Link UP
Apr 16 00:18:26.340817 systemd-networkd[1354]: eth0: Gained carrier
Apr 16 00:18:26.340834 systemd-networkd[1354]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:18:26.342368 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 16 00:18:26.344849 systemd[1]: Reached target network.target - Network.
Apr 16 00:18:26.346342 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 16 00:18:26.350187 systemd-networkd[1354]: eth1: Link UP
Apr 16 00:18:26.350214 systemd-networkd[1354]: eth1: Gained carrier
Apr 16 00:18:26.350255 systemd-networkd[1354]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:18:26.358260 systemd-networkd[1354]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 00:18:26.384644 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1351)
Apr 16 00:18:26.393898 systemd-networkd[1354]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 16 00:18:26.397675 systemd-networkd[1354]: eth0: DHCPv4 address 78.46.194.74/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 16 00:18:26.398205 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection.
Apr 16 00:18:26.398285 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection.
Apr 16 00:18:26.400042 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection.
Apr 16 00:18:26.419308 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Apr 16 00:18:26.420993 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 00:18:26.428995 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 00:18:26.433505 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 00:18:26.437500 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 00:18:26.438289 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 00:18:26.438336 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 16 00:18:26.438671 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 00:18:26.438986 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 00:18:26.462416 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 00:18:26.462665 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 00:18:26.464143 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 00:18:26.464308 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 00:18:26.466219 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 16 00:18:26.466298 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 16 00:18:26.471288 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Apr 16 00:18:26.482247 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Apr 16 00:18:26.482318 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Apr 16 00:18:26.482332 kernel: [drm] features: -context_init
Apr 16 00:18:26.485042 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 16 00:18:26.489077 kernel: [drm] number of scanouts: 1
Apr 16 00:18:26.496387 kernel: [drm] number of cap sets: 0
Apr 16 00:18:26.507005 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Apr 16 00:18:26.519661 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 16 00:18:26.521117 kernel: Console: switching to colour frame buffer device 160x50
Apr 16 00:18:26.528659 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Apr 16 00:18:26.539996 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 16 00:18:26.618735 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 00:18:26.664326 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Apr 16 00:18:26.670898 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Apr 16 00:18:26.692047 lvm[1434]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 16 00:18:26.723982 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Apr 16 00:18:26.726117 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 16 00:18:26.727100 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 16 00:18:26.728223 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 16 00:18:26.729267 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 16 00:18:26.730354 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 16 00:18:26.731298 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 16 00:18:26.732234 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 16 00:18:26.733090 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 16 00:18:26.733130 systemd[1]: Reached target paths.target - Path Units.
Apr 16 00:18:26.733674 systemd[1]: Reached target timers.target - Timer Units.
Apr 16 00:18:26.735437 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 16 00:18:26.738091 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 16 00:18:26.744455 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 16 00:18:26.747212 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Apr 16 00:18:26.748540 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 16 00:18:26.749386 systemd[1]: Reached target sockets.target - Socket Units.
Apr 16 00:18:26.750144 systemd[1]: Reached target basic.target - Basic System.
Apr 16 00:18:26.750858 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 16 00:18:26.750890 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 16 00:18:26.755861 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 16 00:18:26.760865 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 16 00:18:26.761675 lvm[1438]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 16 00:18:26.770899 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 16 00:18:26.775852 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 16 00:18:26.778801 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 16 00:18:26.781786 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 16 00:18:26.790937 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 16 00:18:26.794280 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 16 00:18:26.801424 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Apr 16 00:18:26.806876 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 16 00:18:26.811838 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 16 00:18:26.817366 dbus-daemon[1441]: [system] SELinux support is enabled
Apr 16 00:18:26.819841 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 16 00:18:26.821282 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 16 00:18:26.823270 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 16 00:18:26.826210 systemd[1]: Starting update-engine.service - Update Engine...
Apr 16 00:18:26.831513 extend-filesystems[1445]: Found loop4
Apr 16 00:18:26.833599 extend-filesystems[1445]: Found loop5
Apr 16 00:18:26.833666 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 16 00:18:26.834237 extend-filesystems[1445]: Found loop6
Apr 16 00:18:26.835098 extend-filesystems[1445]: Found loop7
Apr 16 00:18:26.835142 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 16 00:18:26.835636 extend-filesystems[1445]: Found sda
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda1
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda2
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda3
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found usr
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda4
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda6
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda7
Apr 16 00:18:26.839328 extend-filesystems[1445]: Found sda9
Apr 16 00:18:26.839328 extend-filesystems[1445]: Checking size of /dev/sda9
Apr 16 00:18:26.841876 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Apr 16 00:18:26.850950 jq[1442]: false
Apr 16 00:18:26.852027 coreos-metadata[1440]: Apr 16 00:18:26.851 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Apr 16 00:18:26.854415 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 16 00:18:26.854484 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 16 00:18:26.856331 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 16 00:18:26.856365 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 16 00:18:26.857184 coreos-metadata[1440]: Apr 16 00:18:26.857 INFO Fetch successful Apr 16 00:18:26.857184 coreos-metadata[1440]: Apr 16 00:18:26.857 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Apr 16 00:18:26.861712 coreos-metadata[1440]: Apr 16 00:18:26.860 INFO Fetch successful Apr 16 00:18:26.866037 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Apr 16 00:18:26.866242 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Apr 16 00:18:26.900012 jq[1454]: true Apr 16 00:18:26.900288 extend-filesystems[1445]: Resized partition /dev/sda9 Apr 16 00:18:26.912115 extend-filesystems[1479]: resize2fs 1.47.1 (20-May-2024) Apr 16 00:18:26.913418 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Apr 16 00:18:26.916164 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Apr 16 00:18:26.914382 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Apr 16 00:18:26.931512 update_engine[1453]: I20260416 00:18:26.931165 1453 main.cc:92] Flatcar Update Engine starting Apr 16 00:18:26.934125 systemd[1]: Started update-engine.service - Update Engine. Apr 16 00:18:26.934353 update_engine[1453]: I20260416 00:18:26.934265 1453 update_check_scheduler.cc:74] Next update check in 10m46s Apr 16 00:18:26.939979 systemd[1]: motdgen.service: Deactivated successfully. Apr 16 00:18:26.940242 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Apr 16 00:18:26.940941 (ntainerd)[1477]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Apr 16 00:18:26.948411 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Apr 16 00:18:26.962591 tar[1461]: linux-arm64/LICENSE Apr 16 00:18:26.962591 tar[1461]: linux-arm64/helm Apr 16 00:18:26.972459 jq[1480]: true Apr 16 00:18:27.042631 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1373) Apr 16 00:18:27.059532 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Apr 16 00:18:27.088632 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Apr 16 00:18:27.104713 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Apr 16 00:18:27.117713 extend-filesystems[1479]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Apr 16 00:18:27.117713 extend-filesystems[1479]: old_desc_blocks = 1, new_desc_blocks = 5 Apr 16 00:18:27.117713 extend-filesystems[1479]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Apr 16 00:18:27.121984 extend-filesystems[1445]: Resized filesystem in /dev/sda9 Apr 16 00:18:27.121984 extend-filesystems[1445]: Found sr0 Apr 16 00:18:27.144861 systemd[1]: extend-filesystems.service: Deactivated successfully. Apr 16 00:18:27.145544 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Apr 16 00:18:27.153969 systemd-logind[1452]: New seat seat0. Apr 16 00:18:27.158092 systemd-logind[1452]: Watching system buttons on /dev/input/event0 (Power Button) Apr 16 00:18:27.158120 systemd-logind[1452]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Apr 16 00:18:27.159980 systemd[1]: Started systemd-logind.service - User Login Management. Apr 16 00:18:27.164245 bash[1515]: Updated "/home/core/.ssh/authorized_keys" Apr 16 00:18:27.169950 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Apr 16 00:18:27.185217 systemd[1]: Starting sshkeys.service... Apr 16 00:18:27.209562 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
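Annotation (not part of the log): the resize2fs entries above report the root ext4 filesystem on /dev/sda9 growing online from 1617920 to 9393147 blocks. At the 4 KiB block size the log states, that arithmetic can be checked directly; the MiB figures below are derived from the logged block counts, nothing else is assumed.

```shell
# Derived from the logged resize: 1617920 -> 9393147 blocks, 4 KiB (4096 B) each.
old_bytes=$((1617920 * 4096))
new_bytes=$((9393147 * 4096))
echo "grew from $((old_bytes / 1048576)) MiB to $((new_bytes / 1048576)) MiB"
# prints: grew from 6320 MiB to 36691 MiB
```

This is the online-resize path (the filesystem stays mounted on /); Flatcar's extend-filesystems.service drives it automatically, which is why no manual resize2fs invocation appears in the log.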
Apr 16 00:18:27.221093 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Apr 16 00:18:27.252765 containerd[1477]: time="2026-04-16T00:18:27.250320320Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Apr 16 00:18:27.258581 coreos-metadata[1520]: Apr 16 00:18:27.257 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Apr 16 00:18:27.260834 coreos-metadata[1520]: Apr 16 00:18:27.260 INFO Fetch successful Apr 16 00:18:27.264577 unknown[1520]: wrote ssh authorized keys file for user: core Apr 16 00:18:27.312032 update-ssh-keys[1526]: Updated "/home/core/.ssh/authorized_keys" Apr 16 00:18:27.313232 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Apr 16 00:18:27.321653 systemd[1]: Finished sshkeys.service. Apr 16 00:18:27.329642 containerd[1477]: time="2026-04-16T00:18:27.328994840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.330581760Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.331225840Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.331248040Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.331438640Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." 
type=io.containerd.warning.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.331460160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.331525360Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:18:27.331721 containerd[1477]: time="2026-04-16T00:18:27.331538160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Apr 16 00:18:27.332457 containerd[1477]: time="2026-04-16T00:18:27.332426800Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:18:27.332775 containerd[1477]: time="2026-04-16T00:18:27.332755520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Apr 16 00:18:27.332844 containerd[1477]: time="2026-04-16T00:18:27.332829680Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:18:27.333035 containerd[1477]: time="2026-04-16T00:18:27.333018040Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Apr 16 00:18:27.333493 containerd[1477]: time="2026-04-16T00:18:27.333469240Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Apr 16 00:18:27.334737 containerd[1477]: time="2026-04-16T00:18:27.334142960Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
type=io.containerd.snapshotter.v1 Apr 16 00:18:27.334737 containerd[1477]: time="2026-04-16T00:18:27.334271640Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Apr 16 00:18:27.334737 containerd[1477]: time="2026-04-16T00:18:27.334287720Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Apr 16 00:18:27.334737 containerd[1477]: time="2026-04-16T00:18:27.334360280Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Apr 16 00:18:27.334737 containerd[1477]: time="2026-04-16T00:18:27.334402760Z" level=info msg="metadata content store policy set" policy=shared Apr 16 00:18:27.346277 containerd[1477]: time="2026-04-16T00:18:27.346228280Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Apr 16 00:18:27.346444 containerd[1477]: time="2026-04-16T00:18:27.346429200Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Apr 16 00:18:27.346721 containerd[1477]: time="2026-04-16T00:18:27.346658800Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Apr 16 00:18:27.346843 containerd[1477]: time="2026-04-16T00:18:27.346825760Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Apr 16 00:18:27.347748 containerd[1477]: time="2026-04-16T00:18:27.347573000Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.347996760Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." 
type=io.containerd.monitor.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348294240Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348401400Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348420960Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348433400Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348446640Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348458640Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348470120Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348483200Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348496960Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348509840Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." 
type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348521640Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348533920Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Apr 16 00:18:27.349026 containerd[1477]: time="2026-04-16T00:18:27.348554120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.349299 containerd[1477]: time="2026-04-16T00:18:27.348567800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.349299 containerd[1477]: time="2026-04-16T00:18:27.348579440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.349299 containerd[1477]: time="2026-04-16T00:18:27.348592400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350044 containerd[1477]: time="2026-04-16T00:18:27.350020440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350164 containerd[1477]: time="2026-04-16T00:18:27.350147480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350430520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350452280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350467600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." 
type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350485400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350498800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350510240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350522360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350548840Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350577080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.350768 containerd[1477]: time="2026-04-16T00:18:27.350589520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351624440Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351783640Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351807720Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351819280Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351831200Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351840680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351853320Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351863720Z" level=info msg="NRI interface is disabled by configuration." Apr 16 00:18:27.352755 containerd[1477]: time="2026-04-16T00:18:27.351876720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Apr 16 00:18:27.352975 containerd[1477]: time="2026-04-16T00:18:27.352237520Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Apr 16 00:18:27.352975 containerd[1477]: time="2026-04-16T00:18:27.352297520Z" level=info msg="Connect containerd service" Apr 16 00:18:27.352975 containerd[1477]: time="2026-04-16T00:18:27.352324160Z" level=info msg="using legacy CRI server" Apr 16 00:18:27.352975 containerd[1477]: time="2026-04-16T00:18:27.352330680Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Apr 16 00:18:27.352975 containerd[1477]: time="2026-04-16T00:18:27.352414760Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Apr 16 00:18:27.356184 containerd[1477]: time="2026-04-16T00:18:27.356145120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 16 00:18:27.357581 containerd[1477]: time="2026-04-16T00:18:27.356823840Z" level=info msg="Start subscribing containerd event" Apr 16 00:18:27.357581 containerd[1477]: time="2026-04-16T00:18:27.356898240Z" level=info msg="Start recovering state" Apr 16 00:18:27.357581 containerd[1477]: time="2026-04-16T00:18:27.356984240Z" level=info msg="Start event monitor" Apr 16 00:18:27.357581 containerd[1477]: time="2026-04-16T00:18:27.356997440Z" level=info msg="Start 
snapshots syncer" Apr 16 00:18:27.357581 containerd[1477]: time="2026-04-16T00:18:27.357007880Z" level=info msg="Start cni network conf syncer for default" Apr 16 00:18:27.357581 containerd[1477]: time="2026-04-16T00:18:27.357016120Z" level=info msg="Start streaming server" Apr 16 00:18:27.358289 containerd[1477]: time="2026-04-16T00:18:27.358264280Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Apr 16 00:18:27.358769 containerd[1477]: time="2026-04-16T00:18:27.358745840Z" level=info msg=serving... address=/run/containerd/containerd.sock Apr 16 00:18:27.359494 systemd[1]: Started containerd.service - containerd container runtime. Apr 16 00:18:27.361102 containerd[1477]: time="2026-04-16T00:18:27.360421800Z" level=info msg="containerd successfully booted in 0.112902s" Apr 16 00:18:27.428208 locksmithd[1489]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Apr 16 00:18:27.629316 sshd_keygen[1488]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Apr 16 00:18:27.652327 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Apr 16 00:18:27.663179 systemd[1]: Starting issuegen.service - Generate /run/issue... Apr 16 00:18:27.671335 systemd[1]: issuegen.service: Deactivated successfully. Apr 16 00:18:27.671817 systemd[1]: Finished issuegen.service - Generate /run/issue. Apr 16 00:18:27.679194 tar[1461]: linux-arm64/README.md Apr 16 00:18:27.684767 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Apr 16 00:18:27.696721 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Apr 16 00:18:27.701067 systemd[1]: Started getty@tty1.service - Getty on tty1. Apr 16 00:18:27.704082 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Apr 16 00:18:27.706796 systemd[1]: Reached target getty.target - Login Prompts. Apr 16 00:18:27.708710 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Apr 16 00:18:27.891869 systemd-networkd[1354]: eth0: Gained IPv6LL Apr 16 00:18:27.894796 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Apr 16 00:18:27.897461 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Apr 16 00:18:27.899991 systemd[1]: Reached target network-online.target - Network is Online. Apr 16 00:18:27.914089 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:18:27.918083 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Apr 16 00:18:27.953794 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Apr 16 00:18:28.403959 systemd-networkd[1354]: eth1: Gained IPv6LL Apr 16 00:18:28.404780 systemd-timesyncd[1379]: Network configuration changed, trying to establish connection. Apr 16 00:18:28.700971 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:18:28.701404 (kubelet)[1570]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 00:18:28.703711 systemd[1]: Reached target multi-user.target - Multi-User System. Apr 16 00:18:28.704846 systemd[1]: Startup finished in 781ms (kernel) + 5.322s (initrd) + 4.725s (userspace) = 10.829s. Apr 16 00:18:29.145949 kubelet[1570]: E0416 00:18:29.145831 1570 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 00:18:29.148848 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 00:18:29.149349 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 00:18:39.399911 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Apr 16 00:18:39.414081 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:18:39.540340 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:18:39.550477 (kubelet)[1589]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 00:18:39.591564 kubelet[1589]: E0416 00:18:39.591502 1589 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 00:18:39.596194 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 00:18:39.596446 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 00:18:49.846794 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Apr 16 00:18:49.859007 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 00:18:49.980401 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 00:18:49.985485 (kubelet)[1604]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 00:18:50.033087 kubelet[1604]: E0416 00:18:50.033020 1604 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 00:18:50.036598 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 00:18:50.036974 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
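Annotation (not part of the log): the repeating kubelet failures above all have the same cause the error message states — `/var/lib/kubelet/config.yaml` does not exist yet. That file is normally written by `kubeadm init` or `kubeadm join`, so the crash-loop (with systemd's scheduled restarts) is expected on a node that has not joined a cluster. A minimal sketch of the precondition check, purely for illustration — the path is from the log, the helper function is hypothetical:

```shell
# Hedged sketch: report whether the config file kubelet needs is present.
# check_kubelet_cfg is an illustrative helper, not a real Flatcar/k8s tool.
check_kubelet_cfg() {
    cfg="$1"
    if [ -f "$cfg" ]; then
        echo "present: $cfg"
    else
        echo "missing: $cfg"   # kubelet will exit 1 until kubeadm writes it
    fi
}
check_kubelet_cfg /var/lib/kubelet/config.yaml
```

Until the file appears, each restart cycle in the log (counters 1, 2, 3, …) will end with the same `status=1/FAILURE`.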
Apr 16 00:18:58.656528 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Apr 16 00:18:58.661929 systemd-timesyncd[1379]: Contacted time server 46.38.233.159:123 (2.flatcar.pool.ntp.org). Apr 16 00:18:58.662004 systemd-timesyncd[1379]: Initial clock synchronization to Thu 2026-04-16 00:18:58.689820 UTC. Apr 16 00:18:58.666149 systemd[1]: Started sshd@0-78.46.194.74:22-4.175.71.9:40938.service - OpenSSH per-connection server daemon (4.175.71.9:40938). Apr 16 00:18:58.801632 sshd[1611]: Accepted publickey for core from 4.175.71.9 port 40938 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:18:58.803885 sshd[1611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:18:58.814956 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Apr 16 00:18:58.823397 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Apr 16 00:18:58.828433 systemd-logind[1452]: New session 1 of user core. Apr 16 00:18:58.838315 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Apr 16 00:18:58.852338 systemd[1]: Starting user@500.service - User Manager for UID 500... Apr 16 00:18:58.858136 (systemd)[1615]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Apr 16 00:18:58.972009 systemd[1615]: Queued start job for default target default.target. Apr 16 00:18:58.983806 systemd[1615]: Created slice app.slice - User Application Slice. Apr 16 00:18:58.983872 systemd[1615]: Reached target paths.target - Paths. Apr 16 00:18:58.983901 systemd[1615]: Reached target timers.target - Timers. Apr 16 00:18:58.986399 systemd[1615]: Starting dbus.socket - D-Bus User Message Bus Socket... Apr 16 00:18:59.001622 systemd[1615]: Listening on dbus.socket - D-Bus User Message Bus Socket. Apr 16 00:18:59.001748 systemd[1615]: Reached target sockets.target - Sockets. 
Apr 16 00:18:59.001761 systemd[1615]: Reached target basic.target - Basic System. Apr 16 00:18:59.001805 systemd[1615]: Reached target default.target - Main User Target. Apr 16 00:18:59.001833 systemd[1615]: Startup finished in 136ms. Apr 16 00:18:59.002369 systemd[1]: Started user@500.service - User Manager for UID 500. Apr 16 00:18:59.015262 systemd[1]: Started session-1.scope - Session 1 of User core. Apr 16 00:18:59.138507 systemd[1]: Started sshd@1-78.46.194.74:22-4.175.71.9:40940.service - OpenSSH per-connection server daemon (4.175.71.9:40940). Apr 16 00:18:59.252625 sshd[1626]: Accepted publickey for core from 4.175.71.9 port 40940 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:18:59.254969 sshd[1626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:18:59.260674 systemd-logind[1452]: New session 2 of user core. Apr 16 00:18:59.271918 systemd[1]: Started session-2.scope - Session 2 of User core. Apr 16 00:18:59.372048 sshd[1626]: pam_unix(sshd:session): session closed for user core Apr 16 00:18:59.376111 systemd-logind[1452]: Session 2 logged out. Waiting for processes to exit. Apr 16 00:18:59.376376 systemd[1]: sshd@1-78.46.194.74:22-4.175.71.9:40940.service: Deactivated successfully. Apr 16 00:18:59.378520 systemd[1]: session-2.scope: Deactivated successfully. Apr 16 00:18:59.380480 systemd-logind[1452]: Removed session 2. Apr 16 00:18:59.399928 systemd[1]: Started sshd@2-78.46.194.74:22-4.175.71.9:40956.service - OpenSSH per-connection server daemon (4.175.71.9:40956). Apr 16 00:18:59.517355 sshd[1633]: Accepted publickey for core from 4.175.71.9 port 40956 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:18:59.519912 sshd[1633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:18:59.525122 systemd-logind[1452]: New session 3 of user core. Apr 16 00:18:59.530922 systemd[1]: Started session-3.scope - Session 3 of User core. 
Apr 16 00:18:59.626514 sshd[1633]: pam_unix(sshd:session): session closed for user core Apr 16 00:18:59.631323 systemd[1]: sshd@2-78.46.194.74:22-4.175.71.9:40956.service: Deactivated successfully. Apr 16 00:18:59.633620 systemd[1]: session-3.scope: Deactivated successfully. Apr 16 00:18:59.636980 systemd-logind[1452]: Session 3 logged out. Waiting for processes to exit. Apr 16 00:18:59.638454 systemd-logind[1452]: Removed session 3. Apr 16 00:18:59.658229 systemd[1]: Started sshd@3-78.46.194.74:22-4.175.71.9:40972.service - OpenSSH per-connection server daemon (4.175.71.9:40972). Apr 16 00:18:59.782596 sshd[1640]: Accepted publickey for core from 4.175.71.9 port 40972 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:18:59.785323 sshd[1640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:18:59.791669 systemd-logind[1452]: New session 4 of user core. Apr 16 00:18:59.802000 systemd[1]: Started session-4.scope - Session 4 of User core. Apr 16 00:18:59.904553 sshd[1640]: pam_unix(sshd:session): session closed for user core Apr 16 00:18:59.908959 systemd[1]: sshd@3-78.46.194.74:22-4.175.71.9:40972.service: Deactivated successfully. Apr 16 00:18:59.911219 systemd[1]: session-4.scope: Deactivated successfully. Apr 16 00:18:59.913211 systemd-logind[1452]: Session 4 logged out. Waiting for processes to exit. Apr 16 00:18:59.914319 systemd-logind[1452]: Removed session 4. Apr 16 00:18:59.950681 systemd[1]: Started sshd@4-78.46.194.74:22-4.175.71.9:40984.service - OpenSSH per-connection server daemon (4.175.71.9:40984). Apr 16 00:19:00.048741 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Apr 16 00:19:00.064641 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Apr 16 00:19:00.080670 sshd[1647]: Accepted publickey for core from 4.175.71.9 port 40984 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:19:00.083172 sshd[1647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:19:00.094813 systemd-logind[1452]: New session 5 of user core.
Apr 16 00:19:00.099974 systemd[1]: Started session-5.scope - Session 5 of User core.
Apr 16 00:19:00.202591 sudo[1655]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Apr 16 00:19:00.202937 sudo[1655]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 00:19:00.208036 (kubelet)[1659]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 16 00:19:00.209290 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:00.226878 sudo[1655]: pam_unix(sudo:session): session closed for user root
Apr 16 00:19:00.244901 sshd[1647]: pam_unix(sshd:session): session closed for user core
Apr 16 00:19:00.253579 systemd-logind[1452]: Session 5 logged out. Waiting for processes to exit.
Apr 16 00:19:00.253806 systemd[1]: sshd@4-78.46.194.74:22-4.175.71.9:40984.service: Deactivated successfully.
Apr 16 00:19:00.256874 systemd[1]: session-5.scope: Deactivated successfully.
Apr 16 00:19:00.259022 systemd-logind[1452]: Removed session 5.
Apr 16 00:19:00.271478 kubelet[1659]: E0416 00:19:00.271430 1659 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 16 00:19:00.275423 systemd[1]: Started sshd@5-78.46.194.74:22-4.175.71.9:40994.service - OpenSSH per-connection server daemon (4.175.71.9:40994).
Apr 16 00:19:00.276016 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 16 00:19:00.276157 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 16 00:19:00.393491 sshd[1670]: Accepted publickey for core from 4.175.71.9 port 40994 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:19:00.394676 sshd[1670]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:19:00.398921 systemd-logind[1452]: New session 6 of user core.
Apr 16 00:19:00.409019 systemd[1]: Started session-6.scope - Session 6 of User core.
Apr 16 00:19:00.494453 sudo[1675]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Apr 16 00:19:00.494758 sudo[1675]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 00:19:00.499167 sudo[1675]: pam_unix(sudo:session): session closed for user root
Apr 16 00:19:00.505890 sudo[1674]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Apr 16 00:19:00.506212 sudo[1674]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 00:19:00.528158 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Apr 16 00:19:00.530061 auditctl[1678]: No rules
Apr 16 00:19:00.530761 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 16 00:19:00.531032 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Apr 16 00:19:00.533832 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 16 00:19:00.574979 augenrules[1696]: No rules
Apr 16 00:19:00.576920 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 16 00:19:00.578469 sudo[1674]: pam_unix(sudo:session): session closed for user root
Apr 16 00:19:00.595370 sshd[1670]: pam_unix(sshd:session): session closed for user core
Apr 16 00:19:00.600300 systemd[1]: sshd@5-78.46.194.74:22-4.175.71.9:40994.service: Deactivated successfully.
Apr 16 00:19:00.602342 systemd[1]: session-6.scope: Deactivated successfully.
Apr 16 00:19:00.605392 systemd-logind[1452]: Session 6 logged out. Waiting for processes to exit.
Apr 16 00:19:00.606730 systemd-logind[1452]: Removed session 6.
Apr 16 00:19:00.633856 systemd[1]: Started sshd@6-78.46.194.74:22-4.175.71.9:41006.service - OpenSSH per-connection server daemon (4.175.71.9:41006).
Apr 16 00:19:00.761748 sshd[1704]: Accepted publickey for core from 4.175.71.9 port 41006 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:19:00.765380 sshd[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:19:00.772502 systemd-logind[1452]: New session 7 of user core.
Apr 16 00:19:00.776979 systemd[1]: Started session-7.scope - Session 7 of User core.
Apr 16 00:19:00.862960 sudo[1707]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Apr 16 00:19:00.863569 sudo[1707]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 00:19:01.169250 (dockerd)[1722]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Apr 16 00:19:01.169640 systemd[1]: Starting docker.service - Docker Application Container Engine...
Apr 16 00:19:01.425819 dockerd[1722]: time="2026-04-16T00:19:01.425651584Z" level=info msg="Starting up"
Apr 16 00:19:01.521460 dockerd[1722]: time="2026-04-16T00:19:01.521027371Z" level=info msg="Loading containers: start."
Apr 16 00:19:01.622962 kernel: Initializing XFRM netlink socket
Apr 16 00:19:01.697031 systemd-networkd[1354]: docker0: Link UP
Apr 16 00:19:01.722967 dockerd[1722]: time="2026-04-16T00:19:01.721563039Z" level=info msg="Loading containers: done."
Apr 16 00:19:01.738417 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck185304086-merged.mount: Deactivated successfully.
Apr 16 00:19:01.742083 dockerd[1722]: time="2026-04-16T00:19:01.742039670Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Apr 16 00:19:01.742306 dockerd[1722]: time="2026-04-16T00:19:01.742287492Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Apr 16 00:19:01.742496 dockerd[1722]: time="2026-04-16T00:19:01.742479228Z" level=info msg="Daemon has completed initialization"
Apr 16 00:19:01.793865 systemd[1]: Started docker.service - Docker Application Container Engine.
Apr 16 00:19:01.794196 dockerd[1722]: time="2026-04-16T00:19:01.793825646Z" level=info msg="API listen on /run/docker.sock"
Apr 16 00:19:02.251462 containerd[1477]: time="2026-04-16T00:19:02.251146618Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\""
Apr 16 00:19:02.816532 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount579667358.mount: Deactivated successfully.
Apr 16 00:19:03.609480 containerd[1477]: time="2026-04-16T00:19:03.609404121Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:03.611631 containerd[1477]: time="2026-04-16T00:19:03.610798372Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.4: active requests=0, bytes read=24608883"
Apr 16 00:19:03.612330 containerd[1477]: time="2026-04-16T00:19:03.612286070Z" level=info msg="ImageCreate event name:\"sha256:09c946ff1743c56c0d49ef90ba95500741e0534f2f590ec98c924e4673ee3096\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:03.617381 containerd[1477]: time="2026-04-16T00:19:03.617305958Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:03.618936 containerd[1477]: time="2026-04-16T00:19:03.618885580Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.4\" with image id \"sha256:09c946ff1743c56c0d49ef90ba95500741e0534f2f590ec98c924e4673ee3096\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\", size \"24605384\" in 1.367688971s"
Apr 16 00:19:03.618936 containerd[1477]: time="2026-04-16T00:19:03.618931122Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\" returns image reference \"sha256:09c946ff1743c56c0d49ef90ba95500741e0534f2f590ec98c924e4673ee3096\""
Apr 16 00:19:03.620569 containerd[1477]: time="2026-04-16T00:19:03.620349686Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\""
Apr 16 00:19:04.494651 containerd[1477]: time="2026-04-16T00:19:04.494583653Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:04.496622 containerd[1477]: time="2026-04-16T00:19:04.496554119Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.4: active requests=0, bytes read=19073314"
Apr 16 00:19:04.498396 containerd[1477]: time="2026-04-16T00:19:04.498341912Z" level=info msg="ImageCreate event name:\"sha256:95ce7d322e267614405a2a0eccfc0a1bdf5664dd9ab089bdfa9ae74d5ccb05a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:04.504151 containerd[1477]: time="2026-04-16T00:19:04.502547100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:04.504151 containerd[1477]: time="2026-04-16T00:19:04.503772578Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.4\" with image id \"sha256:95ce7d322e267614405a2a0eccfc0a1bdf5664dd9ab089bdfa9ae74d5ccb05a7\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\", size \"20579933\" in 883.381797ms"
Apr 16 00:19:04.504151 containerd[1477]: time="2026-04-16T00:19:04.503806421Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\" returns image reference \"sha256:95ce7d322e267614405a2a0eccfc0a1bdf5664dd9ab089bdfa9ae74d5ccb05a7\""
Apr 16 00:19:04.504918 containerd[1477]: time="2026-04-16T00:19:04.504894084Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\""
Apr 16 00:19:05.405653 containerd[1477]: time="2026-04-16T00:19:05.404688368Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:05.406750 containerd[1477]: time="2026-04-16T00:19:05.406679542Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.4: active requests=0, bytes read=13800856"
Apr 16 00:19:05.408501 containerd[1477]: time="2026-04-16T00:19:05.407993589Z" level=info msg="ImageCreate event name:\"sha256:77d7d4cb9aa826105b6410a50df1dda7462ec663ced995347d8c171b04b0ee81\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:05.416479 containerd[1477]: time="2026-04-16T00:19:05.416423679Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:05.417968 containerd[1477]: time="2026-04-16T00:19:05.417910372Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.4\" with image id \"sha256:77d7d4cb9aa826105b6410a50df1dda7462ec663ced995347d8c171b04b0ee81\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\", size \"15307493\" in 912.867419ms"
Apr 16 00:19:05.418123 containerd[1477]: time="2026-04-16T00:19:05.418105725Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\" returns image reference \"sha256:77d7d4cb9aa826105b6410a50df1dda7462ec663ced995347d8c171b04b0ee81\""
Apr 16 00:19:05.419054 containerd[1477]: time="2026-04-16T00:19:05.419011685Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\""
Apr 16 00:19:06.308099 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1169989337.mount: Deactivated successfully.
Apr 16 00:19:06.545185 containerd[1477]: time="2026-04-16T00:19:06.545075062Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:06.546155 containerd[1477]: time="2026-04-16T00:19:06.546123073Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.4: active requests=0, bytes read=22340610"
Apr 16 00:19:06.547099 containerd[1477]: time="2026-04-16T00:19:06.546786094Z" level=info msg="ImageCreate event name:\"sha256:8c75fb69e773da539298848d12a0a12029818ee910a62f2abd68aa1a5805991c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:06.549303 containerd[1477]: time="2026-04-16T00:19:06.549255495Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:06.550181 containerd[1477]: time="2026-04-16T00:19:06.550142846Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.4\" with image id \"sha256:8c75fb69e773da539298848d12a0a12029818ee910a62f2abd68aa1a5805991c\", repo tag \"registry.k8s.io/kube-proxy:v1.35.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\", size \"22339603\" in 1.130917709s"
Apr 16 00:19:06.550181 containerd[1477]: time="2026-04-16T00:19:06.550179087Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\" returns image reference \"sha256:8c75fb69e773da539298848d12a0a12029818ee910a62f2abd68aa1a5805991c\""
Apr 16 00:19:06.550951 containerd[1477]: time="2026-04-16T00:19:06.550886237Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\""
Apr 16 00:19:07.069575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2656991326.mount: Deactivated successfully.
Apr 16 00:19:07.903392 containerd[1477]: time="2026-04-16T00:19:07.903340973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:07.905356 containerd[1477]: time="2026-04-16T00:19:07.904985777Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=21172309"
Apr 16 00:19:07.905530 containerd[1477]: time="2026-04-16T00:19:07.905488303Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:07.909426 containerd[1477]: time="2026-04-16T00:19:07.909377699Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:07.911255 containerd[1477]: time="2026-04-16T00:19:07.911187476Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.36013145s"
Apr 16 00:19:07.911255 containerd[1477]: time="2026-04-16T00:19:07.911240772Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\""
Apr 16 00:19:07.911915 containerd[1477]: time="2026-04-16T00:19:07.911840160Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Apr 16 00:19:08.369856 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount356056904.mount: Deactivated successfully.
Apr 16 00:19:08.380779 containerd[1477]: time="2026-04-16T00:19:08.380672685Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:08.382578 containerd[1477]: time="2026-04-16T00:19:08.382516937Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729"
Apr 16 00:19:08.383563 containerd[1477]: time="2026-04-16T00:19:08.383081131Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:08.386777 containerd[1477]: time="2026-04-16T00:19:08.386706093Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:08.387660 containerd[1477]: time="2026-04-16T00:19:08.387600291Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 475.728979ms"
Apr 16 00:19:08.387781 containerd[1477]: time="2026-04-16T00:19:08.387763732Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Apr 16 00:19:08.388853 containerd[1477]: time="2026-04-16T00:19:08.388816046Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
Apr 16 00:19:08.880053 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2123236388.mount: Deactivated successfully.
Apr 16 00:19:09.495982 containerd[1477]: time="2026-04-16T00:19:09.495896539Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:09.498526 containerd[1477]: time="2026-04-16T00:19:09.498458499Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21752394"
Apr 16 00:19:09.500065 containerd[1477]: time="2026-04-16T00:19:09.499999078Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:09.503290 containerd[1477]: time="2026-04-16T00:19:09.503216482Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:09.505383 containerd[1477]: time="2026-04-16T00:19:09.505300922Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 1.116448441s"
Apr 16 00:19:09.505383 containerd[1477]: time="2026-04-16T00:19:09.505354932Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\""
Apr 16 00:19:10.374020 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Apr 16 00:19:10.383649 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:19:10.507774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:10.513938 (kubelet)[2084]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 16 00:19:10.557262 kubelet[2084]: E0416 00:19:10.553563 2084 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 16 00:19:10.556856 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 16 00:19:10.556984 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 16 00:19:11.887878 update_engine[1453]: I20260416 00:19:11.887640 1453 update_attempter.cc:509] Updating boot flags...
Apr 16 00:19:11.951665 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (2100)
Apr 16 00:19:12.052786 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (2104)
Apr 16 00:19:12.121684 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (2104)
Apr 16 00:19:12.680118 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:12.689520 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:19:12.735679 systemd[1]: Reloading requested from client PID 2119 ('systemctl') (unit session-7.scope)...
Apr 16 00:19:12.735699 systemd[1]: Reloading...
Apr 16 00:19:12.866642 zram_generator::config[2159]: No configuration found.
Apr 16 00:19:12.980569 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:19:13.059703 systemd[1]: Reloading finished in 323 ms.
Apr 16 00:19:13.117983 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:13.121762 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:19:13.125262 systemd[1]: kubelet.service: Deactivated successfully.
Apr 16 00:19:13.125549 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:13.136226 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:19:13.259872 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:13.275318 (kubelet)[2209]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 16 00:19:13.323287 kubelet[2209]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 00:19:14.414898 kubelet[2209]: I0416 00:19:14.414682 2209 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Apr 16 00:19:14.414898 kubelet[2209]: I0416 00:19:14.414789 2209 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 00:19:14.417382 kubelet[2209]: I0416 00:19:14.416813 2209 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Apr 16 00:19:14.417382 kubelet[2209]: I0416 00:19:14.416845 2209 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 00:19:14.417382 kubelet[2209]: I0416 00:19:14.417255 2209 server.go:951] "Client rotation is on, will bootstrap in background"
Apr 16 00:19:14.426120 kubelet[2209]: I0416 00:19:14.426004 2209 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 16 00:19:14.427156 kubelet[2209]: E0416 00:19:14.426649 2209 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://78.46.194.74:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 78.46.194.74:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Apr 16 00:19:14.430274 kubelet[2209]: E0416 00:19:14.430212 2209 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Apr 16 00:19:14.430391 kubelet[2209]: I0416 00:19:14.430300 2209 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Apr 16 00:19:14.433031 kubelet[2209]: I0416 00:19:14.433005 2209 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Apr 16 00:19:14.433325 kubelet[2209]: I0416 00:19:14.433292 2209 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 00:19:14.433487 kubelet[2209]: I0416 00:19:14.433326 2209 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-42941c021f","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 00:19:14.433589 kubelet[2209]: I0416 00:19:14.433494 2209 topology_manager.go:143] "Creating topology manager with none policy"
Apr 16 00:19:14.433589 kubelet[2209]: I0416 00:19:14.433503 2209 container_manager_linux.go:308] "Creating device plugin manager"
Apr 16 00:19:14.433668 kubelet[2209]: I0416 00:19:14.433636 2209 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Apr 16 00:19:14.437089 kubelet[2209]: I0416 00:19:14.437026 2209 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Apr 16 00:19:14.437416 kubelet[2209]: I0416 00:19:14.437314 2209 kubelet.go:482] "Attempting to sync node with API server"
Apr 16 00:19:14.437416 kubelet[2209]: I0416 00:19:14.437334 2209 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 00:19:14.437416 kubelet[2209]: I0416 00:19:14.437356 2209 kubelet.go:394] "Adding apiserver pod source"
Apr 16 00:19:14.437416 kubelet[2209]: I0416 00:19:14.437367 2209 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 00:19:14.442023 kubelet[2209]: I0416 00:19:14.441993 2209 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Apr 16 00:19:14.443354 kubelet[2209]: I0416 00:19:14.443323 2209 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 00:19:14.443431 kubelet[2209]: I0416 00:19:14.443368 2209 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Apr 16 00:19:14.443431 kubelet[2209]: W0416 00:19:14.443418 2209 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Apr 16 00:19:14.446443 kubelet[2209]: I0416 00:19:14.446408 2209 server.go:1257] "Started kubelet"
Apr 16 00:19:14.450838 kubelet[2209]: I0416 00:19:14.450787 2209 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Apr 16 00:19:14.458534 kubelet[2209]: I0416 00:19:14.458467 2209 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 00:19:14.459402 kubelet[2209]: E0416 00:19:14.457824 2209 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://78.46.194.74:6443/api/v1/namespaces/default/events\": dial tcp 78.46.194.74:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-42941c021f.18a6ae4e51becaee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-42941c021f,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-42941c021f,},FirstTimestamp:2026-04-16 00:19:14.446379758 +0000 UTC m=+1.165408385,LastTimestamp:2026-04-16 00:19:14.446379758 +0000 UTC m=+1.165408385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-42941c021f,}"
Apr 16 00:19:14.460296 kubelet[2209]: I0416 00:19:14.460221 2209 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 00:19:14.460376 kubelet[2209]: I0416 00:19:14.460323 2209 server_v1.go:49] "podresources" method="list" useActivePods=true
Apr 16 00:19:14.463908 kubelet[2209]: I0416 00:19:14.463873 2209 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 16 00:19:14.464398 kubelet[2209]: I0416 00:19:14.464357 2209 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 00:19:14.465937 kubelet[2209]: I0416 00:19:14.465765 2209 volume_manager.go:311] "Starting Kubelet Volume Manager"
Apr 16 00:19:14.466294 kubelet[2209]: E0416 00:19:14.466275 2209 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:14.467186 kubelet[2209]: I0416 00:19:14.467168 2209 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Apr 16 00:19:14.467351 kubelet[2209]: I0416 00:19:14.467340 2209 reconciler.go:29] "Reconciler: start to sync state"
Apr 16 00:19:14.467562 kubelet[2209]: I0416 00:19:14.467480 2209 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 00:19:14.469129 kubelet[2209]: I0416 00:19:14.468941 2209 factory.go:223] Registration of the systemd container factory successfully
Apr 16 00:19:14.469129 kubelet[2209]: I0416 00:19:14.469049 2209 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 16 00:19:14.469846 kubelet[2209]: E0416 00:19:14.469052 2209 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.194.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-42941c021f?timeout=10s\": dial tcp 78.46.194.74:6443: connect: connection refused" interval="200ms"
Apr 16 00:19:14.470925 kubelet[2209]: E0416 00:19:14.470888 2209 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Apr 16 00:19:14.471351 kubelet[2209]: I0416 00:19:14.471323 2209 factory.go:223] Registration of the containerd container factory successfully
Apr 16 00:19:14.484256 kubelet[2209]: I0416 00:19:14.484194 2209 cpu_manager.go:225] "Starting" policy="none"
Apr 16 00:19:14.484435 kubelet[2209]: I0416 00:19:14.484422 2209 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Apr 16 00:19:14.484521 kubelet[2209]: I0416 00:19:14.484511 2209 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Apr 16 00:19:14.487712 kubelet[2209]: I0416 00:19:14.487666 2209 policy_none.go:50] "Start"
Apr 16 00:19:14.487712 kubelet[2209]: I0416 00:19:14.487706 2209 memory_manager.go:187] "Starting memorymanager" policy="None"
Apr 16 00:19:14.487712 kubelet[2209]: I0416 00:19:14.487720 2209 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Apr 16 00:19:14.490256 kubelet[2209]: I0416 00:19:14.489637 2209 policy_none.go:44] "Start"
Apr 16 00:19:14.494430 kubelet[2209]: I0416 00:19:14.494383 2209 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Apr 16 00:19:14.495422 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Apr 16 00:19:14.498455 kubelet[2209]: I0416 00:19:14.498401 2209 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Apr 16 00:19:14.498634 kubelet[2209]: I0416 00:19:14.498623 2209 status_manager.go:249] "Starting to sync pod status with apiserver"
Apr 16 00:19:14.499273 kubelet[2209]: I0416 00:19:14.499240 2209 kubelet.go:2501] "Starting kubelet main sync loop"
Apr 16 00:19:14.499444 kubelet[2209]: E0416 00:19:14.499314 2209 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 16 00:19:14.511332 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Apr 16 00:19:14.515659 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Apr 16 00:19:14.527187 kubelet[2209]: E0416 00:19:14.527149 2209 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 00:19:14.527447 kubelet[2209]: I0416 00:19:14.527430 2209 eviction_manager.go:194] "Eviction manager: starting control loop"
Apr 16 00:19:14.527484 kubelet[2209]: I0416 00:19:14.527450 2209 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 00:19:14.528011 kubelet[2209]: I0416 00:19:14.527862 2209 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Apr 16 00:19:14.530550 kubelet[2209]: E0416 00:19:14.530365 2209 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Apr 16 00:19:14.530550 kubelet[2209]: E0416 00:19:14.530410 2209 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:14.615246 systemd[1]: Created slice kubepods-burstable-pod9f438274a8734294643326ce7c1f08d0.slice - libcontainer container kubepods-burstable-pod9f438274a8734294643326ce7c1f08d0.slice.
Apr 16 00:19:14.630520 kubelet[2209]: I0416 00:19:14.630473 2209 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.632754 kubelet[2209]: E0416 00:19:14.631907 2209 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://78.46.194.74:6443/api/v1/nodes\": dial tcp 78.46.194.74:6443: connect: connection refused" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.632754 kubelet[2209]: E0416 00:19:14.632107 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.634656 systemd[1]: Created slice kubepods-burstable-pod39a56de86cb9b07fd25b0e6c91fb0bde.slice - libcontainer container kubepods-burstable-pod39a56de86cb9b07fd25b0e6c91fb0bde.slice.
Apr 16 00:19:14.649145 kubelet[2209]: E0416 00:19:14.649104 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.652970 systemd[1]: Created slice kubepods-burstable-pod6ab78f6094756e520d51c8bf9343dea4.slice - libcontainer container kubepods-burstable-pod6ab78f6094756e520d51c8bf9343dea4.slice.
Apr 16 00:19:14.655242 kubelet[2209]: E0416 00:19:14.655174 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.671468 kubelet[2209]: E0416 00:19:14.671282 2209 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.194.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-42941c021f?timeout=10s\": dial tcp 78.46.194.74:6443: connect: connection refused" interval="400ms"
Apr 16 00:19:14.769008 kubelet[2209]: I0416 00:19:14.768941 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9f438274a8734294643326ce7c1f08d0-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-42941c021f\" (UID: \"9f438274a8734294643326ce7c1f08d0\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769008 kubelet[2209]: I0416 00:19:14.768996 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9f438274a8734294643326ce7c1f08d0-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-42941c021f\" (UID: \"9f438274a8734294643326ce7c1f08d0\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769008 kubelet[2209]: I0416 00:19:14.769020 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769311 kubelet[2209]: I0416 00:19:14.769047 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769311 kubelet[2209]: I0416 00:19:14.769070 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9f438274a8734294643326ce7c1f08d0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-42941c021f\" (UID: \"9f438274a8734294643326ce7c1f08d0\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769311 kubelet[2209]: I0416 00:19:14.769094 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769311 kubelet[2209]: I0416 00:19:14.769114 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769311 kubelet[2209]: I0416 00:19:14.769135 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.769466 kubelet[2209]: I0416 00:19:14.769157 2209 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6ab78f6094756e520d51c8bf9343dea4-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-42941c021f\" (UID: \"6ab78f6094756e520d51c8bf9343dea4\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.835353 kubelet[2209]: I0416 00:19:14.835033 2209 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.835579 kubelet[2209]: E0416 00:19:14.835460 2209 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://78.46.194.74:6443/api/v1/nodes\": dial tcp 78.46.194.74:6443: connect: connection refused" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:14.937792 containerd[1477]: time="2026-04-16T00:19:14.937356946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-42941c021f,Uid:9f438274a8734294643326ce7c1f08d0,Namespace:kube-system,Attempt:0,}"
Apr 16 00:19:14.953387 containerd[1477]: time="2026-04-16T00:19:14.952991018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-42941c021f,Uid:39a56de86cb9b07fd25b0e6c91fb0bde,Namespace:kube-system,Attempt:0,}"
Apr 16 00:19:14.958444 containerd[1477]: time="2026-04-16T00:19:14.958382816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-42941c021f,Uid:6ab78f6094756e520d51c8bf9343dea4,Namespace:kube-system,Attempt:0,}"
Apr 16 00:19:15.073013 kubelet[2209]: E0416 00:19:15.072971 2209 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.194.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-42941c021f?timeout=10s\": dial tcp 78.46.194.74:6443: connect: connection refused" interval="800ms"
Apr 16 00:19:15.238824 kubelet[2209]: I0416 00:19:15.238667 2209 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:15.239583 kubelet[2209]: E0416 00:19:15.239530 2209 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://78.46.194.74:6443/api/v1/nodes\": dial tcp 78.46.194.74:6443: connect: connection refused" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:15.370322 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2893157020.mount: Deactivated successfully.
Apr 16 00:19:15.380641 containerd[1477]: time="2026-04-16T00:19:15.378993598Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 16 00:19:15.380641 containerd[1477]: time="2026-04-16T00:19:15.380219965Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 16 00:19:15.381570 containerd[1477]: time="2026-04-16T00:19:15.381538590Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Apr 16 00:19:15.381969 containerd[1477]: time="2026-04-16T00:19:15.381942963Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Apr 16 00:19:15.382639 containerd[1477]: time="2026-04-16T00:19:15.382592409Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 16 00:19:15.383511 containerd[1477]: time="2026-04-16T00:19:15.383483887Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 16 00:19:15.384273 containerd[1477]: time="2026-04-16T00:19:15.384247004Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193"
Apr 16 00:19:15.387947 containerd[1477]: time="2026-04-16T00:19:15.387904292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 16 00:19:15.388815 containerd[1477]: time="2026-04-16T00:19:15.388773956Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 430.109232ms"
Apr 16 00:19:15.392735 containerd[1477]: time="2026-04-16T00:19:15.392692368Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 455.224069ms"
Apr 16 00:19:15.394129 containerd[1477]: time="2026-04-16T00:19:15.394083958Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 440.988792ms"
Apr 16 00:19:15.516268 containerd[1477]: time="2026-04-16T00:19:15.516084444Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 16 00:19:15.516268 containerd[1477]: time="2026-04-16T00:19:15.516161892Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 16 00:19:15.517037 containerd[1477]: time="2026-04-16T00:19:15.516176942Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 16 00:19:15.517037 containerd[1477]: time="2026-04-16T00:19:15.516572749Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 16 00:19:15.519660 containerd[1477]: time="2026-04-16T00:19:15.519505744Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 16 00:19:15.519660 containerd[1477]: time="2026-04-16T00:19:15.519578750Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 16 00:19:15.520396 containerd[1477]: time="2026-04-16T00:19:15.520327458Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 16 00:19:15.521166 containerd[1477]: time="2026-04-16T00:19:15.520622963Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 16 00:19:15.522322 containerd[1477]: time="2026-04-16T00:19:15.522059982Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 16 00:19:15.523757 containerd[1477]: time="2026-04-16T00:19:15.522789519Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 16 00:19:15.523757 containerd[1477]: time="2026-04-16T00:19:15.522812773Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 16 00:19:15.523757 containerd[1477]: time="2026-04-16T00:19:15.522909994Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 16 00:19:15.551887 systemd[1]: Started cri-containerd-0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9.scope - libcontainer container 0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9.
Apr 16 00:19:15.558380 systemd[1]: Started cri-containerd-3be1998b36ff46399c8abd9ee0d985b5160822a01d6c5e3a791d92415572e93b.scope - libcontainer container 3be1998b36ff46399c8abd9ee0d985b5160822a01d6c5e3a791d92415572e93b.
Apr 16 00:19:15.560319 systemd[1]: Started cri-containerd-6bc6f8d66eee8e7b752b2b5fe6ac8f21fe8a254eadfa4d93d10758dc2f6e4a66.scope - libcontainer container 6bc6f8d66eee8e7b752b2b5fe6ac8f21fe8a254eadfa4d93d10758dc2f6e4a66.
Apr 16 00:19:15.615060 containerd[1477]: time="2026-04-16T00:19:15.614996125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-42941c021f,Uid:9f438274a8734294643326ce7c1f08d0,Namespace:kube-system,Attempt:0,} returns sandbox id \"3be1998b36ff46399c8abd9ee0d985b5160822a01d6c5e3a791d92415572e93b\""
Apr 16 00:19:15.626334 containerd[1477]: time="2026-04-16T00:19:15.626294673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-42941c021f,Uid:39a56de86cb9b07fd25b0e6c91fb0bde,Namespace:kube-system,Attempt:0,} returns sandbox id \"0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9\""
Apr 16 00:19:15.627187 containerd[1477]: time="2026-04-16T00:19:15.627065436Z" level=info msg="CreateContainer within sandbox \"3be1998b36ff46399c8abd9ee0d985b5160822a01d6c5e3a791d92415572e93b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Apr 16 00:19:15.632790 containerd[1477]: time="2026-04-16T00:19:15.632743628Z" level=info msg="CreateContainer within sandbox \"0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Apr 16 00:19:15.637821 containerd[1477]: time="2026-04-16T00:19:15.637073016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-42941c021f,Uid:6ab78f6094756e520d51c8bf9343dea4,Namespace:kube-system,Attempt:0,} returns sandbox id \"6bc6f8d66eee8e7b752b2b5fe6ac8f21fe8a254eadfa4d93d10758dc2f6e4a66\""
Apr 16 00:19:15.645048 containerd[1477]: time="2026-04-16T00:19:15.644959831Z" level=info msg="CreateContainer within sandbox \"6bc6f8d66eee8e7b752b2b5fe6ac8f21fe8a254eadfa4d93d10758dc2f6e4a66\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Apr 16 00:19:15.649135 containerd[1477]: time="2026-04-16T00:19:15.649077126Z" level=info msg="CreateContainer within sandbox \"3be1998b36ff46399c8abd9ee0d985b5160822a01d6c5e3a791d92415572e93b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"583cb4156b696af27a011c67ab1d8f5a132490876ca743212a0cd892c5638f64\""
Apr 16 00:19:15.649899 containerd[1477]: time="2026-04-16T00:19:15.649834840Z" level=info msg="StartContainer for \"583cb4156b696af27a011c67ab1d8f5a132490876ca743212a0cd892c5638f64\""
Apr 16 00:19:15.664409 containerd[1477]: time="2026-04-16T00:19:15.664189861Z" level=info msg="CreateContainer within sandbox \"0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c\""
Apr 16 00:19:15.665510 containerd[1477]: time="2026-04-16T00:19:15.665459175Z" level=info msg="StartContainer for \"0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c\""
Apr 16 00:19:15.668132 containerd[1477]: time="2026-04-16T00:19:15.668019457Z" level=info msg="CreateContainer within sandbox \"6bc6f8d66eee8e7b752b2b5fe6ac8f21fe8a254eadfa4d93d10758dc2f6e4a66\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"939eb6bb5dd52d13aa848296b88525dafadbfda72be71abcb7c6175538bafc6a\""
Apr 16 00:19:15.668587 containerd[1477]: time="2026-04-16T00:19:15.668554552Z" level=info msg="StartContainer for \"939eb6bb5dd52d13aa848296b88525dafadbfda72be71abcb7c6175538bafc6a\""
Apr 16 00:19:15.686259 systemd[1]: Started cri-containerd-583cb4156b696af27a011c67ab1d8f5a132490876ca743212a0cd892c5638f64.scope - libcontainer container 583cb4156b696af27a011c67ab1d8f5a132490876ca743212a0cd892c5638f64.
Apr 16 00:19:15.704937 systemd[1]: Started cri-containerd-939eb6bb5dd52d13aa848296b88525dafadbfda72be71abcb7c6175538bafc6a.scope - libcontainer container 939eb6bb5dd52d13aa848296b88525dafadbfda72be71abcb7c6175538bafc6a.
Apr 16 00:19:15.719382 systemd[1]: Started cri-containerd-0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c.scope - libcontainer container 0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c.
Apr 16 00:19:15.748146 containerd[1477]: time="2026-04-16T00:19:15.748104720Z" level=info msg="StartContainer for \"583cb4156b696af27a011c67ab1d8f5a132490876ca743212a0cd892c5638f64\" returns successfully"
Apr 16 00:19:15.792891 containerd[1477]: time="2026-04-16T00:19:15.792748169Z" level=info msg="StartContainer for \"939eb6bb5dd52d13aa848296b88525dafadbfda72be71abcb7c6175538bafc6a\" returns successfully"
Apr 16 00:19:15.801673 containerd[1477]: time="2026-04-16T00:19:15.800909035Z" level=info msg="StartContainer for \"0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c\" returns successfully"
Apr 16 00:19:16.044105 kubelet[2209]: I0416 00:19:16.042881 2209 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:16.514684 kubelet[2209]: E0416 00:19:16.514042 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:16.517908 kubelet[2209]: E0416 00:19:16.517679 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:16.524843 kubelet[2209]: E0416 00:19:16.524631 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:17.525754 kubelet[2209]: E0416 00:19:17.525705 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:17.526539 kubelet[2209]: E0416 00:19:17.526496 2209 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:17.553889 kubelet[2209]: E0416 00:19:17.553831 2209 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-n-42941c021f\" not found" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:17.605330 kubelet[2209]: I0416 00:19:17.605011 2209 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-42941c021f"
Apr 16 00:19:17.605330 kubelet[2209]: E0416 00:19:17.605079 2209 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-n-42941c021f\": node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:17.620073 kubelet[2209]: E0416 00:19:17.620008 2209 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:17.720402 kubelet[2209]: E0416 00:19:17.720332 2209 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:17.821699 kubelet[2209]: E0416 00:19:17.821520 2209 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:17.922364 kubelet[2209]: E0416 00:19:17.922259 2209 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:18.022932 kubelet[2209]: E0416 00:19:18.022819 2209 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:18.067995 kubelet[2209]: I0416 00:19:18.067903 2209 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.075887 kubelet[2209]: E0416 00:19:18.075743 2209 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-42941c021f\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.075887 kubelet[2209]: I0416 00:19:18.075793 2209 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.078845 kubelet[2209]: E0416 00:19:18.078778 2209 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-42941c021f\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.078845 kubelet[2209]: I0416 00:19:18.078827 2209 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.081747 kubelet[2209]: E0416 00:19:18.081703 2209 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.441618 kubelet[2209]: I0416 00:19:18.441537 2209 apiserver.go:52] "Watching apiserver"
Apr 16 00:19:18.468493 kubelet[2209]: I0416 00:19:18.468449 2209 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Apr 16 00:19:18.524258 kubelet[2209]: I0416 00:19:18.524212 2209 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:18.524258 kubelet[2209]: I0416 00:19:18.524270 2209 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:19.188534 kubelet[2209]: I0416 00:19:19.188499 2209 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f"
Apr 16 00:19:19.726926 systemd[1]: Reloading requested from client PID 2491 ('systemctl') (unit session-7.scope)...
Apr 16 00:19:19.726948 systemd[1]: Reloading...
Apr 16 00:19:19.833681 zram_generator::config[2537]: No configuration found.
Apr 16 00:19:19.930072 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 16 00:19:20.025150 systemd[1]: Reloading finished in 297 ms.
Apr 16 00:19:20.066914 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:19:20.082338 systemd[1]: kubelet.service: Deactivated successfully.
Apr 16 00:19:20.082770 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:20.082853 systemd[1]: kubelet.service: Consumed 1.588s CPU time, 125.2M memory peak, 0B memory swap peak.
Apr 16 00:19:20.091105 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 00:19:20.227162 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 00:19:20.237451 (kubelet)[2576]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 16 00:19:20.289692 kubelet[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 00:19:20.300111 kubelet[2576]: I0416 00:19:20.298814 2576 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Apr 16 00:19:20.300111 kubelet[2576]: I0416 00:19:20.298879 2576 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 00:19:20.300111 kubelet[2576]: I0416 00:19:20.298905 2576 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Apr 16 00:19:20.300111 kubelet[2576]: I0416 00:19:20.298911 2576 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 00:19:20.300111 kubelet[2576]: I0416 00:19:20.299411 2576 server.go:951] "Client rotation is on, will bootstrap in background"
Apr 16 00:19:20.303448 kubelet[2576]: I0416 00:19:20.303411 2576 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Apr 16 00:19:20.306986 kubelet[2576]: I0416 00:19:20.306937 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 16 00:19:20.311086 kubelet[2576]: E0416 00:19:20.311051 2576 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Apr 16 00:19:20.311631 kubelet[2576]: I0416 00:19:20.311251 2576 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Apr 16 00:19:20.314042 kubelet[2576]: I0416 00:19:20.314011 2576 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Apr 16 00:19:20.314345 kubelet[2576]: I0416 00:19:20.314275 2576 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 00:19:20.314652 kubelet[2576]: I0416 00:19:20.314352 2576 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-42941c021f","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 00:19:20.314744 kubelet[2576]: I0416 00:19:20.314668 2576 topology_manager.go:143] "Creating topology manager with none policy"
Apr 16 00:19:20.314744 kubelet[2576]: I0416 00:19:20.314685 2576 container_manager_linux.go:308] "Creating device plugin manager"
Apr 16 00:19:20.314744 kubelet[2576]: I0416 00:19:20.314724 2576 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Apr 16 00:19:20.315057 kubelet[2576]: I0416 00:19:20.315040 2576 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Apr 16 00:19:20.315362 kubelet[2576]: I0416 00:19:20.315292 2576 kubelet.go:482] "Attempting to sync node with API server"
Apr 16 00:19:20.315362 kubelet[2576]: I0416 00:19:20.315342 2576 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 00:19:20.315427 kubelet[2576]: I0416 00:19:20.315392 2576 kubelet.go:394] "Adding apiserver pod source"
Apr 16 00:19:20.315427 kubelet[2576]: I0416 00:19:20.315409 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 00:19:20.319718 kubelet[2576]: I0416 00:19:20.319672 2576 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Apr 16 00:19:20.320676 kubelet[2576]: I0416 00:19:20.320598 2576 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 00:19:20.320746 kubelet[2576]: I0416 00:19:20.320688 2576 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Apr 16 00:19:20.326384 kubelet[2576]: I0416 00:19:20.326340 2576 server.go:1257] "Started kubelet"
Apr 16 00:19:20.327634 kubelet[2576]: I0416 00:19:20.326760 2576 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 00:19:20.327858 kubelet[2576]: I0416 00:19:20.327843 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 00:19:20.339844 kubelet[2576]: I0416 00:19:20.339812 2576 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Apr 16 00:19:20.341594 kubelet[2576]: I0416 00:19:20.328047 2576 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 00:19:20.341757 kubelet[2576]: I0416 00:19:20.341630 2576 server_v1.go:49] "podresources" method="list" useActivePods=true
Apr 16 00:19:20.341892 kubelet[2576]: I0416 00:19:20.341810 2576 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 00:19:20.350352 kubelet[2576]: I0416 00:19:20.350308 2576 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 16 00:19:20.362166 kubelet[2576]: E0416 00:19:20.352595 2576 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-42941c021f\" not found"
Apr 16 00:19:20.363701 kubelet[2576]: I0416 00:19:20.352394 2576 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Apr 16 00:19:20.364637 kubelet[2576]: I0416 00:19:20.352373 2576 volume_manager.go:311] "Starting Kubelet Volume Manager"
Apr 16 00:19:20.364637 kubelet[2576]: I0416 00:19:20.364019 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 16 00:19:20.364637 kubelet[2576]: I0416 00:19:20.364117 2576 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 16 00:19:20.364637 kubelet[2576]: I0416 00:19:20.364501 2576 reconciler.go:29] "Reconciler: start to sync state"
Apr 16 00:19:20.373649 kubelet[2576]: I0416 00:19:20.372166 2576 factory.go:223] Registration of the containerd container factory successfully
Apr 16 00:19:20.377095 kubelet[2576]: I0416 00:19:20.377047 2576 kubelet_network_linux.go:54] "Initialized iptables rules."
protocol="IPv4" Apr 16 00:19:20.379019 kubelet[2576]: I0416 00:19:20.378928 2576 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Apr 16 00:19:20.379019 kubelet[2576]: I0416 00:19:20.378972 2576 status_manager.go:249] "Starting to sync pod status with apiserver" Apr 16 00:19:20.379019 kubelet[2576]: I0416 00:19:20.378995 2576 kubelet.go:2501] "Starting kubelet main sync loop" Apr 16 00:19:20.379191 kubelet[2576]: E0416 00:19:20.379041 2576 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 16 00:19:20.431739 kubelet[2576]: I0416 00:19:20.431708 2576 cpu_manager.go:225] "Starting" policy="none" Apr 16 00:19:20.431739 kubelet[2576]: I0416 00:19:20.431730 2576 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Apr 16 00:19:20.431895 kubelet[2576]: I0416 00:19:20.431755 2576 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Apr 16 00:19:20.431921 kubelet[2576]: I0416 00:19:20.431896 2576 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet="" Apr 16 00:19:20.431921 kubelet[2576]: I0416 00:19:20.431906 2576 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={} Apr 16 00:19:20.431962 kubelet[2576]: I0416 00:19:20.431924 2576 policy_none.go:50] "Start" Apr 16 00:19:20.431962 kubelet[2576]: I0416 00:19:20.431931 2576 memory_manager.go:187] "Starting memorymanager" policy="None" Apr 16 00:19:20.431962 kubelet[2576]: I0416 00:19:20.431939 2576 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Apr 16 00:19:20.432058 kubelet[2576]: I0416 00:19:20.432035 2576 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Apr 16 00:19:20.432058 kubelet[2576]: I0416 00:19:20.432057 2576 
policy_none.go:44] "Start" Apr 16 00:19:20.436552 kubelet[2576]: E0416 00:19:20.436506 2576 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 00:19:20.436754 kubelet[2576]: I0416 00:19:20.436713 2576 eviction_manager.go:194] "Eviction manager: starting control loop" Apr 16 00:19:20.436754 kubelet[2576]: I0416 00:19:20.436731 2576 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 00:19:20.437525 kubelet[2576]: I0416 00:19:20.437481 2576 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Apr 16 00:19:20.441587 kubelet[2576]: E0416 00:19:20.441556 2576 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 16 00:19:20.482415 kubelet[2576]: I0416 00:19:20.480794 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.482415 kubelet[2576]: I0416 00:19:20.480921 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.483635 kubelet[2576]: I0416 00:19:20.482925 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.489481 kubelet[2576]: E0416 00:19:20.489427 2576 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-42941c021f\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.490520 kubelet[2576]: E0416 00:19:20.490378 2576 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-42941c021f\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.491148 kubelet[2576]: E0416 00:19:20.491077 2576 kubelet.go:3342] "Failed creating a mirror 
pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.541757 kubelet[2576]: I0416 00:19:20.540981 2576 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.555457 kubelet[2576]: I0416 00:19:20.555007 2576 kubelet_node_status.go:123] "Node was previously registered" node="ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.555457 kubelet[2576]: I0416 00:19:20.555119 2576 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566092 kubelet[2576]: I0416 00:19:20.565764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566092 kubelet[2576]: I0416 00:19:20.565802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6ab78f6094756e520d51c8bf9343dea4-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-42941c021f\" (UID: \"6ab78f6094756e520d51c8bf9343dea4\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566092 kubelet[2576]: I0416 00:19:20.565821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9f438274a8734294643326ce7c1f08d0-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-42941c021f\" (UID: \"9f438274a8734294643326ce7c1f08d0\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566092 kubelet[2576]: I0416 00:19:20.565836 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566092 kubelet[2576]: I0416 00:19:20.565858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566393 kubelet[2576]: I0416 00:19:20.565874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9f438274a8734294643326ce7c1f08d0-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-42941c021f\" (UID: \"9f438274a8734294643326ce7c1f08d0\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566393 kubelet[2576]: I0416 00:19:20.565892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9f438274a8734294643326ce7c1f08d0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-42941c021f\" (UID: \"9f438274a8734294643326ce7c1f08d0\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566393 kubelet[2576]: I0416 00:19:20.565907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " 
pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:20.566393 kubelet[2576]: I0416 00:19:20.565952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/39a56de86cb9b07fd25b0e6c91fb0bde-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-42941c021f\" (UID: \"39a56de86cb9b07fd25b0e6c91fb0bde\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" Apr 16 00:19:21.317456 kubelet[2576]: I0416 00:19:21.317142 2576 apiserver.go:52] "Watching apiserver" Apr 16 00:19:21.364539 kubelet[2576]: I0416 00:19:21.364464 2576 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 16 00:19:21.417091 kubelet[2576]: I0416 00:19:21.415169 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:21.427621 kubelet[2576]: E0416 00:19:21.427574 2576 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-42941c021f\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" Apr 16 00:19:21.494383 kubelet[2576]: I0416 00:19:21.494321 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-42941c021f" podStartSLOduration=3.494287995 podStartE2EDuration="3.494287995s" podCreationTimestamp="2026-04-16 00:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:19:21.468109953 +0000 UTC m=+1.225983718" watchObservedRunningTime="2026-04-16 00:19:21.494287995 +0000 UTC m=+1.252161760" Apr 16 00:19:21.494951 kubelet[2576]: I0416 00:19:21.494769 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-42941c021f" podStartSLOduration=3.494761876 
podStartE2EDuration="3.494761876s" podCreationTimestamp="2026-04-16 00:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:19:21.494705012 +0000 UTC m=+1.252578777" watchObservedRunningTime="2026-04-16 00:19:21.494761876 +0000 UTC m=+1.252635641" Apr 16 00:19:21.851914 kubelet[2576]: I0416 00:19:21.851832 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-42941c021f" podStartSLOduration=2.851811644 podStartE2EDuration="2.851811644s" podCreationTimestamp="2026-04-16 00:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:19:21.513128639 +0000 UTC m=+1.271002404" watchObservedRunningTime="2026-04-16 00:19:21.851811644 +0000 UTC m=+1.609685449" Apr 16 00:19:26.635713 kubelet[2576]: I0416 00:19:26.635330 2576 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Apr 16 00:19:26.636754 containerd[1477]: time="2026-04-16T00:19:26.636456699Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Apr 16 00:19:26.637176 kubelet[2576]: I0416 00:19:26.636757 2576 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Apr 16 00:19:27.798707 systemd[1]: Created slice kubepods-besteffort-pod63d94dd3_eddd_4551_b72e_6d1408abfc0a.slice - libcontainer container kubepods-besteffort-pod63d94dd3_eddd_4551_b72e_6d1408abfc0a.slice. 
Apr 16 00:19:27.816308 kubelet[2576]: I0416 00:19:27.815787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/63d94dd3-eddd-4551-b72e-6d1408abfc0a-xtables-lock\") pod \"kube-proxy-ntwsw\" (UID: \"63d94dd3-eddd-4551-b72e-6d1408abfc0a\") " pod="kube-system/kube-proxy-ntwsw" Apr 16 00:19:27.816308 kubelet[2576]: I0416 00:19:27.815869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63d94dd3-eddd-4551-b72e-6d1408abfc0a-lib-modules\") pod \"kube-proxy-ntwsw\" (UID: \"63d94dd3-eddd-4551-b72e-6d1408abfc0a\") " pod="kube-system/kube-proxy-ntwsw" Apr 16 00:19:27.816308 kubelet[2576]: I0416 00:19:27.815901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/63d94dd3-eddd-4551-b72e-6d1408abfc0a-kube-proxy\") pod \"kube-proxy-ntwsw\" (UID: \"63d94dd3-eddd-4551-b72e-6d1408abfc0a\") " pod="kube-system/kube-proxy-ntwsw" Apr 16 00:19:27.816308 kubelet[2576]: I0416 00:19:27.815933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mkt\" (UniqueName: \"kubernetes.io/projected/63d94dd3-eddd-4551-b72e-6d1408abfc0a-kube-api-access-99mkt\") pod \"kube-proxy-ntwsw\" (UID: \"63d94dd3-eddd-4551-b72e-6d1408abfc0a\") " pod="kube-system/kube-proxy-ntwsw" Apr 16 00:19:27.916048 systemd[1]: Created slice kubepods-besteffort-pod6f175294_c028_4b99_ac19_f259bb1556cf.slice - libcontainer container kubepods-besteffort-pod6f175294_c028_4b99_ac19_f259bb1556cf.slice. 
Apr 16 00:19:28.017849 kubelet[2576]: I0416 00:19:28.017727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6f175294-c028-4b99-ac19-f259bb1556cf-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-5djhg\" (UID: \"6f175294-c028-4b99-ac19-f259bb1556cf\") " pod="tigera-operator/tigera-operator-6cf4cccc57-5djhg" Apr 16 00:19:28.017849 kubelet[2576]: I0416 00:19:28.017809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5qm\" (UniqueName: \"kubernetes.io/projected/6f175294-c028-4b99-ac19-f259bb1556cf-kube-api-access-hm5qm\") pod \"tigera-operator-6cf4cccc57-5djhg\" (UID: \"6f175294-c028-4b99-ac19-f259bb1556cf\") " pod="tigera-operator/tigera-operator-6cf4cccc57-5djhg" Apr 16 00:19:28.113792 containerd[1477]: time="2026-04-16T00:19:28.113294480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ntwsw,Uid:63d94dd3-eddd-4551-b72e-6d1408abfc0a,Namespace:kube-system,Attempt:0,}" Apr 16 00:19:28.153642 containerd[1477]: time="2026-04-16T00:19:28.153516678Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:19:28.153642 containerd[1477]: time="2026-04-16T00:19:28.153595460Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:19:28.153861 containerd[1477]: time="2026-04-16T00:19:28.153826962Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:28.154166 containerd[1477]: time="2026-04-16T00:19:28.154063346Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:28.189985 systemd[1]: Started cri-containerd-d1a5b88941541fddf546e13b3a2397adfcf8d11cc1c5dd9cbc72c475ffb11a5e.scope - libcontainer container d1a5b88941541fddf546e13b3a2397adfcf8d11cc1c5dd9cbc72c475ffb11a5e. Apr 16 00:19:28.224695 containerd[1477]: time="2026-04-16T00:19:28.224643515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-5djhg,Uid:6f175294-c028-4b99-ac19-f259bb1556cf,Namespace:tigera-operator,Attempt:0,}" Apr 16 00:19:28.229545 containerd[1477]: time="2026-04-16T00:19:28.229459337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ntwsw,Uid:63d94dd3-eddd-4551-b72e-6d1408abfc0a,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1a5b88941541fddf546e13b3a2397adfcf8d11cc1c5dd9cbc72c475ffb11a5e\"" Apr 16 00:19:28.239481 containerd[1477]: time="2026-04-16T00:19:28.239021723Z" level=info msg="CreateContainer within sandbox \"d1a5b88941541fddf546e13b3a2397adfcf8d11cc1c5dd9cbc72c475ffb11a5e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Apr 16 00:19:28.263574 containerd[1477]: time="2026-04-16T00:19:28.263117080Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:19:28.263574 containerd[1477]: time="2026-04-16T00:19:28.263186859Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:19:28.263574 containerd[1477]: time="2026-04-16T00:19:28.263204303Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:28.263574 containerd[1477]: time="2026-04-16T00:19:28.263308892Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:28.264932 containerd[1477]: time="2026-04-16T00:19:28.264888559Z" level=info msg="CreateContainer within sandbox \"d1a5b88941541fddf546e13b3a2397adfcf8d11cc1c5dd9cbc72c475ffb11a5e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d3fc078a5787c56c3e985593e1c3c0db8fbf9d9cc441a4f7795551f9751b2c6b\"" Apr 16 00:19:28.268223 containerd[1477]: time="2026-04-16T00:19:28.268174968Z" level=info msg="StartContainer for \"d3fc078a5787c56c3e985593e1c3c0db8fbf9d9cc441a4f7795551f9751b2c6b\"" Apr 16 00:19:28.288878 systemd[1]: Started cri-containerd-1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555.scope - libcontainer container 1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555. Apr 16 00:19:28.310870 systemd[1]: Started cri-containerd-d3fc078a5787c56c3e985593e1c3c0db8fbf9d9cc441a4f7795551f9751b2c6b.scope - libcontainer container d3fc078a5787c56c3e985593e1c3c0db8fbf9d9cc441a4f7795551f9751b2c6b. Apr 16 00:19:28.350035 containerd[1477]: time="2026-04-16T00:19:28.349921796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-5djhg,Uid:6f175294-c028-4b99-ac19-f259bb1556cf,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555\"" Apr 16 00:19:28.355480 containerd[1477]: time="2026-04-16T00:19:28.355210347Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Apr 16 00:19:28.361481 containerd[1477]: time="2026-04-16T00:19:28.361403382Z" level=info msg="StartContainer for \"d3fc078a5787c56c3e985593e1c3c0db8fbf9d9cc441a4f7795551f9751b2c6b\" returns successfully" Apr 16 00:19:29.867903 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1861206155.mount: Deactivated successfully. 
Apr 16 00:19:30.320758 containerd[1477]: time="2026-04-16T00:19:30.320686654Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:19:30.321943 containerd[1477]: time="2026-04-16T00:19:30.321738304Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Apr 16 00:19:30.323670 containerd[1477]: time="2026-04-16T00:19:30.322867653Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:19:30.326022 containerd[1477]: time="2026-04-16T00:19:30.325970910Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:19:30.331038 containerd[1477]: time="2026-04-16T00:19:30.330989063Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 1.975717541s" Apr 16 00:19:30.331258 containerd[1477]: time="2026-04-16T00:19:30.331239683Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Apr 16 00:19:30.338829 containerd[1477]: time="2026-04-16T00:19:30.338766472Z" level=info msg="CreateContainer within sandbox \"1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Apr 16 00:19:30.364183 containerd[1477]: time="2026-04-16T00:19:30.364030958Z" level=info msg="CreateContainer within sandbox 
\"1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a\"" Apr 16 00:19:30.366365 containerd[1477]: time="2026-04-16T00:19:30.364717721Z" level=info msg="StartContainer for \"85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a\"" Apr 16 00:19:30.397894 systemd[1]: Started cri-containerd-85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a.scope - libcontainer container 85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a. Apr 16 00:19:30.436301 containerd[1477]: time="2026-04-16T00:19:30.436241643Z" level=info msg="StartContainer for \"85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a\" returns successfully" Apr 16 00:19:30.470264 kubelet[2576]: I0416 00:19:30.469872 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-ntwsw" podStartSLOduration=3.469852432 podStartE2EDuration="3.469852432s" podCreationTimestamp="2026-04-16 00:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:19:28.452199177 +0000 UTC m=+8.210072942" watchObservedRunningTime="2026-04-16 00:19:30.469852432 +0000 UTC m=+10.227726157" Apr 16 00:19:30.719345 kubelet[2576]: I0416 00:19:30.719203 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-5djhg" podStartSLOduration=1.739410176 podStartE2EDuration="3.719184341s" podCreationTimestamp="2026-04-16 00:19:27 +0000 UTC" firstStartedPulling="2026-04-16 00:19:28.353433626 +0000 UTC m=+8.111307391" lastFinishedPulling="2026-04-16 00:19:30.333207791 +0000 UTC m=+10.091081556" observedRunningTime="2026-04-16 00:19:30.471893158 +0000 UTC m=+10.229766923" watchObservedRunningTime="2026-04-16 00:19:30.719184341 +0000 UTC 
m=+10.477058106" Apr 16 00:19:36.568872 sudo[1707]: pam_unix(sudo:session): session closed for user root Apr 16 00:19:36.586135 sshd[1704]: pam_unix(sshd:session): session closed for user core Apr 16 00:19:36.591866 systemd[1]: sshd@6-78.46.194.74:22-4.175.71.9:41006.service: Deactivated successfully. Apr 16 00:19:36.596058 systemd[1]: session-7.scope: Deactivated successfully. Apr 16 00:19:36.596918 systemd[1]: session-7.scope: Consumed 5.360s CPU time, 151.5M memory peak, 0B memory swap peak. Apr 16 00:19:36.599890 systemd-logind[1452]: Session 7 logged out. Waiting for processes to exit. Apr 16 00:19:36.601295 systemd-logind[1452]: Removed session 7. Apr 16 00:19:42.275001 systemd[1]: Created slice kubepods-besteffort-pod04dd2bb2_652f_4f05_8106_5e08437c1acc.slice - libcontainer container kubepods-besteffort-pod04dd2bb2_652f_4f05_8106_5e08437c1acc.slice. Apr 16 00:19:42.311650 kubelet[2576]: I0416 00:19:42.311590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04dd2bb2-652f-4f05-8106-5e08437c1acc-tigera-ca-bundle\") pod \"calico-typha-794956d8f5-6mrxx\" (UID: \"04dd2bb2-652f-4f05-8106-5e08437c1acc\") " pod="calico-system/calico-typha-794956d8f5-6mrxx" Apr 16 00:19:42.311650 kubelet[2576]: I0416 00:19:42.311653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/04dd2bb2-652f-4f05-8106-5e08437c1acc-typha-certs\") pod \"calico-typha-794956d8f5-6mrxx\" (UID: \"04dd2bb2-652f-4f05-8106-5e08437c1acc\") " pod="calico-system/calico-typha-794956d8f5-6mrxx" Apr 16 00:19:42.312135 kubelet[2576]: I0416 00:19:42.311676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcc6l\" (UniqueName: \"kubernetes.io/projected/04dd2bb2-652f-4f05-8106-5e08437c1acc-kube-api-access-hcc6l\") pod 
\"calico-typha-794956d8f5-6mrxx\" (UID: \"04dd2bb2-652f-4f05-8106-5e08437c1acc\") " pod="calico-system/calico-typha-794956d8f5-6mrxx" Apr 16 00:19:42.412890 systemd[1]: Created slice kubepods-besteffort-pod20962894_658e_4403_89f1_02e8bd9a131c.slice - libcontainer container kubepods-besteffort-pod20962894_658e_4403_89f1_02e8bd9a131c.slice. Apr 16 00:19:42.513692 kubelet[2576]: I0416 00:19:42.513623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-var-lib-calico\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513692 kubelet[2576]: I0416 00:19:42.513679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7j2x\" (UniqueName: \"kubernetes.io/projected/20962894-658e-4403-89f1-02e8bd9a131c-kube-api-access-g7j2x\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513692 kubelet[2576]: I0416 00:19:42.513698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-cni-net-dir\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513917 kubelet[2576]: I0416 00:19:42.513713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-lib-modules\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513917 kubelet[2576]: I0416 00:19:42.513730 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-cni-log-dir\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513917 kubelet[2576]: I0416 00:19:42.513745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-var-run-calico\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513917 kubelet[2576]: I0416 00:19:42.513759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-policysync\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.513917 kubelet[2576]: I0416 00:19:42.513776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/20962894-658e-4403-89f1-02e8bd9a131c-node-certs\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514027 kubelet[2576]: I0416 00:19:42.513789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-nodeproc\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514027 kubelet[2576]: I0416 00:19:42.513806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: 
\"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-cni-bin-dir\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514027 kubelet[2576]: I0416 00:19:42.513823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-flexvol-driver-host\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514027 kubelet[2576]: I0416 00:19:42.513839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20962894-658e-4403-89f1-02e8bd9a131c-tigera-ca-bundle\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514027 kubelet[2576]: I0416 00:19:42.513852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-xtables-lock\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514134 kubelet[2576]: I0416 00:19:42.513867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-bpffs\") pod \"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.514134 kubelet[2576]: I0416 00:19:42.513880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/20962894-658e-4403-89f1-02e8bd9a131c-sys-fs\") pod 
\"calico-node-8sr9s\" (UID: \"20962894-658e-4403-89f1-02e8bd9a131c\") " pod="calico-system/calico-node-8sr9s" Apr 16 00:19:42.545902 kubelet[2576]: E0416 00:19:42.545584 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76" Apr 16 00:19:42.582308 containerd[1477]: time="2026-04-16T00:19:42.581801102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-794956d8f5-6mrxx,Uid:04dd2bb2-652f-4f05-8106-5e08437c1acc,Namespace:calico-system,Attempt:0,}" Apr 16 00:19:42.615752 kubelet[2576]: I0416 00:19:42.615698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e5a5131-3036-4d73-9c8f-e82ea8bfcf76-registration-dir\") pod \"csi-node-driver-bjzcx\" (UID: \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\") " pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:42.615910 kubelet[2576]: I0416 00:19:42.615787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e5a5131-3036-4d73-9c8f-e82ea8bfcf76-kubelet-dir\") pod \"csi-node-driver-bjzcx\" (UID: \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\") " pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:42.615910 kubelet[2576]: I0416 00:19:42.615860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7grd\" (UniqueName: \"kubernetes.io/projected/6e5a5131-3036-4d73-9c8f-e82ea8bfcf76-kube-api-access-g7grd\") pod \"csi-node-driver-bjzcx\" (UID: \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\") " pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:42.615962 kubelet[2576]: I0416 00:19:42.615914 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e5a5131-3036-4d73-9c8f-e82ea8bfcf76-socket-dir\") pod \"csi-node-driver-bjzcx\" (UID: \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\") " pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:42.615962 kubelet[2576]: I0416 00:19:42.615932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6e5a5131-3036-4d73-9c8f-e82ea8bfcf76-varrun\") pod \"csi-node-driver-bjzcx\" (UID: \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\") " pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:42.626470 kubelet[2576]: E0416 00:19:42.626426 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.626470 kubelet[2576]: W0416 00:19:42.626455 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.626470 kubelet[2576]: E0416 00:19:42.626479 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.628236 kubelet[2576]: E0416 00:19:42.628200 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.628236 kubelet[2576]: W0416 00:19:42.628226 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.628236 kubelet[2576]: E0416 00:19:42.628244 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.628477 kubelet[2576]: E0416 00:19:42.628456 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.628477 kubelet[2576]: W0416 00:19:42.628470 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.628539 kubelet[2576]: E0416 00:19:42.628480 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.629023 kubelet[2576]: E0416 00:19:42.628986 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.629023 kubelet[2576]: W0416 00:19:42.629003 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.629023 kubelet[2576]: E0416 00:19:42.629014 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.632096 kubelet[2576]: E0416 00:19:42.632068 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.632096 kubelet[2576]: W0416 00:19:42.632090 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.632207 kubelet[2576]: E0416 00:19:42.632105 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.632750 kubelet[2576]: E0416 00:19:42.632719 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.632750 kubelet[2576]: W0416 00:19:42.632743 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.633024 kubelet[2576]: E0416 00:19:42.632756 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.635377 kubelet[2576]: E0416 00:19:42.635336 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.635377 kubelet[2576]: W0416 00:19:42.635363 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.635377 kubelet[2576]: E0416 00:19:42.635380 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.636688 kubelet[2576]: E0416 00:19:42.636540 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.636688 kubelet[2576]: W0416 00:19:42.636560 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.636688 kubelet[2576]: E0416 00:19:42.636587 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.639168 kubelet[2576]: E0416 00:19:42.639128 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.639168 kubelet[2576]: W0416 00:19:42.639153 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.639168 kubelet[2576]: E0416 00:19:42.639172 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.640247 kubelet[2576]: E0416 00:19:42.640215 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.640247 kubelet[2576]: W0416 00:19:42.640238 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.640354 kubelet[2576]: E0416 00:19:42.640254 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.642249 kubelet[2576]: E0416 00:19:42.642219 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.642249 kubelet[2576]: W0416 00:19:42.642242 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.642517 kubelet[2576]: E0416 00:19:42.642427 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.645190 containerd[1477]: time="2026-04-16T00:19:42.644959464Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:19:42.645190 containerd[1477]: time="2026-04-16T00:19:42.645034072Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:19:42.645190 containerd[1477]: time="2026-04-16T00:19:42.645050314Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:42.645190 containerd[1477]: time="2026-04-16T00:19:42.645129442Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:42.645860 kubelet[2576]: E0416 00:19:42.645704 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.645860 kubelet[2576]: W0416 00:19:42.645765 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.645860 kubelet[2576]: E0416 00:19:42.645786 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.647487 kubelet[2576]: E0416 00:19:42.647454 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.647487 kubelet[2576]: W0416 00:19:42.647481 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.647670 kubelet[2576]: E0416 00:19:42.647500 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.655684 kubelet[2576]: E0416 00:19:42.651279 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.655684 kubelet[2576]: W0416 00:19:42.651314 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.655684 kubelet[2576]: E0416 00:19:42.651336 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.655684 kubelet[2576]: E0416 00:19:42.655181 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.655684 kubelet[2576]: W0416 00:19:42.655201 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.655684 kubelet[2576]: E0416 00:19:42.655225 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.655684 kubelet[2576]: E0416 00:19:42.655489 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.655684 kubelet[2576]: W0416 00:19:42.655498 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.655684 kubelet[2576]: E0416 00:19:42.655507 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.655997 kubelet[2576]: E0416 00:19:42.655803 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.655997 kubelet[2576]: W0416 00:19:42.655813 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.655997 kubelet[2576]: E0416 00:19:42.655826 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.656062 kubelet[2576]: E0416 00:19:42.656049 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.656062 kubelet[2576]: W0416 00:19:42.656058 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.656107 kubelet[2576]: E0416 00:19:42.656067 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.656321 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.658554 kubelet[2576]: W0416 00:19:42.656339 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.656349 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.656870 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.658554 kubelet[2576]: W0416 00:19:42.656882 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.656895 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.657430 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.658554 kubelet[2576]: W0416 00:19:42.657441 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.657461 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.658554 kubelet[2576]: E0416 00:19:42.657799 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.658880 kubelet[2576]: W0416 00:19:42.657810 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.658880 kubelet[2576]: E0416 00:19:42.657820 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.658880 kubelet[2576]: E0416 00:19:42.658245 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.658880 kubelet[2576]: W0416 00:19:42.658256 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.658880 kubelet[2576]: E0416 00:19:42.658266 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.658880 kubelet[2576]: E0416 00:19:42.658446 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.658880 kubelet[2576]: W0416 00:19:42.658453 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.658880 kubelet[2576]: E0416 00:19:42.658462 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.661043 kubelet[2576]: E0416 00:19:42.660904 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.661043 kubelet[2576]: W0416 00:19:42.660929 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.661043 kubelet[2576]: E0416 00:19:42.660944 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.662119 kubelet[2576]: E0416 00:19:42.662094 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.662119 kubelet[2576]: W0416 00:19:42.662119 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.662398 kubelet[2576]: E0416 00:19:42.662132 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.662736 kubelet[2576]: E0416 00:19:42.662717 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.662736 kubelet[2576]: W0416 00:19:42.662735 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.662815 kubelet[2576]: E0416 00:19:42.662748 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.663381 kubelet[2576]: E0416 00:19:42.663362 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.663381 kubelet[2576]: W0416 00:19:42.663379 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.663457 kubelet[2576]: E0416 00:19:42.663392 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.664195 kubelet[2576]: E0416 00:19:42.664167 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.664195 kubelet[2576]: W0416 00:19:42.664186 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.664195 kubelet[2576]: E0416 00:19:42.664200 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.665864 kubelet[2576]: E0416 00:19:42.664949 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.665864 kubelet[2576]: W0416 00:19:42.665855 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.665864 kubelet[2576]: E0416 00:19:42.665875 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.667829 kubelet[2576]: E0416 00:19:42.667784 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.667829 kubelet[2576]: W0416 00:19:42.667803 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.667829 kubelet[2576]: E0416 00:19:42.667819 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.684868 systemd[1]: Started cri-containerd-a05b9bf6691dd06edbfc3b183560dfe7fe55c3672f31778e6eec3bec274a8d6e.scope - libcontainer container a05b9bf6691dd06edbfc3b183560dfe7fe55c3672f31778e6eec3bec274a8d6e. 
Apr 16 00:19:42.692075 kubelet[2576]: E0416 00:19:42.692036 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.692075 kubelet[2576]: W0416 00:19:42.692060 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.692075 kubelet[2576]: E0416 00:19:42.692079 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.717459 kubelet[2576]: E0416 00:19:42.716973 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.717459 kubelet[2576]: W0416 00:19:42.717004 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.717459 kubelet[2576]: E0416 00:19:42.717030 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.718228 kubelet[2576]: E0416 00:19:42.718086 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.718892 kubelet[2576]: W0416 00:19:42.718848 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.718979 kubelet[2576]: E0416 00:19:42.718884 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.719219 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.721988 kubelet[2576]: W0416 00:19:42.719285 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.719297 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.719782 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.721988 kubelet[2576]: W0416 00:19:42.719797 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.719809 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.720707 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.721988 kubelet[2576]: W0416 00:19:42.720731 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.720744 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.721988 kubelet[2576]: E0416 00:19:42.721444 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.722282 kubelet[2576]: W0416 00:19:42.721457 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.722282 kubelet[2576]: E0416 00:19:42.721494 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.722282 kubelet[2576]: E0416 00:19:42.721755 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.722282 kubelet[2576]: W0416 00:19:42.721782 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.722282 kubelet[2576]: E0416 00:19:42.721794 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.722282 kubelet[2576]: E0416 00:19:42.721983 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.722282 kubelet[2576]: W0416 00:19:42.721993 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.722282 kubelet[2576]: E0416 00:19:42.722002 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.723221 kubelet[2576]: E0416 00:19:42.723089 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.723221 kubelet[2576]: W0416 00:19:42.723148 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.723221 kubelet[2576]: E0416 00:19:42.723167 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.724061 kubelet[2576]: E0416 00:19:42.724041 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.724333 kubelet[2576]: W0416 00:19:42.724149 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.724333 kubelet[2576]: E0416 00:19:42.724235 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.724556 kubelet[2576]: E0416 00:19:42.724541 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.725530 kubelet[2576]: W0416 00:19:42.724643 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.725530 kubelet[2576]: E0416 00:19:42.724661 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.725530 kubelet[2576]: E0416 00:19:42.724833 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.725530 kubelet[2576]: W0416 00:19:42.724844 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.725530 kubelet[2576]: E0416 00:19:42.724854 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.725530 kubelet[2576]: E0416 00:19:42.725149 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.725530 kubelet[2576]: W0416 00:19:42.725158 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.725530 kubelet[2576]: E0416 00:19:42.725168 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.725530 kubelet[2576]: E0416 00:19:42.725321 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.725530 kubelet[2576]: W0416 00:19:42.725328 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.725861 kubelet[2576]: E0416 00:19:42.725336 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.725861 kubelet[2576]: E0416 00:19:42.725463 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.725861 kubelet[2576]: W0416 00:19:42.725469 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.725861 kubelet[2576]: E0416 00:19:42.725478 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.726215 kubelet[2576]: E0416 00:19:42.726068 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.726215 kubelet[2576]: W0416 00:19:42.726083 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.726215 kubelet[2576]: E0416 00:19:42.726095 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.726440 kubelet[2576]: E0416 00:19:42.726426 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.726513 kubelet[2576]: W0416 00:19:42.726500 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.726575 kubelet[2576]: E0416 00:19:42.726563 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.726883 kubelet[2576]: E0416 00:19:42.726869 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.726951 kubelet[2576]: W0416 00:19:42.726939 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.727030 kubelet[2576]: E0416 00:19:42.727018 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.727238 kubelet[2576]: E0416 00:19:42.727226 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.727401 kubelet[2576]: W0416 00:19:42.727321 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.727401 kubelet[2576]: E0416 00:19:42.727338 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.727645 kubelet[2576]: E0416 00:19:42.727630 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.727726 kubelet[2576]: W0416 00:19:42.727713 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.727790 kubelet[2576]: E0416 00:19:42.727777 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.728314 containerd[1477]: time="2026-04-16T00:19:42.728268073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8sr9s,Uid:20962894-658e-4403-89f1-02e8bd9a131c,Namespace:calico-system,Attempt:0,}" Apr 16 00:19:42.728710 kubelet[2576]: E0416 00:19:42.728692 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.728813 kubelet[2576]: W0416 00:19:42.728798 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.728920 kubelet[2576]: E0416 00:19:42.728906 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.729559 kubelet[2576]: E0416 00:19:42.729543 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.729724 kubelet[2576]: W0416 00:19:42.729708 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.729788 kubelet[2576]: E0416 00:19:42.729777 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.730177 kubelet[2576]: E0416 00:19:42.730163 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.730400 kubelet[2576]: W0416 00:19:42.730380 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.730481 kubelet[2576]: E0416 00:19:42.730469 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.730826 kubelet[2576]: E0416 00:19:42.730810 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.730901 kubelet[2576]: W0416 00:19:42.730889 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.730966 kubelet[2576]: E0416 00:19:42.730955 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.731723 kubelet[2576]: E0416 00:19:42.731701 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.731843 kubelet[2576]: W0416 00:19:42.731792 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.731843 kubelet[2576]: E0416 00:19:42.731809 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:42.737680 containerd[1477]: time="2026-04-16T00:19:42.737424477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-794956d8f5-6mrxx,Uid:04dd2bb2-652f-4f05-8106-5e08437c1acc,Namespace:calico-system,Attempt:0,} returns sandbox id \"a05b9bf6691dd06edbfc3b183560dfe7fe55c3672f31778e6eec3bec274a8d6e\"" Apr 16 00:19:42.741059 containerd[1477]: time="2026-04-16T00:19:42.740864493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 16 00:19:42.743655 kubelet[2576]: E0416 00:19:42.743529 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:42.743655 kubelet[2576]: W0416 00:19:42.743650 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:42.743902 kubelet[2576]: E0416 00:19:42.743674 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:42.761051 containerd[1477]: time="2026-04-16T00:19:42.760940614Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:19:42.761051 containerd[1477]: time="2026-04-16T00:19:42.761011101Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:19:42.761476 containerd[1477]: time="2026-04-16T00:19:42.761270570Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:42.761476 containerd[1477]: time="2026-04-16T00:19:42.761409625Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:42.785895 systemd[1]: Started cri-containerd-5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1.scope - libcontainer container 5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1. Apr 16 00:19:42.814049 containerd[1477]: time="2026-04-16T00:19:42.812755972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8sr9s,Uid:20962894-658e-4403-89f1-02e8bd9a131c,Namespace:calico-system,Attempt:0,} returns sandbox id \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\"" Apr 16 00:19:44.181055 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2788605896.mount: Deactivated successfully. 
Apr 16 00:19:44.380270 kubelet[2576]: E0416 00:19:44.380229 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76" Apr 16 00:19:44.658913 containerd[1477]: time="2026-04-16T00:19:44.657978774Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:19:44.658913 containerd[1477]: time="2026-04-16T00:19:44.658868700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Apr 16 00:19:44.659554 containerd[1477]: time="2026-04-16T00:19:44.659522043Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:19:44.662374 containerd[1477]: time="2026-04-16T00:19:44.662333874Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:19:44.663460 containerd[1477]: time="2026-04-16T00:19:44.663422659Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.922512681s" Apr 16 00:19:44.663460 containerd[1477]: time="2026-04-16T00:19:44.663456182Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Apr 16 00:19:44.665890 containerd[1477]: time="2026-04-16T00:19:44.665857493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Apr 16 00:19:44.680831 containerd[1477]: time="2026-04-16T00:19:44.680749328Z" level=info msg="CreateContainer within sandbox \"a05b9bf6691dd06edbfc3b183560dfe7fe55c3672f31778e6eec3bec274a8d6e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Apr 16 00:19:44.702674 containerd[1477]: time="2026-04-16T00:19:44.702596392Z" level=info msg="CreateContainer within sandbox \"a05b9bf6691dd06edbfc3b183560dfe7fe55c3672f31778e6eec3bec274a8d6e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"14e81898f8f78234bf24242e0c7b4bcbec695e13efcfe70824d756a0f2283822\"" Apr 16 00:19:44.706822 containerd[1477]: time="2026-04-16T00:19:44.706750632Z" level=info msg="StartContainer for \"14e81898f8f78234bf24242e0c7b4bcbec695e13efcfe70824d756a0f2283822\"" Apr 16 00:19:44.748998 systemd[1]: Started cri-containerd-14e81898f8f78234bf24242e0c7b4bcbec695e13efcfe70824d756a0f2283822.scope - libcontainer container 14e81898f8f78234bf24242e0c7b4bcbec695e13efcfe70824d756a0f2283822. 
Apr 16 00:19:44.790986 containerd[1477]: time="2026-04-16T00:19:44.790898737Z" level=info msg="StartContainer for \"14e81898f8f78234bf24242e0c7b4bcbec695e13efcfe70824d756a0f2283822\" returns successfully" Apr 16 00:19:45.506976 kubelet[2576]: I0416 00:19:45.506291 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-794956d8f5-6mrxx" podStartSLOduration=1.581894577 podStartE2EDuration="3.506274562s" podCreationTimestamp="2026-04-16 00:19:42 +0000 UTC" firstStartedPulling="2026-04-16 00:19:42.740521256 +0000 UTC m=+22.498395021" lastFinishedPulling="2026-04-16 00:19:44.664901281 +0000 UTC m=+24.422775006" observedRunningTime="2026-04-16 00:19:45.50592041 +0000 UTC m=+25.263794215" watchObservedRunningTime="2026-04-16 00:19:45.506274562 +0000 UTC m=+25.264148327" Apr 16 00:19:45.516130 kubelet[2576]: E0416 00:19:45.516092 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.516257 kubelet[2576]: W0416 00:19:45.516123 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.516257 kubelet[2576]: E0416 00:19:45.516184 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.516587 kubelet[2576]: E0416 00:19:45.516472 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.516587 kubelet[2576]: W0416 00:19:45.516516 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.516587 kubelet[2576]: E0416 00:19:45.516532 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.517068 kubelet[2576]: E0416 00:19:45.517044 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.517068 kubelet[2576]: W0416 00:19:45.517067 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.517170 kubelet[2576]: E0416 00:19:45.517081 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.517324 kubelet[2576]: E0416 00:19:45.517308 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.517324 kubelet[2576]: W0416 00:19:45.517324 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.517417 kubelet[2576]: E0416 00:19:45.517335 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.517537 kubelet[2576]: E0416 00:19:45.517516 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.517537 kubelet[2576]: W0416 00:19:45.517531 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.517595 kubelet[2576]: E0416 00:19:45.517543 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.517741 kubelet[2576]: E0416 00:19:45.517726 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.517741 kubelet[2576]: W0416 00:19:45.517740 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.517919 kubelet[2576]: E0416 00:19:45.517750 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.517945 kubelet[2576]: E0416 00:19:45.517937 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.517969 kubelet[2576]: W0416 00:19:45.517945 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.517969 kubelet[2576]: E0416 00:19:45.517953 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.518152 kubelet[2576]: E0416 00:19:45.518138 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.518152 kubelet[2576]: W0416 00:19:45.518152 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.518218 kubelet[2576]: E0416 00:19:45.518161 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.518336 kubelet[2576]: E0416 00:19:45.518325 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.518364 kubelet[2576]: W0416 00:19:45.518337 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.518364 kubelet[2576]: E0416 00:19:45.518345 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.518485 kubelet[2576]: E0416 00:19:45.518474 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.518485 kubelet[2576]: W0416 00:19:45.518484 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.518540 kubelet[2576]: E0416 00:19:45.518492 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.518643 kubelet[2576]: E0416 00:19:45.518633 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.518680 kubelet[2576]: W0416 00:19:45.518643 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.518680 kubelet[2576]: E0416 00:19:45.518651 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.518795 kubelet[2576]: E0416 00:19:45.518784 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.518795 kubelet[2576]: W0416 00:19:45.518794 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.519207 kubelet[2576]: E0416 00:19:45.518802 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.519207 kubelet[2576]: E0416 00:19:45.518955 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.519207 kubelet[2576]: W0416 00:19:45.518964 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.519207 kubelet[2576]: E0416 00:19:45.518972 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.519207 kubelet[2576]: E0416 00:19:45.519113 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.519207 kubelet[2576]: W0416 00:19:45.519121 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.519207 kubelet[2576]: E0416 00:19:45.519127 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.519407 kubelet[2576]: E0416 00:19:45.519268 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.519407 kubelet[2576]: W0416 00:19:45.519274 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.519407 kubelet[2576]: E0416 00:19:45.519281 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 00:19:45.540750 kubelet[2576]: E0416 00:19:45.540716 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.540750 kubelet[2576]: W0416 00:19:45.540743 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.541006 kubelet[2576]: E0416 00:19:45.540768 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 00:19:45.541066 kubelet[2576]: E0416 00:19:45.541041 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 00:19:45.541066 kubelet[2576]: W0416 00:19:45.541052 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 00:19:45.541271 kubelet[2576]: E0416 00:19:45.541066 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Apr 16 00:19:45.541409 kubelet[2576]: E0416 00:19:45.541394 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.541558 kubelet[2576]: W0416 00:19:45.541473 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.541558 kubelet[2576]: E0416 00:19:45.541494 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.541918 kubelet[2576]: E0416 00:19:45.541897 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.542068 kubelet[2576]: W0416 00:19:45.541985 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.542068 kubelet[2576]: E0416 00:19:45.542009 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.542461 kubelet[2576]: E0416 00:19:45.542296 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.542461 kubelet[2576]: W0416 00:19:45.542312 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.542461 kubelet[2576]: E0416 00:19:45.542328 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.542750 kubelet[2576]: E0416 00:19:45.542733 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.542967 kubelet[2576]: W0416 00:19:45.542824 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.542967 kubelet[2576]: E0416 00:19:45.542873 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.543279 kubelet[2576]: E0416 00:19:45.543266 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.543361 kubelet[2576]: W0416 00:19:45.543347 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.543491 kubelet[2576]: E0416 00:19:45.543409 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.543728 kubelet[2576]: E0416 00:19:45.543714 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.543895 kubelet[2576]: W0416 00:19:45.543788 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.543895 kubelet[2576]: E0416 00:19:45.543806 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.544392 kubelet[2576]: E0416 00:19:45.544376 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.544520 kubelet[2576]: W0416 00:19:45.544479 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.544520 kubelet[2576]: E0416 00:19:45.544500 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.545028 kubelet[2576]: E0416 00:19:45.544966 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.545028 kubelet[2576]: W0416 00:19:45.544985 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.545028 kubelet[2576]: E0416 00:19:45.544999 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.545934 kubelet[2576]: E0416 00:19:45.545792 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.545934 kubelet[2576]: W0416 00:19:45.545810 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.545934 kubelet[2576]: E0416 00:19:45.545826 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.546406 kubelet[2576]: E0416 00:19:45.546330 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.546482 kubelet[2576]: W0416 00:19:45.546343 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.546482 kubelet[2576]: E0416 00:19:45.546468 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.546961 kubelet[2576]: E0416 00:19:45.546942 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.547154 kubelet[2576]: W0416 00:19:45.547045 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.547154 kubelet[2576]: E0416 00:19:45.547064 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.547435 kubelet[2576]: E0416 00:19:45.547422 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.547691 kubelet[2576]: W0416 00:19:45.547547 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.547691 kubelet[2576]: E0416 00:19:45.547571 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.548053 kubelet[2576]: E0416 00:19:45.548029 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.548213 kubelet[2576]: W0416 00:19:45.548055 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.548213 kubelet[2576]: E0416 00:19:45.548099 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.548780 kubelet[2576]: E0416 00:19:45.548759 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.548838 kubelet[2576]: W0416 00:19:45.548784 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.548838 kubelet[2576]: E0416 00:19:45.548829 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.549548 kubelet[2576]: E0416 00:19:45.549386 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.549548 kubelet[2576]: W0416 00:19:45.549403 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.549548 kubelet[2576]: E0416 00:19:45.549417 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.550648 kubelet[2576]: E0416 00:19:45.550551 2576 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 00:19:45.550648 kubelet[2576]: W0416 00:19:45.550566 2576 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 00:19:45.550648 kubelet[2576]: E0416 00:19:45.550580 2576 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 00:19:45.672379 systemd[1]: run-containerd-runc-k8s.io-14e81898f8f78234bf24242e0c7b4bcbec695e13efcfe70824d756a0f2283822-runc.ApeKcr.mount: Deactivated successfully.
Apr 16 00:19:46.120328 containerd[1477]: time="2026-04-16T00:19:46.120266943Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:46.121516 containerd[1477]: time="2026-04-16T00:19:46.121361196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682"
Apr 16 00:19:46.122629 containerd[1477]: time="2026-04-16T00:19:46.122443927Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:46.125345 containerd[1477]: time="2026-04-16T00:19:46.125284648Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:46.126460 containerd[1477]: time="2026-04-16T00:19:46.126259570Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.460233421s"
Apr 16 00:19:46.126460 containerd[1477]: time="2026-04-16T00:19:46.126298854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\""
Apr 16 00:19:46.133322 containerd[1477]: time="2026-04-16T00:19:46.133188157Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Apr 16 00:19:46.155516 containerd[1477]: time="2026-04-16T00:19:46.155331752Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2\""
Apr 16 00:19:46.156434 containerd[1477]: time="2026-04-16T00:19:46.156391081Z" level=info msg="StartContainer for \"3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2\""
Apr 16 00:19:46.192834 systemd[1]: Started cri-containerd-3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2.scope - libcontainer container 3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2.
Apr 16 00:19:46.223187 containerd[1477]: time="2026-04-16T00:19:46.223095928Z" level=info msg="StartContainer for \"3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2\" returns successfully"
Apr 16 00:19:46.241187 systemd[1]: cri-containerd-3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2.scope: Deactivated successfully.
Apr 16 00:19:46.344273 containerd[1477]: time="2026-04-16T00:19:46.344213301Z" level=info msg="shim disconnected" id=3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2 namespace=k8s.io
Apr 16 00:19:46.344688 containerd[1477]: time="2026-04-16T00:19:46.344498406Z" level=warning msg="cleaning up after shim disconnected" id=3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2 namespace=k8s.io
Apr 16 00:19:46.344688 containerd[1477]: time="2026-04-16T00:19:46.344515967Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:19:46.380198 kubelet[2576]: E0416 00:19:46.379712 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76"
Apr 16 00:19:46.496694 kubelet[2576]: I0416 00:19:46.496657 2576 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Apr 16 00:19:46.498633 containerd[1477]: time="2026-04-16T00:19:46.498494162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Apr 16 00:19:46.673273 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3e44de6a1bdd492e9a3954f84079d09fe9257c3fc8296e45eebb3ef4182aedf2-rootfs.mount: Deactivated successfully.
Apr 16 00:19:48.381129 kubelet[2576]: E0416 00:19:48.381005 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76"
Apr 16 00:19:50.381578 kubelet[2576]: E0416 00:19:50.380145 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76"
Apr 16 00:19:51.296080 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2861931720.mount: Deactivated successfully.
Apr 16 00:19:51.327793 containerd[1477]: time="2026-04-16T00:19:51.327721992Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:51.330107 containerd[1477]: time="2026-04-16T00:19:51.330044694Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674"
Apr 16 00:19:51.331394 containerd[1477]: time="2026-04-16T00:19:51.331334253Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:51.334035 containerd[1477]: time="2026-04-16T00:19:51.333937373Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:51.334858 containerd[1477]: time="2026-04-16T00:19:51.334815786Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 4.83626734s"
Apr 16 00:19:51.334935 containerd[1477]: time="2026-04-16T00:19:51.334859349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\""
Apr 16 00:19:51.340665 containerd[1477]: time="2026-04-16T00:19:51.340454692Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Apr 16 00:19:51.358640 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3614877500.mount: Deactivated successfully.
Apr 16 00:19:51.361819 containerd[1477]: time="2026-04-16T00:19:51.361768719Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0\""
Apr 16 00:19:51.364071 containerd[1477]: time="2026-04-16T00:19:51.364023817Z" level=info msg="StartContainer for \"74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0\""
Apr 16 00:19:51.400749 systemd[1]: Started cri-containerd-74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0.scope - libcontainer container 74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0.
Apr 16 00:19:51.441543 containerd[1477]: time="2026-04-16T00:19:51.441492527Z" level=info msg="StartContainer for \"74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0\" returns successfully"
Apr 16 00:19:51.649652 systemd[1]: cri-containerd-74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0.scope: Deactivated successfully.
Apr 16 00:19:51.802970 containerd[1477]: time="2026-04-16T00:19:51.802786837Z" level=info msg="shim disconnected" id=74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0 namespace=k8s.io
Apr 16 00:19:51.802970 containerd[1477]: time="2026-04-16T00:19:51.802854521Z" level=warning msg="cleaning up after shim disconnected" id=74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0 namespace=k8s.io
Apr 16 00:19:51.802970 containerd[1477]: time="2026-04-16T00:19:51.802863162Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:19:52.297121 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-74d0aabfacc7959a55362493f87f1893ee371236e54d3aef70836ac8fe2dd5d0-rootfs.mount: Deactivated successfully.
Apr 16 00:19:52.380714 kubelet[2576]: E0416 00:19:52.380041 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76"
Apr 16 00:19:52.526061 containerd[1477]: time="2026-04-16T00:19:52.524848457Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Apr 16 00:19:54.382066 kubelet[2576]: E0416 00:19:54.379933 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76"
Apr 16 00:19:55.018791 containerd[1477]: time="2026-04-16T00:19:55.018712941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:55.020791 containerd[1477]: time="2026-04-16T00:19:55.020732157Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216"
Apr 16 00:19:55.021832 containerd[1477]: time="2026-04-16T00:19:55.021783527Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:55.024867 containerd[1477]: time="2026-04-16T00:19:55.024580459Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 00:19:55.026094 containerd[1477]: time="2026-04-16T00:19:55.025962804Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.501070185s"
Apr 16 00:19:55.026094 containerd[1477]: time="2026-04-16T00:19:55.025999806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\""
Apr 16 00:19:55.033701 containerd[1477]: time="2026-04-16T00:19:55.033586285Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Apr 16 00:19:55.051566 containerd[1477]: time="2026-04-16T00:19:55.051499854Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f\""
Apr 16 00:19:55.053013 containerd[1477]: time="2026-04-16T00:19:55.052969403Z" level=info msg="StartContainer for \"c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f\""
Apr 16 00:19:55.093844 systemd[1]: Started cri-containerd-c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f.scope - libcontainer container c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f.
Apr 16 00:19:55.125716 containerd[1477]: time="2026-04-16T00:19:55.125669447Z" level=info msg="StartContainer for \"c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f\" returns successfully"
Apr 16 00:19:55.699988 containerd[1477]: time="2026-04-16T00:19:55.699922603Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Apr 16 00:19:55.703515 systemd[1]: cri-containerd-c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f.scope: Deactivated successfully.
Apr 16 00:19:55.712778 kubelet[2576]: I0416 00:19:55.712572 2576 kubelet_node_status.go:427] "Fast updating node status as it just became ready"
Apr 16 00:19:55.747959 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f-rootfs.mount: Deactivated successfully.
Apr 16 00:19:55.837183 containerd[1477]: time="2026-04-16T00:19:55.837016696Z" level=info msg="shim disconnected" id=c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f namespace=k8s.io
Apr 16 00:19:55.837183 containerd[1477]: time="2026-04-16T00:19:55.837113820Z" level=warning msg="cleaning up after shim disconnected" id=c267039526d4cbd305e2e0feacfefc8cbc4ac1f921bbe0ccd216bea47a7d813f namespace=k8s.io
Apr 16 00:19:55.837183 containerd[1477]: time="2026-04-16T00:19:55.837147982Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:19:55.841308 systemd[1]: Created slice kubepods-burstable-podcf25c361_9e08_4e96_8a6d_6349a738a504.slice - libcontainer container kubepods-burstable-podcf25c361_9e08_4e96_8a6d_6349a738a504.slice.
Apr 16 00:19:55.859560 systemd[1]: Created slice kubepods-burstable-pod60276bc4_09cf_4ae6_9503_beaf347fe010.slice - libcontainer container kubepods-burstable-pod60276bc4_09cf_4ae6_9503_beaf347fe010.slice.
Apr 16 00:19:55.884538 systemd[1]: Created slice kubepods-besteffort-pod160db7e1_a4ae_471a_bd06_9c5e497e7d3a.slice - libcontainer container kubepods-besteffort-pod160db7e1_a4ae_471a_bd06_9c5e497e7d3a.slice.
Apr 16 00:19:55.907557 systemd[1]: Created slice kubepods-besteffort-pod59287dd3_2523_4d48_9666_1ef09e5795a0.slice - libcontainer container kubepods-besteffort-pod59287dd3_2523_4d48_9666_1ef09e5795a0.slice.
Apr 16 00:19:55.914414 systemd[1]: Created slice kubepods-besteffort-pod7b041080_6884_4ac4_b0b2_bb667ac35753.slice - libcontainer container kubepods-besteffort-pod7b041080_6884_4ac4_b0b2_bb667ac35753.slice.
Apr 16 00:19:55.919477 kubelet[2576]: I0416 00:19:55.919407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz7m7\" (UniqueName: \"kubernetes.io/projected/60276bc4-09cf-4ae6-9503-beaf347fe010-kube-api-access-dz7m7\") pod \"coredns-7d764666f9-xph97\" (UID: \"60276bc4-09cf-4ae6-9503-beaf347fe010\") " pod="kube-system/coredns-7d764666f9-xph97"
Apr 16 00:19:55.919887 kubelet[2576]: I0416 00:19:55.919720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-backend-key-pair\") pod \"whisker-5b9458fc5c-d2f82\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " pod="calico-system/whisker-5b9458fc5c-d2f82"
Apr 16 00:19:55.920514 kubelet[2576]: I0416 00:19:55.920176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c032d05-2ae2-4d00-a7b9-b6421d73e412-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-xxdtw\" (UID: \"1c032d05-2ae2-4d00-a7b9-b6421d73e412\") " pod="calico-system/goldmane-9f7667bb8-xxdtw"
Apr 16 00:19:55.920514 kubelet[2576]: I0416 00:19:55.920210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60276bc4-09cf-4ae6-9503-beaf347fe010-config-volume\") pod \"coredns-7d764666f9-xph97\" (UID: \"60276bc4-09cf-4ae6-9503-beaf347fe010\") " pod="kube-system/coredns-7d764666f9-xph97"
Apr 16 00:19:55.920514 kubelet[2576]: I0416 00:19:55.920229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/160db7e1-a4ae-471a-bd06-9c5e497e7d3a-tigera-ca-bundle\") pod \"calico-kube-controllers-64786d454d-bl4hx\" (UID: \"160db7e1-a4ae-471a-bd06-9c5e497e7d3a\") " pod="calico-system/calico-kube-controllers-64786d454d-bl4hx"
Apr 16 00:19:55.920514 kubelet[2576]: I0416 00:19:55.920353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1c032d05-2ae2-4d00-a7b9-b6421d73e412-goldmane-key-pair\") pod \"goldmane-9f7667bb8-xxdtw\" (UID: \"1c032d05-2ae2-4d00-a7b9-b6421d73e412\") " pod="calico-system/goldmane-9f7667bb8-xxdtw"
Apr 16 00:19:55.921181 kubelet[2576]: I0416 00:19:55.920916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/59287dd3-2523-4d48-9666-1ef09e5795a0-calico-apiserver-certs\") pod \"calico-apiserver-7dd9588bc7-fkgbz\" (UID: \"59287dd3-2523-4d48-9666-1ef09e5795a0\") " pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz"
Apr 16 00:19:55.921181 kubelet[2576]: I0416 00:19:55.921075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45q9\" (UniqueName: \"kubernetes.io/projected/59287dd3-2523-4d48-9666-1ef09e5795a0-kube-api-access-w45q9\") pod \"calico-apiserver-7dd9588bc7-fkgbz\" (UID: \"59287dd3-2523-4d48-9666-1ef09e5795a0\") " pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz"
Apr 16 00:19:55.921820 kubelet[2576]: I0416 00:19:55.921701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7b041080-6884-4ac4-b0b2-bb667ac35753-calico-apiserver-certs\") pod \"calico-apiserver-7dd9588bc7-f6npc\" (UID: \"7b041080-6884-4ac4-b0b2-bb667ac35753\") " pod="calico-system/calico-apiserver-7dd9588bc7-f6npc"
Apr 16 00:19:55.921820 kubelet[2576]: I0416 00:19:55.921761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c032d05-2ae2-4d00-a7b9-b6421d73e412-config\") pod \"goldmane-9f7667bb8-xxdtw\" (UID: \"1c032d05-2ae2-4d00-a7b9-b6421d73e412\") " pod="calico-system/goldmane-9f7667bb8-xxdtw"
Apr 16 00:19:55.921820 kubelet[2576]: I0416 00:19:55.921783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vsl8\" (UniqueName: \"kubernetes.io/projected/7b041080-6884-4ac4-b0b2-bb667ac35753-kube-api-access-2vsl8\") pod \"calico-apiserver-7dd9588bc7-f6npc\" (UID: \"7b041080-6884-4ac4-b0b2-bb667ac35753\") " pod="calico-system/calico-apiserver-7dd9588bc7-f6npc"
Apr 16 00:19:55.921820 kubelet[2576]: I0416 00:19:55.921805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlkq5\" (UniqueName: \"kubernetes.io/projected/160db7e1-a4ae-471a-bd06-9c5e497e7d3a-kube-api-access-tlkq5\") pod \"calico-kube-controllers-64786d454d-bl4hx\" (UID: \"160db7e1-a4ae-471a-bd06-9c5e497e7d3a\") " pod="calico-system/calico-kube-controllers-64786d454d-bl4hx"
Apr 16 00:19:55.921820 kubelet[2576]: I0416 00:19:55.921822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-nginx-config\") pod \"whisker-5b9458fc5c-d2f82\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " pod="calico-system/whisker-5b9458fc5c-d2f82"
Apr 16 00:19:55.921983 kubelet[2576]: I0416 00:19:55.921845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-ca-bundle\") pod \"whisker-5b9458fc5c-d2f82\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " pod="calico-system/whisker-5b9458fc5c-d2f82"
Apr 16 00:19:55.921983 kubelet[2576]: I0416 00:19:55.921860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7qh9\" (UniqueName: \"kubernetes.io/projected/0326abb9-695f-49d0-97be-8ec08eebabaf-kube-api-access-k7qh9\") pod \"whisker-5b9458fc5c-d2f82\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " pod="calico-system/whisker-5b9458fc5c-d2f82"
Apr 16 00:19:55.921983 kubelet[2576]: I0416 00:19:55.921888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4h2\" (UniqueName: \"kubernetes.io/projected/1c032d05-2ae2-4d00-a7b9-b6421d73e412-kube-api-access-hr4h2\") pod \"goldmane-9f7667bb8-xxdtw\" (UID: \"1c032d05-2ae2-4d00-a7b9-b6421d73e412\") " pod="calico-system/goldmane-9f7667bb8-xxdtw"
Apr 16 00:19:55.921983 kubelet[2576]: I0416 00:19:55.921904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf25c361-9e08-4e96-8a6d-6349a738a504-config-volume\") pod \"coredns-7d764666f9-zjtd7\" (UID: \"cf25c361-9e08-4e96-8a6d-6349a738a504\") " pod="kube-system/coredns-7d764666f9-zjtd7"
Apr 16 00:19:55.921983 kubelet[2576]: I0416 00:19:55.921918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vds8\" (UniqueName: \"kubernetes.io/projected/cf25c361-9e08-4e96-8a6d-6349a738a504-kube-api-access-8vds8\") pod \"coredns-7d764666f9-zjtd7\" (UID: \"cf25c361-9e08-4e96-8a6d-6349a738a504\") " pod="kube-system/coredns-7d764666f9-zjtd7"
Apr 16 00:19:55.929102 systemd[1]: Created slice kubepods-besteffort-pod1c032d05_2ae2_4d00_a7b9_b6421d73e412.slice - libcontainer container kubepods-besteffort-pod1c032d05_2ae2_4d00_a7b9_b6421d73e412.slice.
Apr 16 00:19:55.939438 systemd[1]: Created slice kubepods-besteffort-pod0326abb9_695f_49d0_97be_8ec08eebabaf.slice - libcontainer container kubepods-besteffort-pod0326abb9_695f_49d0_97be_8ec08eebabaf.slice.
Apr 16 00:19:55.980480 kubelet[2576]: I0416 00:19:55.980019 2576 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Apr 16 00:19:56.162675 containerd[1477]: time="2026-04-16T00:19:56.162563514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zjtd7,Uid:cf25c361-9e08-4e96-8a6d-6349a738a504,Namespace:kube-system,Attempt:0,}"
Apr 16 00:19:56.180705 containerd[1477]: time="2026-04-16T00:19:56.180304902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xph97,Uid:60276bc4-09cf-4ae6-9503-beaf347fe010,Namespace:kube-system,Attempt:0,}"
Apr 16 00:19:56.201708 containerd[1477]: time="2026-04-16T00:19:56.201640489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64786d454d-bl4hx,Uid:160db7e1-a4ae-471a-bd06-9c5e497e7d3a,Namespace:calico-system,Attempt:0,}"
Apr 16 00:19:56.215233 containerd[1477]: time="2026-04-16T00:19:56.214987002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-fkgbz,Uid:59287dd3-2523-4d48-9666-1ef09e5795a0,Namespace:calico-system,Attempt:0,}"
Apr 16 00:19:56.224281 containerd[1477]: time="2026-04-16T00:19:56.224213052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-f6npc,Uid:7b041080-6884-4ac4-b0b2-bb667ac35753,Namespace:calico-system,Attempt:0,}"
Apr 16 00:19:56.238868 containerd[1477]: time="2026-04-16T00:19:56.238738136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-xxdtw,Uid:1c032d05-2ae2-4d00-a7b9-b6421d73e412,Namespace:calico-system,Attempt:0,}"
Apr 16 00:19:56.246348 containerd[1477]: time="2026-04-16T00:19:56.246118344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b9458fc5c-d2f82,Uid:0326abb9-695f-49d0-97be-8ec08eebabaf,Namespace:calico-system,Attempt:0,}"
Apr 16 00:19:56.370777 containerd[1477]: time="2026-04-16T00:19:56.370548789Z" level=error msg="Failed to destroy network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 16 00:19:56.371174 containerd[1477]: time="2026-04-16T00:19:56.371141575Z" level=error msg="encountered an error cleaning up failed sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 16 00:19:56.371320 containerd[1477]: time="2026-04-16T00:19:56.371273141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xph97,Uid:60276bc4-09cf-4ae6-9503-beaf347fe010,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 16 00:19:56.371724 kubelet[2576]: E0416 00:19:56.371674 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 16 00:19:56.371802 kubelet[2576]: E0416 00:19:56.371754 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-xph97"
Apr 16 00:19:56.371802 kubelet[2576]: E0416 00:19:56.371778 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-xph97"
Apr 16 00:19:56.371852 kubelet[2576]: E0416 00:19:56.371825 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-xph97_kube-system(60276bc4-09cf-4ae6-9503-beaf347fe010)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-xph97_kube-system(60276bc4-09cf-4ae6-9503-beaf347fe010)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\\\": plugin type=\\\"calico\\\" failed (add): stat
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-xph97" podUID="60276bc4-09cf-4ae6-9503-beaf347fe010" Apr 16 00:19:56.388461 containerd[1477]: time="2026-04-16T00:19:56.388330058Z" level=error msg="Failed to destroy network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.391292 containerd[1477]: time="2026-04-16T00:19:56.391221707Z" level=error msg="Failed to destroy network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.394176 containerd[1477]: time="2026-04-16T00:19:56.392243872Z" level=error msg="encountered an error cleaning up failed sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.394176 containerd[1477]: time="2026-04-16T00:19:56.392322116Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-fkgbz,Uid:59287dd3-2523-4d48-9666-1ef09e5795a0,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Apr 16 00:19:56.393792 systemd[1]: Created slice kubepods-besteffort-pod6e5a5131_3036_4d73_9c8f_e82ea8bfcf76.slice - libcontainer container kubepods-besteffort-pod6e5a5131_3036_4d73_9c8f_e82ea8bfcf76.slice. Apr 16 00:19:56.394722 kubelet[2576]: E0416 00:19:56.394168 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.394722 kubelet[2576]: E0416 00:19:56.394219 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz" Apr 16 00:19:56.394722 kubelet[2576]: E0416 00:19:56.394343 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz" Apr 16 00:19:56.395061 kubelet[2576]: E0416 00:19:56.394394 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7dd9588bc7-fkgbz_calico-system(59287dd3-2523-4d48-9666-1ef09e5795a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7dd9588bc7-fkgbz_calico-system(59287dd3-2523-4d48-9666-1ef09e5795a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz" podUID="59287dd3-2523-4d48-9666-1ef09e5795a0" Apr 16 00:19:56.397376 containerd[1477]: time="2026-04-16T00:19:56.397298257Z" level=error msg="encountered an error cleaning up failed sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.397492 containerd[1477]: time="2026-04-16T00:19:56.397382100Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64786d454d-bl4hx,Uid:160db7e1-a4ae-471a-bd06-9c5e497e7d3a,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.397738 kubelet[2576]: E0416 00:19:56.397661 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.397738 
kubelet[2576]: E0416 00:19:56.397726 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-64786d454d-bl4hx" Apr 16 00:19:56.397829 kubelet[2576]: E0416 00:19:56.397745 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-64786d454d-bl4hx" Apr 16 00:19:56.397829 kubelet[2576]: E0416 00:19:56.397794 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-64786d454d-bl4hx_calico-system(160db7e1-a4ae-471a-bd06-9c5e497e7d3a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-64786d454d-bl4hx_calico-system(160db7e1-a4ae-471a-bd06-9c5e497e7d3a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-64786d454d-bl4hx" podUID="160db7e1-a4ae-471a-bd06-9c5e497e7d3a" Apr 16 00:19:56.407233 containerd[1477]: time="2026-04-16T00:19:56.407194376Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-bjzcx,Uid:6e5a5131-3036-4d73-9c8f-e82ea8bfcf76,Namespace:calico-system,Attempt:0,}" Apr 16 00:19:56.436898 containerd[1477]: time="2026-04-16T00:19:56.436833252Z" level=error msg="Failed to destroy network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.437396 containerd[1477]: time="2026-04-16T00:19:56.437362115Z" level=error msg="encountered an error cleaning up failed sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.437557 containerd[1477]: time="2026-04-16T00:19:56.437533603Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zjtd7,Uid:cf25c361-9e08-4e96-8a6d-6349a738a504,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.437893 kubelet[2576]: E0416 00:19:56.437849 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.437964 kubelet[2576]: E0416 
00:19:56.437911 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-zjtd7" Apr 16 00:19:56.437964 kubelet[2576]: E0416 00:19:56.437931 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-zjtd7" Apr 16 00:19:56.438043 kubelet[2576]: E0416 00:19:56.437981 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-zjtd7_kube-system(cf25c361-9e08-4e96-8a6d-6349a738a504)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-zjtd7_kube-system(cf25c361-9e08-4e96-8a6d-6349a738a504)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-zjtd7" podUID="cf25c361-9e08-4e96-8a6d-6349a738a504" Apr 16 00:19:56.473970 containerd[1477]: time="2026-04-16T00:19:56.473914178Z" level=error msg="Failed to destroy network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.474592 containerd[1477]: time="2026-04-16T00:19:56.474407200Z" level=error msg="encountered an error cleaning up failed sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.474592 containerd[1477]: time="2026-04-16T00:19:56.474495644Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b9458fc5c-d2f82,Uid:0326abb9-695f-49d0-97be-8ec08eebabaf,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.474927 kubelet[2576]: E0416 00:19:56.474755 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.474927 kubelet[2576]: E0416 00:19:56.474820 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/whisker-5b9458fc5c-d2f82" Apr 16 00:19:56.474927 kubelet[2576]: E0416 00:19:56.474845 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b9458fc5c-d2f82" Apr 16 00:19:56.475043 kubelet[2576]: E0416 00:19:56.474894 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5b9458fc5c-d2f82_calico-system(0326abb9-695f-49d0-97be-8ec08eebabaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5b9458fc5c-d2f82_calico-system(0326abb9-695f-49d0-97be-8ec08eebabaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b9458fc5c-d2f82" podUID="0326abb9-695f-49d0-97be-8ec08eebabaf" Apr 16 00:19:56.508286 containerd[1477]: time="2026-04-16T00:19:56.508066935Z" level=error msg="Failed to destroy network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.509825 containerd[1477]: time="2026-04-16T00:19:56.509435875Z" level=error msg="encountered an error cleaning up failed sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.509825 containerd[1477]: time="2026-04-16T00:19:56.509580282Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-f6npc,Uid:7b041080-6884-4ac4-b0b2-bb667ac35753,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.510780 kubelet[2576]: E0416 00:19:56.510698 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.510780 kubelet[2576]: E0416 00:19:56.510767 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7dd9588bc7-f6npc" Apr 16 00:19:56.510995 kubelet[2576]: E0416 00:19:56.510787 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7dd9588bc7-f6npc" Apr 16 00:19:56.510995 kubelet[2576]: E0416 00:19:56.510843 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7dd9588bc7-f6npc_calico-system(7b041080-6884-4ac4-b0b2-bb667ac35753)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7dd9588bc7-f6npc_calico-system(7b041080-6884-4ac4-b0b2-bb667ac35753)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7dd9588bc7-f6npc" podUID="7b041080-6884-4ac4-b0b2-bb667ac35753" Apr 16 00:19:56.517851 containerd[1477]: time="2026-04-16T00:19:56.517674081Z" level=error msg="Failed to destroy network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.518183 containerd[1477]: time="2026-04-16T00:19:56.518034577Z" level=error msg="encountered an error cleaning up failed sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.518183 containerd[1477]: time="2026-04-16T00:19:56.518113021Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-9f7667bb8-xxdtw,Uid:1c032d05-2ae2-4d00-a7b9-b6421d73e412,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.518358 kubelet[2576]: E0416 00:19:56.518320 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.518424 kubelet[2576]: E0416 00:19:56.518377 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-xxdtw" Apr 16 00:19:56.518424 kubelet[2576]: E0416 00:19:56.518396 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-xxdtw" Apr 16 00:19:56.518516 kubelet[2576]: E0416 00:19:56.518443 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"goldmane-9f7667bb8-xxdtw_calico-system(1c032d05-2ae2-4d00-a7b9-b6421d73e412)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-xxdtw_calico-system(1c032d05-2ae2-4d00-a7b9-b6421d73e412)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-xxdtw" podUID="1c032d05-2ae2-4d00-a7b9-b6421d73e412" Apr 16 00:19:56.531215 containerd[1477]: time="2026-04-16T00:19:56.531155160Z" level=error msg="Failed to destroy network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.531663 containerd[1477]: time="2026-04-16T00:19:56.531630141Z" level=error msg="encountered an error cleaning up failed sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.531729 containerd[1477]: time="2026-04-16T00:19:56.531692824Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bjzcx,Uid:6e5a5131-3036-4d73-9c8f-e82ea8bfcf76,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.532027 kubelet[2576]: E0416 00:19:56.531986 2576 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.532071 kubelet[2576]: E0416 00:19:56.532051 2576 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:56.532098 kubelet[2576]: E0416 00:19:56.532071 2576 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bjzcx" Apr 16 00:19:56.532412 kubelet[2576]: E0416 00:19:56.532130 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bjzcx_calico-system(6e5a5131-3036-4d73-9c8f-e82ea8bfcf76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bjzcx_calico-system(6e5a5131-3036-4d73-9c8f-e82ea8bfcf76)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76" Apr 16 00:19:56.538030 kubelet[2576]: I0416 00:19:56.537984 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:19:56.539343 containerd[1477]: time="2026-04-16T00:19:56.539183796Z" level=info msg="StopPodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\"" Apr 16 00:19:56.539875 containerd[1477]: time="2026-04-16T00:19:56.539830425Z" level=info msg="Ensure that sandbox 846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23 in task-service has been cleanup successfully" Apr 16 00:19:56.540905 kubelet[2576]: I0416 00:19:56.540876 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:19:56.542960 containerd[1477]: time="2026-04-16T00:19:56.542687712Z" level=info msg="StopPodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\"" Apr 16 00:19:56.543730 containerd[1477]: time="2026-04-16T00:19:56.543700077Z" level=info msg="Ensure that sandbox f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754 in task-service has been cleanup successfully" Apr 16 00:19:56.547350 kubelet[2576]: I0416 00:19:56.547316 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:19:56.547993 containerd[1477]: time="2026-04-16T00:19:56.547961906Z" level=info msg="StopPodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\"" Apr 16 00:19:56.548654 containerd[1477]: 
time="2026-04-16T00:19:56.548571773Z" level=info msg="Ensure that sandbox c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9 in task-service has been cleanup successfully" Apr 16 00:19:56.551534 kubelet[2576]: I0416 00:19:56.551394 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:19:56.553367 containerd[1477]: time="2026-04-16T00:19:56.552788680Z" level=info msg="StopPodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\"" Apr 16 00:19:56.555137 containerd[1477]: time="2026-04-16T00:19:56.554795289Z" level=info msg="Ensure that sandbox 4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731 in task-service has been cleanup successfully" Apr 16 00:19:56.577690 kubelet[2576]: I0416 00:19:56.577293 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:19:56.583584 containerd[1477]: time="2026-04-16T00:19:56.583544086Z" level=info msg="StopPodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\"" Apr 16 00:19:56.585763 containerd[1477]: time="2026-04-16T00:19:56.585437690Z" level=info msg="Ensure that sandbox 05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a in task-service has been cleanup successfully" Apr 16 00:19:56.594340 kubelet[2576]: I0416 00:19:56.594110 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:19:56.598978 containerd[1477]: time="2026-04-16T00:19:56.598925609Z" level=info msg="StopPodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\"" Apr 16 00:19:56.599422 containerd[1477]: time="2026-04-16T00:19:56.599401950Z" level=info msg="Ensure that sandbox 
d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6 in task-service has been cleanup successfully" Apr 16 00:19:56.607358 containerd[1477]: time="2026-04-16T00:19:56.607152614Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Apr 16 00:19:56.614597 kubelet[2576]: I0416 00:19:56.614540 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:19:56.621551 containerd[1477]: time="2026-04-16T00:19:56.621448969Z" level=info msg="StopPodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\"" Apr 16 00:19:56.628065 containerd[1477]: time="2026-04-16T00:19:56.627927096Z" level=info msg="Ensure that sandbox c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058 in task-service has been cleanup successfully" Apr 16 00:19:56.648409 kubelet[2576]: I0416 00:19:56.648336 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:19:56.656919 containerd[1477]: time="2026-04-16T00:19:56.656525766Z" level=info msg="StopPodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\"" Apr 16 00:19:56.658100 containerd[1477]: time="2026-04-16T00:19:56.658048074Z" level=info msg="Ensure that sandbox da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45 in task-service has been cleanup successfully" Apr 16 00:19:56.735354 containerd[1477]: time="2026-04-16T00:19:56.735306024Z" level=info msg="CreateContainer within sandbox \"5440140815d27208342b394cc3358fc4b1328bc0c4f4f3a699aa4a3216395fe1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ce3147a1f53e0cf2310b4b89d3d7f0f532b41911d3651ee20d62e2440f6d90a0\"" Apr 16 00:19:56.741294 
containerd[1477]: time="2026-04-16T00:19:56.738817820Z" level=info msg="StartContainer for \"ce3147a1f53e0cf2310b4b89d3d7f0f532b41911d3651ee20d62e2440f6d90a0\"" Apr 16 00:19:56.770369 containerd[1477]: time="2026-04-16T00:19:56.770210134Z" level=error msg="StopPodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" failed" error="failed to destroy network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.771715 kubelet[2576]: E0416 00:19:56.771453 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:19:56.772400 kubelet[2576]: E0416 00:19:56.772194 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23"} Apr 16 00:19:56.772400 kubelet[2576]: E0416 00:19:56.772315 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.772400 
kubelet[2576]: E0416 00:19:56.772344 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bjzcx" podUID="6e5a5131-3036-4d73-9c8f-e82ea8bfcf76" Apr 16 00:19:56.776128 containerd[1477]: time="2026-04-16T00:19:56.776070034Z" level=error msg="StopPodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" failed" error="failed to destroy network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.776646 kubelet[2576]: E0416 00:19:56.776429 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:19:56.776646 kubelet[2576]: E0416 00:19:56.776496 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754"} Apr 16 00:19:56.776646 kubelet[2576]: E0416 00:19:56.776533 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to 
\"KillPodSandbox\" for \"160db7e1-a4ae-471a-bd06-9c5e497e7d3a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.776646 kubelet[2576]: E0416 00:19:56.776559 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"160db7e1-a4ae-471a-bd06-9c5e497e7d3a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-64786d454d-bl4hx" podUID="160db7e1-a4ae-471a-bd06-9c5e497e7d3a" Apr 16 00:19:56.789080 containerd[1477]: time="2026-04-16T00:19:56.789021689Z" level=error msg="StopPodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" failed" error="failed to destroy network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.789911 kubelet[2576]: E0416 00:19:56.789739 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" podSandboxID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:19:56.789911 kubelet[2576]: E0416 00:19:56.789805 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731"} Apr 16 00:19:56.789911 kubelet[2576]: E0416 00:19:56.789838 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"cf25c361-9e08-4e96-8a6d-6349a738a504\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.789911 kubelet[2576]: E0416 00:19:56.789864 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"cf25c361-9e08-4e96-8a6d-6349a738a504\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-zjtd7" podUID="cf25c361-9e08-4e96-8a6d-6349a738a504" Apr 16 00:19:56.800927 containerd[1477]: time="2026-04-16T00:19:56.800857415Z" level=error msg="StopPodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" failed" error="failed to destroy network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Apr 16 00:19:56.801781 kubelet[2576]: E0416 00:19:56.801593 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:19:56.801781 kubelet[2576]: E0416 00:19:56.801696 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9"} Apr 16 00:19:56.801781 kubelet[2576]: E0416 00:19:56.801727 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"60276bc4-09cf-4ae6-9503-beaf347fe010\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.801781 kubelet[2576]: E0416 00:19:56.801755 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"60276bc4-09cf-4ae6-9503-beaf347fe010\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-xph97" podUID="60276bc4-09cf-4ae6-9503-beaf347fe010" Apr 16 00:19:56.816573 
containerd[1477]: time="2026-04-16T00:19:56.816051129Z" level=error msg="StopPodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" failed" error="failed to destroy network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.816768 kubelet[2576]: E0416 00:19:56.816293 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:19:56.816768 kubelet[2576]: E0416 00:19:56.816345 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058"} Apr 16 00:19:56.816768 kubelet[2576]: E0416 00:19:56.816377 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1c032d05-2ae2-4d00-a7b9-b6421d73e412\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.816768 kubelet[2576]: E0416 00:19:56.816416 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1c032d05-2ae2-4d00-a7b9-b6421d73e412\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-xxdtw" podUID="1c032d05-2ae2-4d00-a7b9-b6421d73e412" Apr 16 00:19:56.820908 containerd[1477]: time="2026-04-16T00:19:56.819309714Z" level=error msg="StopPodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" failed" error="failed to destroy network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.821072 kubelet[2576]: E0416 00:19:56.819725 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:19:56.821072 kubelet[2576]: E0416 00:19:56.819790 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a"} Apr 16 00:19:56.821072 kubelet[2576]: E0416 00:19:56.819822 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59287dd3-2523-4d48-9666-1ef09e5795a0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.821072 kubelet[2576]: E0416 00:19:56.819893 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59287dd3-2523-4d48-9666-1ef09e5795a0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz" podUID="59287dd3-2523-4d48-9666-1ef09e5795a0" Apr 16 00:19:56.825210 containerd[1477]: time="2026-04-16T00:19:56.825143053Z" level=error msg="StopPodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" failed" error="failed to destroy network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.825684 kubelet[2576]: E0416 00:19:56.825426 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:19:56.825684 kubelet[2576]: E0416 00:19:56.825539 2576 
kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6"} Apr 16 00:19:56.825684 kubelet[2576]: E0416 00:19:56.825572 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0326abb9-695f-49d0-97be-8ec08eebabaf\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.825684 kubelet[2576]: E0416 00:19:56.825599 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0326abb9-695f-49d0-97be-8ec08eebabaf\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b9458fc5c-d2f82" podUID="0326abb9-695f-49d0-97be-8ec08eebabaf" Apr 16 00:19:56.836843 systemd[1]: Started cri-containerd-ce3147a1f53e0cf2310b4b89d3d7f0f532b41911d3651ee20d62e2440f6d90a0.scope - libcontainer container ce3147a1f53e0cf2310b4b89d3d7f0f532b41911d3651ee20d62e2440f6d90a0. 
Apr 16 00:19:56.845662 containerd[1477]: time="2026-04-16T00:19:56.843576311Z" level=error msg="StopPodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" failed" error="failed to destroy network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 00:19:56.845837 kubelet[2576]: E0416 00:19:56.843887 2576 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:19:56.845837 kubelet[2576]: E0416 00:19:56.843937 2576 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45"} Apr 16 00:19:56.845837 kubelet[2576]: E0416 00:19:56.843968 2576 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7b041080-6884-4ac4-b0b2-bb667ac35753\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 16 00:19:56.845837 kubelet[2576]: E0416 00:19:56.844026 2576 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7b041080-6884-4ac4-b0b2-bb667ac35753\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7dd9588bc7-f6npc" podUID="7b041080-6884-4ac4-b0b2-bb667ac35753" Apr 16 00:19:56.872278 containerd[1477]: time="2026-04-16T00:19:56.872125339Z" level=info msg="StartContainer for \"ce3147a1f53e0cf2310b4b89d3d7f0f532b41911d3651ee20d62e2440f6d90a0\" returns successfully" Apr 16 00:19:57.657045 containerd[1477]: time="2026-04-16T00:19:57.656579548Z" level=info msg="StopPodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\"" Apr 16 00:19:57.710845 kubelet[2576]: I0416 00:19:57.710642 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-8sr9s" podStartSLOduration=1.958256497 podStartE2EDuration="15.710624558s" podCreationTimestamp="2026-04-16 00:19:42 +0000 UTC" firstStartedPulling="2026-04-16 00:19:42.816357847 +0000 UTC m=+22.574231612" lastFinishedPulling="2026-04-16 00:19:56.568725908 +0000 UTC m=+36.326599673" observedRunningTime="2026-04-16 00:19:57.705588548 +0000 UTC m=+37.463462313" watchObservedRunningTime="2026-04-16 00:19:57.710624558 +0000 UTC m=+37.468498323" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.740 [INFO][3818] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.741 [INFO][3818] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" iface="eth0" netns="/var/run/netns/cni-88de4400-62f3-b3b4-e4f4-26334b723c56" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.742 [INFO][3818] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" iface="eth0" netns="/var/run/netns/cni-88de4400-62f3-b3b4-e4f4-26334b723c56" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.743 [INFO][3818] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" iface="eth0" netns="/var/run/netns/cni-88de4400-62f3-b3b4-e4f4-26334b723c56" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.743 [INFO][3818] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.743 [INFO][3818] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.793 [INFO][3825] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.793 [INFO][3825] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.793 [INFO][3825] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.803 [WARNING][3825] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.803 [INFO][3825] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.806 [INFO][3825] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:19:57.812397 containerd[1477]: 2026-04-16 00:19:57.809 [INFO][3818] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:19:57.816082 containerd[1477]: time="2026-04-16T00:19:57.815873779Z" level=info msg="TearDown network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" successfully" Apr 16 00:19:57.816082 containerd[1477]: time="2026-04-16T00:19:57.815925101Z" level=info msg="StopPodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" returns successfully" Apr 16 00:19:57.816553 systemd[1]: run-netns-cni\x2d88de4400\x2d62f3\x2db3b4\x2de4f4\x2d26334b723c56.mount: Deactivated successfully. 
Apr 16 00:19:57.946667 kubelet[2576]: I0416 00:19:57.946472 2576 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-ca-bundle\") pod \"0326abb9-695f-49d0-97be-8ec08eebabaf\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " Apr 16 00:19:57.946667 kubelet[2576]: I0416 00:19:57.946589 2576 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-backend-key-pair\") pod \"0326abb9-695f-49d0-97be-8ec08eebabaf\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " Apr 16 00:19:57.948096 kubelet[2576]: I0416 00:19:57.947292 2576 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/0326abb9-695f-49d0-97be-8ec08eebabaf-kube-api-access-k7qh9\" (UniqueName: \"kubernetes.io/projected/0326abb9-695f-49d0-97be-8ec08eebabaf-kube-api-access-k7qh9\") pod \"0326abb9-695f-49d0-97be-8ec08eebabaf\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " Apr 16 00:19:57.948096 kubelet[2576]: I0416 00:19:57.947363 2576 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-nginx-config\" (UniqueName: \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-nginx-config\") pod \"0326abb9-695f-49d0-97be-8ec08eebabaf\" (UID: \"0326abb9-695f-49d0-97be-8ec08eebabaf\") " Apr 16 00:19:57.948096 kubelet[2576]: I0416 00:19:57.947989 2576 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-nginx-config" pod "0326abb9-695f-49d0-97be-8ec08eebabaf" (UID: "0326abb9-695f-49d0-97be-8ec08eebabaf"). 
InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 00:19:57.950176 kubelet[2576]: I0416 00:19:57.950042 2576 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-ca-bundle" pod "0326abb9-695f-49d0-97be-8ec08eebabaf" (UID: "0326abb9-695f-49d0-97be-8ec08eebabaf"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 00:19:57.955017 systemd[1]: var-lib-kubelet-pods-0326abb9\x2d695f\x2d49d0\x2d97be\x2d8ec08eebabaf-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Apr 16 00:19:57.956789 kubelet[2576]: I0416 00:19:57.956703 2576 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0326abb9-695f-49d0-97be-8ec08eebabaf-kube-api-access-k7qh9" pod "0326abb9-695f-49d0-97be-8ec08eebabaf" (UID: "0326abb9-695f-49d0-97be-8ec08eebabaf"). InnerVolumeSpecName "kube-api-access-k7qh9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 00:19:57.956789 kubelet[2576]: I0416 00:19:57.956622 2576 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-backend-key-pair" pod "0326abb9-695f-49d0-97be-8ec08eebabaf" (UID: "0326abb9-695f-49d0-97be-8ec08eebabaf"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 00:19:57.958741 systemd[1]: var-lib-kubelet-pods-0326abb9\x2d695f\x2d49d0\x2d97be\x2d8ec08eebabaf-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dk7qh9.mount: Deactivated successfully. 
Apr 16 00:19:58.048630 kubelet[2576]: I0416 00:19:58.048467 2576 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-ca-bundle\") on node \"ci-4081-3-6-n-42941c021f\" DevicePath \"\"" Apr 16 00:19:58.048928 kubelet[2576]: I0416 00:19:58.048516 2576 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0326abb9-695f-49d0-97be-8ec08eebabaf-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-42941c021f\" DevicePath \"\"" Apr 16 00:19:58.048928 kubelet[2576]: I0416 00:19:58.048841 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7qh9\" (UniqueName: \"kubernetes.io/projected/0326abb9-695f-49d0-97be-8ec08eebabaf-kube-api-access-k7qh9\") on node \"ci-4081-3-6-n-42941c021f\" DevicePath \"\"" Apr 16 00:19:58.048928 kubelet[2576]: I0416 00:19:58.048895 2576 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/0326abb9-695f-49d0-97be-8ec08eebabaf-nginx-config\") on node \"ci-4081-3-6-n-42941c021f\" DevicePath \"\"" Apr 16 00:19:58.390622 systemd[1]: Removed slice kubepods-besteffort-pod0326abb9_695f_49d0_97be_8ec08eebabaf.slice - libcontainer container kubepods-besteffort-pod0326abb9_695f_49d0_97be_8ec08eebabaf.slice. Apr 16 00:19:58.787978 systemd[1]: Created slice kubepods-besteffort-pod7bdf3dbc_7b05_495f_9143_91edc7c8339f.slice - libcontainer container kubepods-besteffort-pod7bdf3dbc_7b05_495f_9143_91edc7c8339f.slice. 
Apr 16 00:19:58.862157 kubelet[2576]: I0416 00:19:58.862113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/7bdf3dbc-7b05-495f-9143-91edc7c8339f-nginx-config\") pod \"whisker-5b9fc49c58-xc496\" (UID: \"7bdf3dbc-7b05-495f-9143-91edc7c8339f\") " pod="calico-system/whisker-5b9fc49c58-xc496" Apr 16 00:19:58.862624 kubelet[2576]: I0416 00:19:58.862350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7bdf3dbc-7b05-495f-9143-91edc7c8339f-whisker-backend-key-pair\") pod \"whisker-5b9fc49c58-xc496\" (UID: \"7bdf3dbc-7b05-495f-9143-91edc7c8339f\") " pod="calico-system/whisker-5b9fc49c58-xc496" Apr 16 00:19:58.862624 kubelet[2576]: I0416 00:19:58.862394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bdf3dbc-7b05-495f-9143-91edc7c8339f-whisker-ca-bundle\") pod \"whisker-5b9fc49c58-xc496\" (UID: \"7bdf3dbc-7b05-495f-9143-91edc7c8339f\") " pod="calico-system/whisker-5b9fc49c58-xc496" Apr 16 00:19:58.862624 kubelet[2576]: I0416 00:19:58.862414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhdfh\" (UniqueName: \"kubernetes.io/projected/7bdf3dbc-7b05-495f-9143-91edc7c8339f-kube-api-access-zhdfh\") pod \"whisker-5b9fc49c58-xc496\" (UID: \"7bdf3dbc-7b05-495f-9143-91edc7c8339f\") " pod="calico-system/whisker-5b9fc49c58-xc496" Apr 16 00:19:58.876837 kernel: calico-node[3875]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Apr 16 00:19:59.100119 containerd[1477]: time="2026-04-16T00:19:59.099990528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b9fc49c58-xc496,Uid:7bdf3dbc-7b05-495f-9143-91edc7c8339f,Namespace:calico-system,Attempt:0,}" Apr 16 
00:19:59.296458 systemd-networkd[1354]: calie930213c74b: Link UP Apr 16 00:19:59.297250 systemd-networkd[1354]: calie930213c74b: Gained carrier Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.175 [INFO][3994] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0 whisker-5b9fc49c58- calico-system 7bdf3dbc-7b05-495f-9143-91edc7c8339f 892 0 2026-04-16 00:19:58 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5b9fc49c58 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f whisker-5b9fc49c58-xc496 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie930213c74b [] [] }} ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.176 [INFO][3994] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.212 [INFO][4005] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" HandleID="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.228 [INFO][4005] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" HandleID="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb880), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"whisker-5b9fc49c58-xc496", "timestamp":"2026-04-16 00:19:59.21210859 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001862c0)} Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.228 [INFO][4005] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.228 [INFO][4005] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.228 [INFO][4005] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.235 [INFO][4005] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.245 [INFO][4005] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.252 [INFO][4005] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.255 [INFO][4005] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.258 [INFO][4005] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.258 [INFO][4005] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.260 [INFO][4005] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.266 [INFO][4005] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.273 [INFO][4005] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.65/26] block=192.168.25.64/26 handle="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.273 [INFO][4005] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.65/26] handle="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" host="ci-4081-3-6-n-42941c021f" Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.273 [INFO][4005] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:19:59.327103 containerd[1477]: 2026-04-16 00:19:59.273 [INFO][4005] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.65/26] IPv6=[] ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" HandleID="k8s-pod-network.3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.328780 containerd[1477]: 2026-04-16 00:19:59.277 [INFO][3994] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0", GenerateName:"whisker-5b9fc49c58-", Namespace:"calico-system", SelfLink:"", UID:"7bdf3dbc-7b05-495f-9143-91edc7c8339f", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b9fc49c58", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"whisker-5b9fc49c58-xc496", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.25.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie930213c74b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:19:59.328780 containerd[1477]: 2026-04-16 00:19:59.278 [INFO][3994] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.65/32] ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.328780 containerd[1477]: 2026-04-16 00:19:59.278 [INFO][3994] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie930213c74b ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.328780 containerd[1477]: 2026-04-16 00:19:59.298 [INFO][3994] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.328780 containerd[1477]: 2026-04-16 00:19:59.299 [INFO][3994] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0", GenerateName:"whisker-5b9fc49c58-", Namespace:"calico-system", SelfLink:"", UID:"7bdf3dbc-7b05-495f-9143-91edc7c8339f", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b9fc49c58", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca", Pod:"whisker-5b9fc49c58-xc496", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.25.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie930213c74b", MAC:"8a:85:e8:95:41:d5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:19:59.328780 containerd[1477]: 2026-04-16 00:19:59.320 [INFO][3994] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca" Namespace="calico-system" Pod="whisker-5b9fc49c58-xc496" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9fc49c58--xc496-eth0" Apr 16 00:19:59.352755 containerd[1477]: time="2026-04-16T00:19:59.351518450Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:19:59.352755 containerd[1477]: time="2026-04-16T00:19:59.351691057Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:19:59.352755 containerd[1477]: time="2026-04-16T00:19:59.351750499Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:59.352755 containerd[1477]: time="2026-04-16T00:19:59.351992108Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:19:59.380014 systemd[1]: Started cri-containerd-3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca.scope - libcontainer container 3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca. 
Apr 16 00:19:59.434956 systemd-networkd[1354]: vxlan.calico: Link UP Apr 16 00:19:59.434965 systemd-networkd[1354]: vxlan.calico: Gained carrier Apr 16 00:19:59.457009 containerd[1477]: time="2026-04-16T00:19:59.456956748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b9fc49c58-xc496,Uid:7bdf3dbc-7b05-495f-9143-91edc7c8339f,Namespace:calico-system,Attempt:0,} returns sandbox id \"3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca\"" Apr 16 00:19:59.461270 containerd[1477]: time="2026-04-16T00:19:59.460945534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 16 00:20:00.384585 kubelet[2576]: I0416 00:20:00.384516 2576 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="0326abb9-695f-49d0-97be-8ec08eebabaf" path="/var/lib/kubelet/pods/0326abb9-695f-49d0-97be-8ec08eebabaf/volumes" Apr 16 00:20:00.500926 systemd-networkd[1354]: calie930213c74b: Gained IPv6LL Apr 16 00:20:01.204669 systemd-networkd[1354]: vxlan.calico: Gained IPv6LL Apr 16 00:20:01.226114 containerd[1477]: time="2026-04-16T00:20:01.225952781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:01.229269 containerd[1477]: time="2026-04-16T00:20:01.229228967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Apr 16 00:20:01.231736 containerd[1477]: time="2026-04-16T00:20:01.231110987Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:01.234885 containerd[1477]: time="2026-04-16T00:20:01.234831387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:01.236368 
containerd[1477]: time="2026-04-16T00:20:01.236312394Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.775317539s" Apr 16 00:20:01.236589 containerd[1477]: time="2026-04-16T00:20:01.236567483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Apr 16 00:20:01.243369 containerd[1477]: time="2026-04-16T00:20:01.243218897Z" level=info msg="CreateContainer within sandbox \"3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 16 00:20:01.257596 containerd[1477]: time="2026-04-16T00:20:01.257548357Z" level=info msg="CreateContainer within sandbox \"3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"534411cb419cef2b942b169032817b20a7c0c89745bdcb97eae800a58817652e\"" Apr 16 00:20:01.258978 containerd[1477]: time="2026-04-16T00:20:01.258788597Z" level=info msg="StartContainer for \"534411cb419cef2b942b169032817b20a7c0c89745bdcb97eae800a58817652e\"" Apr 16 00:20:01.297977 systemd[1]: Started cri-containerd-534411cb419cef2b942b169032817b20a7c0c89745bdcb97eae800a58817652e.scope - libcontainer container 534411cb419cef2b942b169032817b20a7c0c89745bdcb97eae800a58817652e. 
Apr 16 00:20:01.336350 containerd[1477]: time="2026-04-16T00:20:01.335800833Z" level=info msg="StartContainer for \"534411cb419cef2b942b169032817b20a7c0c89745bdcb97eae800a58817652e\" returns successfully" Apr 16 00:20:01.340081 containerd[1477]: time="2026-04-16T00:20:01.339934686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 16 00:20:03.459075 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount828868999.mount: Deactivated successfully. Apr 16 00:20:03.479925 containerd[1477]: time="2026-04-16T00:20:03.479869671Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:03.481292 containerd[1477]: time="2026-04-16T00:20:03.481057994Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Apr 16 00:20:03.483631 containerd[1477]: time="2026-04-16T00:20:03.482429483Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:03.485095 containerd[1477]: time="2026-04-16T00:20:03.485033457Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:03.486226 containerd[1477]: time="2026-04-16T00:20:03.486067494Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.146071846s" Apr 16 00:20:03.486226 containerd[1477]: 
time="2026-04-16T00:20:03.486110336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Apr 16 00:20:03.493018 containerd[1477]: time="2026-04-16T00:20:03.492977383Z" level=info msg="CreateContainer within sandbox \"3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 16 00:20:03.515877 containerd[1477]: time="2026-04-16T00:20:03.514880410Z" level=info msg="CreateContainer within sandbox \"3cccc8ce89454ace53c443076e2e605224d52770e9c6be5f6cafc6e1fa82deca\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"c21807cc89bc293a34ee2198470fcd4bc30de43fe4a7f37de1e6d5edac7bd086\"" Apr 16 00:20:03.521278 containerd[1477]: time="2026-04-16T00:20:03.521211638Z" level=info msg="StartContainer for \"c21807cc89bc293a34ee2198470fcd4bc30de43fe4a7f37de1e6d5edac7bd086\"" Apr 16 00:20:03.556848 systemd[1]: Started cri-containerd-c21807cc89bc293a34ee2198470fcd4bc30de43fe4a7f37de1e6d5edac7bd086.scope - libcontainer container c21807cc89bc293a34ee2198470fcd4bc30de43fe4a7f37de1e6d5edac7bd086. 
Apr 16 00:20:03.597351 containerd[1477]: time="2026-04-16T00:20:03.597302493Z" level=info msg="StartContainer for \"c21807cc89bc293a34ee2198470fcd4bc30de43fe4a7f37de1e6d5edac7bd086\" returns successfully" Apr 16 00:20:07.382294 containerd[1477]: time="2026-04-16T00:20:07.381918507Z" level=info msg="StopPodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\"" Apr 16 00:20:07.442550 kubelet[2576]: I0416 00:20:07.441264 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-5b9fc49c58-xc496" podStartSLOduration=5.41437507 podStartE2EDuration="9.441247614s" podCreationTimestamp="2026-04-16 00:19:58 +0000 UTC" firstStartedPulling="2026-04-16 00:19:59.460437355 +0000 UTC m=+39.218311080" lastFinishedPulling="2026-04-16 00:20:03.487309859 +0000 UTC m=+43.245183624" observedRunningTime="2026-04-16 00:20:03.70987682 +0000 UTC m=+43.467750585" watchObservedRunningTime="2026-04-16 00:20:07.441247614 +0000 UTC m=+47.199121339" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.438 [INFO][4285] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.439 [INFO][4285] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" iface="eth0" netns="/var/run/netns/cni-aef80098-8d76-72ed-c16b-752d7586bcc7" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.440 [INFO][4285] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" iface="eth0" netns="/var/run/netns/cni-aef80098-8d76-72ed-c16b-752d7586bcc7" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.440 [INFO][4285] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" iface="eth0" netns="/var/run/netns/cni-aef80098-8d76-72ed-c16b-752d7586bcc7" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.440 [INFO][4285] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.440 [INFO][4285] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.467 [INFO][4292] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.467 [INFO][4292] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.467 [INFO][4292] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.478 [WARNING][4292] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.478 [INFO][4292] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.481 [INFO][4292] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:07.487646 containerd[1477]: 2026-04-16 00:20:07.483 [INFO][4285] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:07.488329 containerd[1477]: time="2026-04-16T00:20:07.488193043Z" level=info msg="TearDown network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" successfully" Apr 16 00:20:07.488329 containerd[1477]: time="2026-04-16T00:20:07.488227684Z" level=info msg="StopPodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" returns successfully" Apr 16 00:20:07.489698 systemd[1]: run-netns-cni\x2daef80098\x2d8d76\x2d72ed\x2dc16b\x2d752d7586bcc7.mount: Deactivated successfully. 
Apr 16 00:20:07.492777 containerd[1477]: time="2026-04-16T00:20:07.492359736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bjzcx,Uid:6e5a5131-3036-4d73-9c8f-e82ea8bfcf76,Namespace:calico-system,Attempt:1,}" Apr 16 00:20:07.646171 systemd-networkd[1354]: cali973e535c87f: Link UP Apr 16 00:20:07.646838 systemd-networkd[1354]: cali973e535c87f: Gained carrier Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.550 [INFO][4299] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0 csi-node-driver- calico-system 6e5a5131-3036-4d73-9c8f-e82ea8bfcf76 935 0 2026-04-16 00:19:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f csi-node-driver-bjzcx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali973e535c87f [] [] }} ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.552 [INFO][4299] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.582 [INFO][4311] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" 
HandleID="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.596 [INFO][4311] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" HandleID="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"csi-node-driver-bjzcx", "timestamp":"2026-04-16 00:20:07.582843244 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003aaf20)} Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.596 [INFO][4311] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.596 [INFO][4311] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.596 [INFO][4311] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.599 [INFO][4311] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.605 [INFO][4311] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.613 [INFO][4311] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.616 [INFO][4311] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.620 [INFO][4311] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.620 [INFO][4311] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.623 [INFO][4311] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52 Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.630 [INFO][4311] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.639 [INFO][4311] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.66/26] block=192.168.25.64/26 handle="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.639 [INFO][4311] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.66/26] handle="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.639 [INFO][4311] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:07.672478 containerd[1477]: 2026-04-16 00:20:07.639 [INFO][4311] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.66/26] IPv6=[] ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" HandleID="k8s-pod-network.ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.674578 containerd[1477]: 2026-04-16 00:20:07.643 [INFO][4299] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"csi-node-driver-bjzcx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.25.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali973e535c87f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:07.674578 containerd[1477]: 2026-04-16 00:20:07.643 [INFO][4299] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.66/32] ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.674578 containerd[1477]: 2026-04-16 00:20:07.643 [INFO][4299] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali973e535c87f ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.674578 containerd[1477]: 2026-04-16 00:20:07.649 [INFO][4299] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.674578 
containerd[1477]: 2026-04-16 00:20:07.650 [INFO][4299] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52", Pod:"csi-node-driver-bjzcx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.25.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali973e535c87f", MAC:"22:e6:b2:81:e2:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:07.674578 containerd[1477]: 
2026-04-16 00:20:07.666 [INFO][4299] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52" Namespace="calico-system" Pod="csi-node-driver-bjzcx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:07.694275 containerd[1477]: time="2026-04-16T00:20:07.693947015Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:07.694275 containerd[1477]: time="2026-04-16T00:20:07.694004617Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:07.694275 containerd[1477]: time="2026-04-16T00:20:07.694027778Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:07.694275 containerd[1477]: time="2026-04-16T00:20:07.694137101Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:07.731878 systemd[1]: Started cri-containerd-ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52.scope - libcontainer container ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52. 
Apr 16 00:20:07.760752 containerd[1477]: time="2026-04-16T00:20:07.760701240Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bjzcx,Uid:6e5a5131-3036-4d73-9c8f-e82ea8bfcf76,Namespace:calico-system,Attempt:1,} returns sandbox id \"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52\"" Apr 16 00:20:07.764576 containerd[1477]: time="2026-04-16T00:20:07.764290916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 16 00:20:08.383258 containerd[1477]: time="2026-04-16T00:20:08.383223071Z" level=info msg="StopPodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\"" Apr 16 00:20:08.385112 containerd[1477]: time="2026-04-16T00:20:08.384343226Z" level=info msg="StopPodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\"" Apr 16 00:20:08.388723 containerd[1477]: time="2026-04-16T00:20:08.388364872Z" level=info msg="StopPodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\"" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.485 [INFO][4408] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.486 [INFO][4408] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" iface="eth0" netns="/var/run/netns/cni-5cf80333-549c-6151-1101-c56c4fc746a9" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.488 [INFO][4408] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" iface="eth0" netns="/var/run/netns/cni-5cf80333-549c-6151-1101-c56c4fc746a9" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.491 [INFO][4408] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" iface="eth0" netns="/var/run/netns/cni-5cf80333-549c-6151-1101-c56c4fc746a9" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.491 [INFO][4408] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.491 [INFO][4408] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.534 [INFO][4439] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.534 [INFO][4439] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.535 [INFO][4439] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.549 [WARNING][4439] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.549 [INFO][4439] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.552 [INFO][4439] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:08.559372 containerd[1477]: 2026-04-16 00:20:08.555 [INFO][4408] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:08.561921 containerd[1477]: time="2026-04-16T00:20:08.561758612Z" level=info msg="TearDown network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" successfully" Apr 16 00:20:08.561921 containerd[1477]: time="2026-04-16T00:20:08.561793173Z" level=info msg="StopPodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" returns successfully" Apr 16 00:20:08.566476 containerd[1477]: time="2026-04-16T00:20:08.565912342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xph97,Uid:60276bc4-09cf-4ae6-9503-beaf347fe010,Namespace:kube-system,Attempt:1,}" Apr 16 00:20:08.565950 systemd[1]: run-netns-cni\x2d5cf80333\x2d549c\x2d6151\x2d1101\x2dc56c4fc746a9.mount: Deactivated successfully. 
Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.477 [INFO][4410] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.477 [INFO][4410] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" iface="eth0" netns="/var/run/netns/cni-c17d8469-30b0-ae66-98bd-5bb20d4afb36" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.477 [INFO][4410] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" iface="eth0" netns="/var/run/netns/cni-c17d8469-30b0-ae66-98bd-5bb20d4afb36" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.477 [INFO][4410] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" iface="eth0" netns="/var/run/netns/cni-c17d8469-30b0-ae66-98bd-5bb20d4afb36" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.477 [INFO][4410] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.477 [INFO][4410] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.544 [INFO][4428] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.544 
[INFO][4428] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.552 [INFO][4428] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.569 [WARNING][4428] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.569 [INFO][4428] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.571 [INFO][4428] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:08.575779 containerd[1477]: 2026-04-16 00:20:08.573 [INFO][4410] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:08.579853 containerd[1477]: time="2026-04-16T00:20:08.579732294Z" level=info msg="TearDown network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" successfully" Apr 16 00:20:08.579853 containerd[1477]: time="2026-04-16T00:20:08.579772855Z" level=info msg="StopPodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" returns successfully" Apr 16 00:20:08.580720 systemd[1]: run-netns-cni\x2dc17d8469\x2d30b0\x2dae66\x2d98bd\x2d5bb20d4afb36.mount: Deactivated successfully. 
Apr 16 00:20:08.583328 containerd[1477]: time="2026-04-16T00:20:08.583295885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-xxdtw,Uid:1c032d05-2ae2-4d00-a7b9-b6421d73e412,Namespace:calico-system,Attempt:1,}" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.487 [INFO][4407] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.487 [INFO][4407] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" iface="eth0" netns="/var/run/netns/cni-20ab520b-4d3f-2a97-6988-e3924dd0a1db" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.488 [INFO][4407] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" iface="eth0" netns="/var/run/netns/cni-20ab520b-4d3f-2a97-6988-e3924dd0a1db" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.488 [INFO][4407] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" iface="eth0" netns="/var/run/netns/cni-20ab520b-4d3f-2a97-6988-e3924dd0a1db" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.488 [INFO][4407] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.488 [INFO][4407] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.573 [INFO][4434] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.573 [INFO][4434] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.573 [INFO][4434] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.591 [WARNING][4434] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.591 [INFO][4434] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.594 [INFO][4434] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:08.603509 containerd[1477]: 2026-04-16 00:20:08.599 [INFO][4407] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:08.606444 systemd[1]: run-netns-cni\x2d20ab520b\x2d4d3f\x2d2a97\x2d6988\x2de3924dd0a1db.mount: Deactivated successfully. 
Apr 16 00:20:08.606888 containerd[1477]: time="2026-04-16T00:20:08.606476850Z" level=info msg="TearDown network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" successfully" Apr 16 00:20:08.606888 containerd[1477]: time="2026-04-16T00:20:08.606512291Z" level=info msg="StopPodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" returns successfully" Apr 16 00:20:08.610860 containerd[1477]: time="2026-04-16T00:20:08.610528937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-fkgbz,Uid:59287dd3-2523-4d48-9666-1ef09e5795a0,Namespace:calico-system,Attempt:1,}" Apr 16 00:20:08.805716 systemd-networkd[1354]: calid0009655d70: Link UP Apr 16 00:20:08.807651 systemd-networkd[1354]: calid0009655d70: Gained carrier Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.681 [INFO][4460] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0 goldmane-9f7667bb8- calico-system 1c032d05-2ae2-4d00-a7b9-b6421d73e412 945 0 2026-04-16 00:19:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f goldmane-9f7667bb8-xxdtw eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid0009655d70 [] [] }} ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.681 [INFO][4460] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" 
Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.739 [INFO][4490] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" HandleID="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.757 [INFO][4490] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" HandleID="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c190), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"goldmane-9f7667bb8-xxdtw", "timestamp":"2026-04-16 00:20:08.739909061 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400036b4a0)} Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.757 [INFO][4490] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.757 [INFO][4490] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.757 [INFO][4490] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.762 [INFO][4490] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.768 [INFO][4490] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.775 [INFO][4490] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.777 [INFO][4490] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.780 [INFO][4490] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.780 [INFO][4490] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.782 [INFO][4490] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3 Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.788 [INFO][4490] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.798 [INFO][4490] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.67/26] block=192.168.25.64/26 handle="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.799 [INFO][4490] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.67/26] handle="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.799 [INFO][4490] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:08.828937 containerd[1477]: 2026-04-16 00:20:08.799 [INFO][4490] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.67/26] IPv6=[] ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" HandleID="k8s-pod-network.00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.829596 containerd[1477]: 2026-04-16 00:20:08.801 [INFO][4460] cni-plugin/k8s.go 418: Populated endpoint ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"1c032d05-2ae2-4d00-a7b9-b6421d73e412", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"goldmane-9f7667bb8-xxdtw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.25.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid0009655d70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:08.829596 containerd[1477]: 2026-04-16 00:20:08.801 [INFO][4460] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.67/32] ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.829596 containerd[1477]: 2026-04-16 00:20:08.802 [INFO][4460] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0009655d70 ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.829596 containerd[1477]: 2026-04-16 00:20:08.808 [INFO][4460] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.829596 containerd[1477]: 2026-04-16 00:20:08.808 [INFO][4460] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"1c032d05-2ae2-4d00-a7b9-b6421d73e412", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3", Pod:"goldmane-9f7667bb8-xxdtw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.25.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid0009655d70", MAC:"82:78:00:52:35:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:08.829596 containerd[1477]: 2026-04-16 00:20:08.822 [INFO][4460] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3" Namespace="calico-system" Pod="goldmane-9f7667bb8-xxdtw" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:08.855637 containerd[1477]: time="2026-04-16T00:20:08.854201994Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:08.855637 containerd[1477]: time="2026-04-16T00:20:08.854273396Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:08.855637 containerd[1477]: time="2026-04-16T00:20:08.854288036Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:08.855637 containerd[1477]: time="2026-04-16T00:20:08.854376439Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:08.885972 systemd[1]: Started cri-containerd-00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3.scope - libcontainer container 00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3. 
Apr 16 00:20:08.938746 systemd-networkd[1354]: cali61e670d94ff: Link UP Apr 16 00:20:08.942985 systemd-networkd[1354]: cali61e670d94ff: Gained carrier Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.681 [INFO][4450] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0 coredns-7d764666f9- kube-system 60276bc4-09cf-4ae6-9503-beaf347fe010 946 0 2026-04-16 00:19:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f coredns-7d764666f9-xph97 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali61e670d94ff [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.681 [INFO][4450] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.740 [INFO][4485] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" HandleID="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.761 [INFO][4485] 
ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" HandleID="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbce0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"coredns-7d764666f9-xph97", "timestamp":"2026-04-16 00:20:08.740233711 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004311e0)} Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.761 [INFO][4485] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.799 [INFO][4485] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.799 [INFO][4485] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.865 [INFO][4485] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.885 [INFO][4485] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.894 [INFO][4485] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.897 [INFO][4485] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.901 [INFO][4485] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.902 [INFO][4485] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.904 [INFO][4485] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.913 [INFO][4485] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.927 [INFO][4485] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.68/26] block=192.168.25.64/26 handle="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.927 [INFO][4485] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.68/26] handle="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.928 [INFO][4485] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:08.973952 containerd[1477]: 2026-04-16 00:20:08.928 [INFO][4485] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.68/26] IPv6=[] ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" HandleID="k8s-pod-network.ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.975698 containerd[1477]: 2026-04-16 00:20:08.931 [INFO][4450] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"60276bc4-09cf-4ae6-9503-beaf347fe010", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"coredns-7d764666f9-xph97", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali61e670d94ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:08.975698 containerd[1477]: 2026-04-16 00:20:08.932 [INFO][4450] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.68/32] ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.975698 containerd[1477]: 2026-04-16 00:20:08.932 [INFO][4450] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali61e670d94ff 
ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.975698 containerd[1477]: 2026-04-16 00:20:08.944 [INFO][4450] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.975698 containerd[1477]: 2026-04-16 00:20:08.947 [INFO][4450] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"60276bc4-09cf-4ae6-9503-beaf347fe010", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", 
ContainerID:"ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e", Pod:"coredns-7d764666f9-xph97", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali61e670d94ff", MAC:"82:4e:3f:59:d2:98", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:08.975952 containerd[1477]: 2026-04-16 00:20:08.969 [INFO][4450] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e" Namespace="kube-system" Pod="coredns-7d764666f9-xph97" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:08.981015 containerd[1477]: time="2026-04-16T00:20:08.979550432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-xxdtw,Uid:1c032d05-2ae2-4d00-a7b9-b6421d73e412,Namespace:calico-system,Attempt:1,} returns sandbox id \"00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3\"" Apr 16 00:20:09.013413 containerd[1477]: time="2026-04-16T00:20:09.013238595Z" level=info 
msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:09.013776 containerd[1477]: time="2026-04-16T00:20:09.013642327Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:09.014880 containerd[1477]: time="2026-04-16T00:20:09.014123742Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:09.015652 containerd[1477]: time="2026-04-16T00:20:09.015618867Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:09.037810 systemd[1]: Started cri-containerd-ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e.scope - libcontainer container ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e. Apr 16 00:20:09.047910 systemd-networkd[1354]: calid60a44e5bb1: Link UP Apr 16 00:20:09.048169 systemd-networkd[1354]: calid60a44e5bb1: Gained carrier Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.710 [INFO][4471] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0 calico-apiserver-7dd9588bc7- calico-system 59287dd3-2523-4d48-9666-1ef09e5795a0 947 0 2026-04-16 00:19:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7dd9588bc7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f calico-apiserver-7dd9588bc7-fkgbz eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calid60a44e5bb1 [] [] }} ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" 
Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.711 [INFO][4471] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.762 [INFO][4496] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" HandleID="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.773 [INFO][4496] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" HandleID="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbdd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"calico-apiserver-7dd9588bc7-fkgbz", "timestamp":"2026-04-16 00:20:08.762293801 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003b5b80)} Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.773 [INFO][4496] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM 
lock. Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.928 [INFO][4496] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.928 [INFO][4496] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.964 [INFO][4496] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.987 [INFO][4496] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:08.998 [INFO][4496] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.002 [INFO][4496] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.006 [INFO][4496] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.006 [INFO][4496] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.010 [INFO][4496] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97 Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.017 [INFO][4496] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" 
host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.032 [INFO][4496] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.25.69/26] block=192.168.25.64/26 handle="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.032 [INFO][4496] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.69/26] handle="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.033 [INFO][4496] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:09.074376 containerd[1477]: 2026-04-16 00:20:09.033 [INFO][4496] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.69/26] IPv6=[] ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" HandleID="k8s-pod-network.753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.075457 containerd[1477]: 2026-04-16 00:20:09.040 [INFO][4471] cni-plugin/k8s.go 418: Populated endpoint ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"59287dd3-2523-4d48-9666-1ef09e5795a0", ResourceVersion:"947", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"calico-apiserver-7dd9588bc7-fkgbz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid60a44e5bb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:09.075457 containerd[1477]: 2026-04-16 00:20:09.041 [INFO][4471] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.69/32] ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.075457 containerd[1477]: 2026-04-16 00:20:09.041 [INFO][4471] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid60a44e5bb1 ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.075457 containerd[1477]: 2026-04-16 00:20:09.047 [INFO][4471] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.075457 containerd[1477]: 2026-04-16 00:20:09.051 [INFO][4471] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"59287dd3-2523-4d48-9666-1ef09e5795a0", ResourceVersion:"947", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97", Pod:"calico-apiserver-7dd9588bc7-fkgbz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid60a44e5bb1", MAC:"36:d4:18:fe:2e:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:09.075457 containerd[1477]: 2026-04-16 00:20:09.067 [INFO][4471] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-fkgbz" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:09.117791 containerd[1477]: time="2026-04-16T00:20:09.117658250Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:09.117791 containerd[1477]: time="2026-04-16T00:20:09.117726412Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:09.117791 containerd[1477]: time="2026-04-16T00:20:09.117760813Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:09.118714 containerd[1477]: time="2026-04-16T00:20:09.118387672Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:09.119357 containerd[1477]: time="2026-04-16T00:20:09.119289059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-xph97,Uid:60276bc4-09cf-4ae6-9503-beaf347fe010,Namespace:kube-system,Attempt:1,} returns sandbox id \"ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e\"" Apr 16 00:20:09.130235 containerd[1477]: time="2026-04-16T00:20:09.130192791Z" level=info msg="CreateContainer within sandbox \"ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 16 00:20:09.154172 systemd[1]: Started cri-containerd-753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97.scope - libcontainer container 753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97. Apr 16 00:20:09.164216 containerd[1477]: time="2026-04-16T00:20:09.164158944Z" level=info msg="CreateContainer within sandbox \"ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"48359155b51a8b6c96bce431055652e218f06d7ee0abe957bee7d829bd5dbb00\"" Apr 16 00:20:09.165228 containerd[1477]: time="2026-04-16T00:20:09.164969808Z" level=info msg="StartContainer for \"48359155b51a8b6c96bce431055652e218f06d7ee0abe957bee7d829bd5dbb00\"" Apr 16 00:20:09.231289 containerd[1477]: time="2026-04-16T00:20:09.231241704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-fkgbz,Uid:59287dd3-2523-4d48-9666-1ef09e5795a0,Namespace:calico-system,Attempt:1,} returns sandbox id \"753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97\"" Apr 16 00:20:09.231818 systemd[1]: Started cri-containerd-48359155b51a8b6c96bce431055652e218f06d7ee0abe957bee7d829bd5dbb00.scope - libcontainer container 48359155b51a8b6c96bce431055652e218f06d7ee0abe957bee7d829bd5dbb00. 
Apr 16 00:20:09.275018 containerd[1477]: time="2026-04-16T00:20:09.274962633Z" level=info msg="StartContainer for \"48359155b51a8b6c96bce431055652e218f06d7ee0abe957bee7d829bd5dbb00\" returns successfully" Apr 16 00:20:09.381522 containerd[1477]: time="2026-04-16T00:20:09.381465431Z" level=info msg="StopPodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\"" Apr 16 00:20:09.396488 systemd-networkd[1354]: cali973e535c87f: Gained IPv6LL Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.453 [INFO][4726] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.454 [INFO][4726] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" iface="eth0" netns="/var/run/netns/cni-dd369e49-8852-a3ce-83f4-c02cdc45a255" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.455 [INFO][4726] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" iface="eth0" netns="/var/run/netns/cni-dd369e49-8852-a3ce-83f4-c02cdc45a255" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.457 [INFO][4726] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" iface="eth0" netns="/var/run/netns/cni-dd369e49-8852-a3ce-83f4-c02cdc45a255" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.457 [INFO][4726] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.457 [INFO][4726] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.481 [INFO][4737] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.482 [INFO][4737] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.482 [INFO][4737] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.492 [WARNING][4737] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.493 [INFO][4737] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.495 [INFO][4737] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:09.499822 containerd[1477]: 2026-04-16 00:20:09.497 [INFO][4726] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:09.501424 containerd[1477]: time="2026-04-16T00:20:09.500527332Z" level=info msg="TearDown network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" successfully" Apr 16 00:20:09.501424 containerd[1477]: time="2026-04-16T00:20:09.500560533Z" level=info msg="StopPodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" returns successfully" Apr 16 00:20:09.504342 containerd[1477]: time="2026-04-16T00:20:09.504299406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zjtd7,Uid:cf25c361-9e08-4e96-8a6d-6349a738a504,Namespace:kube-system,Attempt:1,}" Apr 16 00:20:09.588874 systemd[1]: run-netns-cni\x2ddd369e49\x2d8852\x2da3ce\x2d83f4\x2dc02cdc45a255.mount: Deactivated successfully. 
Apr 16 00:20:09.702136 systemd-networkd[1354]: cali52222d88012: Link UP Apr 16 00:20:09.704207 systemd-networkd[1354]: cali52222d88012: Gained carrier Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.585 [INFO][4745] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0 coredns-7d764666f9- kube-system cf25c361-9e08-4e96-8a6d-6349a738a504 966 0 2026-04-16 00:19:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f coredns-7d764666f9-zjtd7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali52222d88012 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.585 [INFO][4745] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.626 [INFO][4756] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" HandleID="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.641 [INFO][4756] 
ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" HandleID="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002733d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"coredns-7d764666f9-zjtd7", "timestamp":"2026-04-16 00:20:09.626190673 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400010e2c0)} Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.641 [INFO][4756] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.641 [INFO][4756] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.641 [INFO][4756] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.645 [INFO][4756] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.654 [INFO][4756] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.662 [INFO][4756] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.666 [INFO][4756] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.671 [INFO][4756] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.671 [INFO][4756] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.675 [INFO][4756] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.684 [INFO][4756] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.695 [INFO][4756] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.70/26] block=192.168.25.64/26 handle="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.695 [INFO][4756] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.70/26] handle="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.695 [INFO][4756] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:09.744715 containerd[1477]: 2026-04-16 00:20:09.695 [INFO][4756] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.70/26] IPv6=[] ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" HandleID="k8s-pod-network.29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.745374 containerd[1477]: 2026-04-16 00:20:09.697 [INFO][4745] cni-plugin/k8s.go 418: Populated endpoint ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"cf25c361-9e08-4e96-8a6d-6349a738a504", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"coredns-7d764666f9-zjtd7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali52222d88012", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:09.745374 containerd[1477]: 2026-04-16 00:20:09.697 [INFO][4745] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.70/32] ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.745374 containerd[1477]: 2026-04-16 00:20:09.697 [INFO][4745] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali52222d88012 
ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.745374 containerd[1477]: 2026-04-16 00:20:09.702 [INFO][4745] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.745374 containerd[1477]: 2026-04-16 00:20:09.707 [INFO][4745] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"cf25c361-9e08-4e96-8a6d-6349a738a504", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", 
ContainerID:"29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c", Pod:"coredns-7d764666f9-zjtd7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali52222d88012", MAC:"1e:06:21:60:76:11", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:09.745546 containerd[1477]: 2026-04-16 00:20:09.729 [INFO][4745] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c" Namespace="kube-system" Pod="coredns-7d764666f9-zjtd7" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:09.799991 kubelet[2576]: I0416 00:20:09.799369 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-xph97" podStartSLOduration=42.799164292 podStartE2EDuration="42.799164292s" podCreationTimestamp="2026-04-16 00:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 00:20:09.773163581 +0000 UTC m=+49.531037386" watchObservedRunningTime="2026-04-16 00:20:09.799164292 +0000 UTC m=+49.557038057" Apr 16 00:20:09.813526 containerd[1477]: time="2026-04-16T00:20:09.812974872Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:09.817138 containerd[1477]: time="2026-04-16T00:20:09.817060076Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:09.817138 containerd[1477]: time="2026-04-16T00:20:09.817131798Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:09.817410 containerd[1477]: time="2026-04-16T00:20:09.817320164Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:09.872311 systemd[1]: Started cri-containerd-29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c.scope - libcontainer container 29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c. 
Apr 16 00:20:09.940735 containerd[1477]: time="2026-04-16T00:20:09.940585872Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zjtd7,Uid:cf25c361-9e08-4e96-8a6d-6349a738a504,Namespace:kube-system,Attempt:1,} returns sandbox id \"29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c\"" Apr 16 00:20:09.951366 containerd[1477]: time="2026-04-16T00:20:09.951310678Z" level=info msg="CreateContainer within sandbox \"29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 16 00:20:09.990718 containerd[1477]: time="2026-04-16T00:20:09.990351865Z" level=info msg="CreateContainer within sandbox \"29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7168a602d79e757365308b386932fcf4cb80562ae9896920334d32f5dabbcc6c\"" Apr 16 00:20:09.993805 containerd[1477]: time="2026-04-16T00:20:09.993752049Z" level=info msg="StartContainer for \"7168a602d79e757365308b386932fcf4cb80562ae9896920334d32f5dabbcc6c\"" Apr 16 00:20:10.062986 systemd[1]: Started cri-containerd-7168a602d79e757365308b386932fcf4cb80562ae9896920334d32f5dabbcc6c.scope - libcontainer container 7168a602d79e757365308b386932fcf4cb80562ae9896920334d32f5dabbcc6c. 
Apr 16 00:20:10.076741 containerd[1477]: time="2026-04-16T00:20:10.076185094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:10.078218 containerd[1477]: time="2026-04-16T00:20:10.078162992Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Apr 16 00:20:10.080495 containerd[1477]: time="2026-04-16T00:20:10.080455660Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:10.088118 containerd[1477]: time="2026-04-16T00:20:10.088062245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:10.090814 containerd[1477]: time="2026-04-16T00:20:10.090761805Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 2.326422208s" Apr 16 00:20:10.090814 containerd[1477]: time="2026-04-16T00:20:10.090813126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Apr 16 00:20:10.096259 containerd[1477]: time="2026-04-16T00:20:10.095511225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 16 00:20:10.105713 containerd[1477]: time="2026-04-16T00:20:10.105405638Z" level=info msg="StartContainer for \"7168a602d79e757365308b386932fcf4cb80562ae9896920334d32f5dabbcc6c\" returns successfully" Apr 16 
00:20:10.106559 containerd[1477]: time="2026-04-16T00:20:10.106003216Z" level=info msg="CreateContainer within sandbox \"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 16 00:20:10.149580 containerd[1477]: time="2026-04-16T00:20:10.149399419Z" level=info msg="CreateContainer within sandbox \"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"eef6ed3fa9f0bf06c829e5b8beec874ef721c30f9105317edf75181a7ff41bad\"" Apr 16 00:20:10.153649 containerd[1477]: time="2026-04-16T00:20:10.152232503Z" level=info msg="StartContainer for \"eef6ed3fa9f0bf06c829e5b8beec874ef721c30f9105317edf75181a7ff41bad\"" Apr 16 00:20:10.189897 systemd[1]: Started cri-containerd-eef6ed3fa9f0bf06c829e5b8beec874ef721c30f9105317edf75181a7ff41bad.scope - libcontainer container eef6ed3fa9f0bf06c829e5b8beec874ef721c30f9105317edf75181a7ff41bad. Apr 16 00:20:10.224120 containerd[1477]: time="2026-04-16T00:20:10.224070988Z" level=info msg="StartContainer for \"eef6ed3fa9f0bf06c829e5b8beec874ef721c30f9105317edf75181a7ff41bad\" returns successfully" Apr 16 00:20:10.382448 containerd[1477]: time="2026-04-16T00:20:10.382078982Z" level=info msg="StopPodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\"" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.447 [INFO][4920] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.447 [INFO][4920] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" iface="eth0" netns="/var/run/netns/cni-0b9cdad6-3005-f9d8-cc79-0a74837eca08" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.453 [INFO][4920] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" iface="eth0" netns="/var/run/netns/cni-0b9cdad6-3005-f9d8-cc79-0a74837eca08" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.453 [INFO][4920] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" iface="eth0" netns="/var/run/netns/cni-0b9cdad6-3005-f9d8-cc79-0a74837eca08" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.453 [INFO][4920] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.453 [INFO][4920] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.486 [INFO][4928] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.486 [INFO][4928] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.486 [INFO][4928] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.499 [WARNING][4928] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.499 [INFO][4928] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.503 [INFO][4928] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:10.509517 containerd[1477]: 2026-04-16 00:20:10.505 [INFO][4920] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:10.510266 containerd[1477]: time="2026-04-16T00:20:10.509790920Z" level=info msg="TearDown network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" successfully" Apr 16 00:20:10.510266 containerd[1477]: time="2026-04-16T00:20:10.509831241Z" level=info msg="StopPodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" returns successfully" Apr 16 00:20:10.513437 containerd[1477]: time="2026-04-16T00:20:10.512890732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-f6npc,Uid:7b041080-6884-4ac4-b0b2-bb667ac35753,Namespace:calico-system,Attempt:1,}" Apr 16 00:20:10.569135 systemd[1]: run-netns-cni\x2d0b9cdad6\x2d3005\x2df9d8\x2dcc79\x2d0a74837eca08.mount: Deactivated successfully. 
Apr 16 00:20:10.698238 systemd-networkd[1354]: cali4a3c2c51308: Link UP Apr 16 00:20:10.701579 systemd-networkd[1354]: cali4a3c2c51308: Gained carrier Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.574 [INFO][4936] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0 calico-apiserver-7dd9588bc7- calico-system 7b041080-6884-4ac4-b0b2-bb667ac35753 988 0 2026-04-16 00:19:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7dd9588bc7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f calico-apiserver-7dd9588bc7-f6npc eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali4a3c2c51308 [] [] }} ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.574 [INFO][4936] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.630 [INFO][4948] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" HandleID="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.724273 
containerd[1477]: 2026-04-16 00:20:10.648 [INFO][4948] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" HandleID="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fb860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"calico-apiserver-7dd9588bc7-f6npc", "timestamp":"2026-04-16 00:20:10.630294845 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000375760)} Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.648 [INFO][4948] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.648 [INFO][4948] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.649 [INFO][4948] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.652 [INFO][4948] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.658 [INFO][4948] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.665 [INFO][4948] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.668 [INFO][4948] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.671 [INFO][4948] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.671 [INFO][4948] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.673 [INFO][4948] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3 Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.679 [INFO][4948] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.687 [INFO][4948] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.71/26] block=192.168.25.64/26 handle="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.687 [INFO][4948] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.71/26] handle="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.687 [INFO][4948] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:10.724273 containerd[1477]: 2026-04-16 00:20:10.687 [INFO][4948] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.71/26] IPv6=[] ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" HandleID="k8s-pod-network.37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.726083 containerd[1477]: 2026-04-16 00:20:10.691 [INFO][4936] cni-plugin/k8s.go 418: Populated endpoint ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"7b041080-6884-4ac4-b0b2-bb667ac35753", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"calico-apiserver-7dd9588bc7-f6npc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4a3c2c51308", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:10.726083 containerd[1477]: 2026-04-16 00:20:10.691 [INFO][4936] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.71/32] ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.726083 containerd[1477]: 2026-04-16 00:20:10.691 [INFO][4936] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a3c2c51308 ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.726083 containerd[1477]: 2026-04-16 00:20:10.702 [INFO][4936] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" 
WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.726083 containerd[1477]: 2026-04-16 00:20:10.703 [INFO][4936] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"7b041080-6884-4ac4-b0b2-bb667ac35753", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3", Pod:"calico-apiserver-7dd9588bc7-f6npc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4a3c2c51308", MAC:"e2:02:74:9a:a4:98", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:10.726083 containerd[1477]: 2026-04-16 00:20:10.721 [INFO][4936] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3" Namespace="calico-system" Pod="calico-apiserver-7dd9588bc7-f6npc" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:10.772288 containerd[1477]: time="2026-04-16T00:20:10.769734650Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:10.772288 containerd[1477]: time="2026-04-16T00:20:10.770568834Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:10.772288 containerd[1477]: time="2026-04-16T00:20:10.770996127Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:10.772288 containerd[1477]: time="2026-04-16T00:20:10.771182452Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:10.804326 systemd-networkd[1354]: calid0009655d70: Gained IPv6LL Apr 16 00:20:10.808386 systemd[1]: Started cri-containerd-37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3.scope - libcontainer container 37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3. 
Apr 16 00:20:10.869214 containerd[1477]: time="2026-04-16T00:20:10.868588934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7dd9588bc7-f6npc,Uid:7b041080-6884-4ac4-b0b2-bb667ac35753,Namespace:calico-system,Attempt:1,} returns sandbox id \"37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3\"" Apr 16 00:20:10.996160 systemd-networkd[1354]: cali61e670d94ff: Gained IPv6LL Apr 16 00:20:11.060802 systemd-networkd[1354]: calid60a44e5bb1: Gained IPv6LL Apr 16 00:20:11.315782 systemd-networkd[1354]: cali52222d88012: Gained IPv6LL Apr 16 00:20:11.382361 containerd[1477]: time="2026-04-16T00:20:11.382321546Z" level=info msg="StopPodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\"" Apr 16 00:20:11.439451 kubelet[2576]: I0416 00:20:11.438678 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-zjtd7" podStartSLOduration=44.438661128 podStartE2EDuration="44.438661128s" podCreationTimestamp="2026-04-16 00:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 00:20:10.772151521 +0000 UTC m=+50.530025286" watchObservedRunningTime="2026-04-16 00:20:11.438661128 +0000 UTC m=+51.196534893" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.441 [INFO][5039] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.441 [INFO][5039] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" iface="eth0" netns="/var/run/netns/cni-c45b033e-8b32-af4d-4dcd-7f7a2106af7f" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.441 [INFO][5039] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" iface="eth0" netns="/var/run/netns/cni-c45b033e-8b32-af4d-4dcd-7f7a2106af7f" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.443 [INFO][5039] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" iface="eth0" netns="/var/run/netns/cni-c45b033e-8b32-af4d-4dcd-7f7a2106af7f" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.443 [INFO][5039] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.443 [INFO][5039] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.467 [INFO][5046] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.467 [INFO][5046] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.467 [INFO][5046] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.481 [WARNING][5046] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.481 [INFO][5046] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.484 [INFO][5046] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:11.489570 containerd[1477]: 2026-04-16 00:20:11.487 [INFO][5039] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:11.494275 containerd[1477]: time="2026-04-16T00:20:11.492693683Z" level=info msg="TearDown network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" successfully" Apr 16 00:20:11.494275 containerd[1477]: time="2026-04-16T00:20:11.492732004Z" level=info msg="StopPodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" returns successfully" Apr 16 00:20:11.492982 systemd[1]: run-netns-cni\x2dc45b033e\x2d8b32\x2daf4d\x2d4dcd\x2d7f7a2106af7f.mount: Deactivated successfully. 
Apr 16 00:20:11.498386 containerd[1477]: time="2026-04-16T00:20:11.497811190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64786d454d-bl4hx,Uid:160db7e1-a4ae-471a-bd06-9c5e497e7d3a,Namespace:calico-system,Attempt:1,}" Apr 16 00:20:11.677987 systemd-networkd[1354]: calidc6ecbf906f: Link UP Apr 16 00:20:11.678568 systemd-networkd[1354]: calidc6ecbf906f: Gained carrier Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.582 [INFO][5052] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0 calico-kube-controllers-64786d454d- calico-system 160db7e1-a4ae-471a-bd06-9c5e497e7d3a 999 0 2026-04-16 00:19:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:64786d454d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-n-42941c021f calico-kube-controllers-64786d454d-bl4hx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidc6ecbf906f [] [] }} ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.582 [INFO][5052] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.610 [INFO][5064] ipam/ipam_plugin.go 235: 
Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" HandleID="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.621 [INFO][5064] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" HandleID="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-42941c021f", "pod":"calico-kube-controllers-64786d454d-bl4hx", "timestamp":"2026-04-16 00:20:11.610049221 +0000 UTC"}, Hostname:"ci-4081-3-6-n-42941c021f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400040f080)} Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.621 [INFO][5064] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.621 [INFO][5064] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.621 [INFO][5064] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-42941c021f' Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.625 [INFO][5064] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.633 [INFO][5064] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.639 [INFO][5064] ipam/ipam.go 526: Trying affinity for 192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.643 [INFO][5064] ipam/ipam.go 160: Attempting to load block cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.646 [INFO][5064] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.646 [INFO][5064] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.649 [INFO][5064] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953 Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.655 [INFO][5064] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.667 [INFO][5064] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.25.72/26] block=192.168.25.64/26 handle="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.668 [INFO][5064] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.25.72/26] handle="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" host="ci-4081-3-6-n-42941c021f" Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.668 [INFO][5064] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:11.699100 containerd[1477]: 2026-04-16 00:20:11.668 [INFO][5064] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.25.72/26] IPv6=[] ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" HandleID="k8s-pod-network.b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.700775 containerd[1477]: 2026-04-16 00:20:11.672 [INFO][5052] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0", GenerateName:"calico-kube-controllers-64786d454d-", Namespace:"calico-system", SelfLink:"", UID:"160db7e1-a4ae-471a-bd06-9c5e497e7d3a", ResourceVersion:"999", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"64786d454d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"", Pod:"calico-kube-controllers-64786d454d-bl4hx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.25.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc6ecbf906f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:11.700775 containerd[1477]: 2026-04-16 00:20:11.672 [INFO][5052] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.25.72/32] ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.700775 containerd[1477]: 2026-04-16 00:20:11.672 [INFO][5052] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc6ecbf906f ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.700775 containerd[1477]: 2026-04-16 00:20:11.675 [INFO][5052] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.700775 containerd[1477]: 2026-04-16 00:20:11.679 [INFO][5052] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0", GenerateName:"calico-kube-controllers-64786d454d-", Namespace:"calico-system", SelfLink:"", UID:"160db7e1-a4ae-471a-bd06-9c5e497e7d3a", ResourceVersion:"999", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"64786d454d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953", Pod:"calico-kube-controllers-64786d454d-bl4hx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.25.72/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc6ecbf906f", MAC:"5a:29:71:02:13:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:11.700775 containerd[1477]: 2026-04-16 00:20:11.694 [INFO][5052] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953" Namespace="calico-system" Pod="calico-kube-controllers-64786d454d-bl4hx" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:11.730804 containerd[1477]: time="2026-04-16T00:20:11.729322133Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 16 00:20:11.730804 containerd[1477]: time="2026-04-16T00:20:11.729402296Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 16 00:20:11.730804 containerd[1477]: time="2026-04-16T00:20:11.729437937Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:11.730804 containerd[1477]: time="2026-04-16T00:20:11.729595381Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 16 00:20:11.766832 systemd[1]: Started cri-containerd-b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953.scope - libcontainer container b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953. 
Apr 16 00:20:11.923135 containerd[1477]: time="2026-04-16T00:20:11.923085790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-64786d454d-bl4hx,Uid:160db7e1-a4ae-471a-bd06-9c5e497e7d3a,Namespace:calico-system,Attempt:1,} returns sandbox id \"b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953\"" Apr 16 00:20:12.532363 systemd-networkd[1354]: cali4a3c2c51308: Gained IPv6LL Apr 16 00:20:12.852359 systemd-networkd[1354]: calidc6ecbf906f: Gained IPv6LL Apr 16 00:20:12.903818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1301103727.mount: Deactivated successfully. Apr 16 00:20:13.248226 containerd[1477]: time="2026-04-16T00:20:13.248170737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:13.250118 containerd[1477]: time="2026-04-16T00:20:13.249869144Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Apr 16 00:20:13.251246 containerd[1477]: time="2026-04-16T00:20:13.251021495Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:13.254873 containerd[1477]: time="2026-04-16T00:20:13.254456309Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:13.255424 containerd[1477]: time="2026-04-16T00:20:13.255387814Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 3.158939801s" Apr 16 00:20:13.255424 containerd[1477]: time="2026-04-16T00:20:13.255422855Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Apr 16 00:20:13.258852 containerd[1477]: time="2026-04-16T00:20:13.258790187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 16 00:20:13.264578 containerd[1477]: time="2026-04-16T00:20:13.264518543Z" level=info msg="CreateContainer within sandbox \"00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 16 00:20:13.287977 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1083523317.mount: Deactivated successfully. Apr 16 00:20:13.290106 containerd[1477]: time="2026-04-16T00:20:13.289971517Z" level=info msg="CreateContainer within sandbox \"00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619\"" Apr 16 00:20:13.291056 containerd[1477]: time="2026-04-16T00:20:13.290947103Z" level=info msg="StartContainer for \"2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619\"" Apr 16 00:20:13.329160 systemd[1]: Started cri-containerd-2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619.scope - libcontainer container 2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619. 
Apr 16 00:20:13.399264 containerd[1477]: time="2026-04-16T00:20:13.399118332Z" level=info msg="StartContainer for \"2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619\" returns successfully" Apr 16 00:20:13.808708 kubelet[2576]: I0416 00:20:13.807049 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-xxdtw" podStartSLOduration=28.534723223 podStartE2EDuration="32.80703225s" podCreationTimestamp="2026-04-16 00:19:41 +0000 UTC" firstStartedPulling="2026-04-16 00:20:08.984392943 +0000 UTC m=+48.742266708" lastFinishedPulling="2026-04-16 00:20:13.25670193 +0000 UTC m=+53.014575735" observedRunningTime="2026-04-16 00:20:13.806463875 +0000 UTC m=+53.564337640" watchObservedRunningTime="2026-04-16 00:20:13.80703225 +0000 UTC m=+53.564906015" Apr 16 00:20:14.807025 systemd[1]: run-containerd-runc-k8s.io-2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619-runc.duqwAi.mount: Deactivated successfully. Apr 16 00:20:16.756279 containerd[1477]: time="2026-04-16T00:20:16.756221561Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:16.758025 containerd[1477]: time="2026-04-16T00:20:16.757975405Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Apr 16 00:20:16.763210 containerd[1477]: time="2026-04-16T00:20:16.763150455Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.504295307s" Apr 16 00:20:16.763210 containerd[1477]: time="2026-04-16T00:20:16.763198656Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 16 00:20:16.766332 containerd[1477]: time="2026-04-16T00:20:16.766186811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 16 00:20:16.777271 containerd[1477]: time="2026-04-16T00:20:16.777223769Z" level=info msg="CreateContainer within sandbox \"753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 16 00:20:16.783146 containerd[1477]: time="2026-04-16T00:20:16.782246935Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:16.783391 containerd[1477]: time="2026-04-16T00:20:16.783361123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:16.801043 containerd[1477]: time="2026-04-16T00:20:16.800974006Z" level=info msg="CreateContainer within sandbox \"753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"006a8dc1aa149d3c1a3e059f02bc995316e250271138132646c2f9bfba53e4a2\"" Apr 16 00:20:16.801963 containerd[1477]: time="2026-04-16T00:20:16.801929510Z" level=info msg="StartContainer for \"006a8dc1aa149d3c1a3e059f02bc995316e250271138132646c2f9bfba53e4a2\"" Apr 16 00:20:16.853445 systemd[1]: Started cri-containerd-006a8dc1aa149d3c1a3e059f02bc995316e250271138132646c2f9bfba53e4a2.scope - libcontainer container 006a8dc1aa149d3c1a3e059f02bc995316e250271138132646c2f9bfba53e4a2. 
Apr 16 00:20:16.895206 containerd[1477]: time="2026-04-16T00:20:16.894893327Z" level=info msg="StartContainer for \"006a8dc1aa149d3c1a3e059f02bc995316e250271138132646c2f9bfba53e4a2\" returns successfully" Apr 16 00:20:17.827472 kubelet[2576]: I0416 00:20:17.827385 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-7dd9588bc7-fkgbz" podStartSLOduration=29.29807148 podStartE2EDuration="36.827365626s" podCreationTimestamp="2026-04-16 00:19:41 +0000 UTC" firstStartedPulling="2026-04-16 00:20:09.234952176 +0000 UTC m=+48.992825941" lastFinishedPulling="2026-04-16 00:20:16.764246322 +0000 UTC m=+56.522120087" observedRunningTime="2026-04-16 00:20:17.82548086 +0000 UTC m=+57.583354625" watchObservedRunningTime="2026-04-16 00:20:17.827365626 +0000 UTC m=+57.585239391" Apr 16 00:20:18.810639 kubelet[2576]: I0416 00:20:18.810539 2576 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:20:18.815562 containerd[1477]: time="2026-04-16T00:20:18.814707317Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:18.816101 containerd[1477]: time="2026-04-16T00:20:18.816071390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Apr 16 00:20:18.817055 containerd[1477]: time="2026-04-16T00:20:18.817030013Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:18.820092 containerd[1477]: time="2026-04-16T00:20:18.820060205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:18.821106 
containerd[1477]: time="2026-04-16T00:20:18.821075869Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 2.054841337s" Apr 16 00:20:18.821224 containerd[1477]: time="2026-04-16T00:20:18.821205792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Apr 16 00:20:18.822512 containerd[1477]: time="2026-04-16T00:20:18.822488063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 16 00:20:18.828059 containerd[1477]: time="2026-04-16T00:20:18.827974074Z" level=info msg="CreateContainer within sandbox \"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 16 00:20:18.847818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2630539679.mount: Deactivated successfully. 
Apr 16 00:20:18.857026 containerd[1477]: time="2026-04-16T00:20:18.856956005Z" level=info msg="CreateContainer within sandbox \"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"fe808b1d11a72849ad7eab7ab66020b97abf2d91293487d45ef6be5e7062b35f\"" Apr 16 00:20:18.859563 containerd[1477]: time="2026-04-16T00:20:18.859510946Z" level=info msg="StartContainer for \"fe808b1d11a72849ad7eab7ab66020b97abf2d91293487d45ef6be5e7062b35f\"" Apr 16 00:20:18.953913 systemd[1]: Started cri-containerd-fe808b1d11a72849ad7eab7ab66020b97abf2d91293487d45ef6be5e7062b35f.scope - libcontainer container fe808b1d11a72849ad7eab7ab66020b97abf2d91293487d45ef6be5e7062b35f. Apr 16 00:20:18.986885 containerd[1477]: time="2026-04-16T00:20:18.986827381Z" level=info msg="StartContainer for \"fe808b1d11a72849ad7eab7ab66020b97abf2d91293487d45ef6be5e7062b35f\" returns successfully" Apr 16 00:20:19.241740 containerd[1477]: time="2026-04-16T00:20:19.241678908Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:19.245696 containerd[1477]: time="2026-04-16T00:20:19.244471973Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 16 00:20:19.248237 containerd[1477]: time="2026-04-16T00:20:19.248182379Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 425.570473ms" Apr 16 00:20:19.248237 containerd[1477]: time="2026-04-16T00:20:19.248230460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image 
reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 16 00:20:19.249638 containerd[1477]: time="2026-04-16T00:20:19.249575211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 16 00:20:19.255903 containerd[1477]: time="2026-04-16T00:20:19.255865317Z" level=info msg="CreateContainer within sandbox \"37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 16 00:20:19.273545 containerd[1477]: time="2026-04-16T00:20:19.273464726Z" level=info msg="CreateContainer within sandbox \"37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"94b8864fd4c888e69a0045655e82d95b2bd0865b60577712ce6170f334b766a0\"" Apr 16 00:20:19.275065 containerd[1477]: time="2026-04-16T00:20:19.274958601Z" level=info msg="StartContainer for \"94b8864fd4c888e69a0045655e82d95b2bd0865b60577712ce6170f334b766a0\"" Apr 16 00:20:19.318921 systemd[1]: Started cri-containerd-94b8864fd4c888e69a0045655e82d95b2bd0865b60577712ce6170f334b766a0.scope - libcontainer container 94b8864fd4c888e69a0045655e82d95b2bd0865b60577712ce6170f334b766a0. 
Apr 16 00:20:19.370371 containerd[1477]: time="2026-04-16T00:20:19.370261174Z" level=info msg="StartContainer for \"94b8864fd4c888e69a0045655e82d95b2bd0865b60577712ce6170f334b766a0\" returns successfully" Apr 16 00:20:19.487894 kubelet[2576]: I0416 00:20:19.487675 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 16 00:20:19.492366 kubelet[2576]: I0416 00:20:19.491995 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 16 00:20:19.848752 kubelet[2576]: I0416 00:20:19.847974 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-7dd9588bc7-f6npc" podStartSLOduration=30.472431389 podStartE2EDuration="38.847958587s" podCreationTimestamp="2026-04-16 00:19:41 +0000 UTC" firstStartedPulling="2026-04-16 00:20:10.873847049 +0000 UTC m=+50.631720814" lastFinishedPulling="2026-04-16 00:20:19.249374247 +0000 UTC m=+59.007248012" observedRunningTime="2026-04-16 00:20:19.84592162 +0000 UTC m=+59.603795345" watchObservedRunningTime="2026-04-16 00:20:19.847958587 +0000 UTC m=+59.605832352" Apr 16 00:20:20.404731 containerd[1477]: time="2026-04-16T00:20:20.404320624Z" level=info msg="StopPodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\"" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.480 [WARNING][5385] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0", GenerateName:"calico-kube-controllers-64786d454d-", Namespace:"calico-system", SelfLink:"", UID:"160db7e1-a4ae-471a-bd06-9c5e497e7d3a", ResourceVersion:"1002", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"64786d454d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953", Pod:"calico-kube-controllers-64786d454d-bl4hx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.25.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc6ecbf906f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.481 [INFO][5385] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.481 [INFO][5385] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" iface="eth0" netns="" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.481 [INFO][5385] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.481 [INFO][5385] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.519 [INFO][5392] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.520 [INFO][5392] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.520 [INFO][5392] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.540 [WARNING][5392] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.540 [INFO][5392] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.543 [INFO][5392] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:20.551382 containerd[1477]: 2026-04-16 00:20:20.547 [INFO][5385] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.552072 containerd[1477]: time="2026-04-16T00:20:20.551419712Z" level=info msg="TearDown network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" successfully" Apr 16 00:20:20.552072 containerd[1477]: time="2026-04-16T00:20:20.551444352Z" level=info msg="StopPodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" returns successfully" Apr 16 00:20:20.553181 containerd[1477]: time="2026-04-16T00:20:20.552692540Z" level=info msg="RemovePodSandbox for \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\"" Apr 16 00:20:20.555550 containerd[1477]: time="2026-04-16T00:20:20.555491644Z" level=info msg="Forcibly stopping sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\"" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.614 [WARNING][5407] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0", GenerateName:"calico-kube-controllers-64786d454d-", Namespace:"calico-system", SelfLink:"", UID:"160db7e1-a4ae-471a-bd06-9c5e497e7d3a", ResourceVersion:"1002", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"64786d454d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953", Pod:"calico-kube-controllers-64786d454d-bl4hx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.25.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc6ecbf906f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.614 [INFO][5407] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.614 [INFO][5407] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" iface="eth0" netns="" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.614 [INFO][5407] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.614 [INFO][5407] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.672 [INFO][5414] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.672 [INFO][5414] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.673 [INFO][5414] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.690 [WARNING][5414] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.690 [INFO][5414] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" HandleID="k8s-pod-network.f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--kube--controllers--64786d454d--bl4hx-eth0" Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.694 [INFO][5414] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:20.705407 containerd[1477]: 2026-04-16 00:20:20.701 [INFO][5407] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754" Apr 16 00:20:20.705407 containerd[1477]: time="2026-04-16T00:20:20.704901023Z" level=info msg="TearDown network for sandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" successfully" Apr 16 00:20:20.722038 containerd[1477]: time="2026-04-16T00:20:20.721987250Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:20.722297 containerd[1477]: time="2026-04-16T00:20:20.722276016Z" level=info msg="RemovePodSandbox \"f6b1fecfaa74b5cbc0a4052639a356acc2c3b9dbd7860d2c9bbd34c1c7fcb754\" returns successfully" Apr 16 00:20:20.723175 containerd[1477]: time="2026-04-16T00:20:20.723141916Z" level=info msg="StopPodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\"" Apr 16 00:20:20.840212 kubelet[2576]: I0416 00:20:20.839664 2576 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.782 [WARNING][5428] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"59287dd3-2523-4d48-9666-1ef09e5795a0", ResourceVersion:"1030", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97", Pod:"calico-apiserver-7dd9588bc7-fkgbz", 
Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid60a44e5bb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.783 [INFO][5428] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.783 [INFO][5428] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" iface="eth0" netns="" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.783 [INFO][5428] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.783 [INFO][5428] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.814 [INFO][5435] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.814 [INFO][5435] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.814 [INFO][5435] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.828 [WARNING][5435] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.828 [INFO][5435] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.833 [INFO][5435] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:20.843220 containerd[1477]: 2026-04-16 00:20:20.840 [INFO][5428] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:20.844070 containerd[1477]: time="2026-04-16T00:20:20.843269553Z" level=info msg="TearDown network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" successfully" Apr 16 00:20:20.844070 containerd[1477]: time="2026-04-16T00:20:20.843294034Z" level=info msg="StopPodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" returns successfully" Apr 16 00:20:20.845667 containerd[1477]: time="2026-04-16T00:20:20.844559463Z" level=info msg="RemovePodSandbox for \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\"" Apr 16 00:20:20.845667 containerd[1477]: time="2026-04-16T00:20:20.844598704Z" level=info msg="Forcibly stopping sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\"" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.928 [WARNING][5450] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"59287dd3-2523-4d48-9666-1ef09e5795a0", ResourceVersion:"1030", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"753b66e1e13006eabeb2a3b68c36496c6a78455fa7cb52c4bc544cf9fc652b97", Pod:"calico-apiserver-7dd9588bc7-fkgbz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calid60a44e5bb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.928 [INFO][5450] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.928 [INFO][5450] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" iface="eth0" netns="" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.928 [INFO][5450] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.928 [INFO][5450] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.975 [INFO][5458] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.977 [INFO][5458] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.977 [INFO][5458] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.998 [WARNING][5458] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:20.998 [INFO][5458] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" HandleID="k8s-pod-network.05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--fkgbz-eth0" Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:21.000 [INFO][5458] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.007546 containerd[1477]: 2026-04-16 00:20:21.003 [INFO][5450] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a" Apr 16 00:20:21.007546 containerd[1477]: time="2026-04-16T00:20:21.007488184Z" level=info msg="TearDown network for sandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" successfully" Apr 16 00:20:21.014030 containerd[1477]: time="2026-04-16T00:20:21.013915726Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:21.014030 containerd[1477]: time="2026-04-16T00:20:21.014068969Z" level=info msg="RemovePodSandbox \"05588b639efb91d88a2c0c80a25314771fd4f3d6ebf71aafbc208998cb96ce7a\" returns successfully" Apr 16 00:20:21.016283 containerd[1477]: time="2026-04-16T00:20:21.016237457Z" level=info msg="StopPodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\"" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.092 [WARNING][5472] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"cf25c361-9e08-4e96-8a6d-6349a738a504", ResourceVersion:"1003", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c", Pod:"coredns-7d764666f9-zjtd7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali52222d88012", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.092 [INFO][5472] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.092 [INFO][5472] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" iface="eth0" netns="" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.092 [INFO][5472] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.092 [INFO][5472] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.119 [INFO][5480] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.119 [INFO][5480] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.119 [INFO][5480] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.138 [WARNING][5480] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.138 [INFO][5480] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.142 [INFO][5480] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.147120 containerd[1477]: 2026-04-16 00:20:21.144 [INFO][5472] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.148137 containerd[1477]: time="2026-04-16T00:20:21.147166103Z" level=info msg="TearDown network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" successfully" Apr 16 00:20:21.148137 containerd[1477]: time="2026-04-16T00:20:21.147194703Z" level=info msg="StopPodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" returns successfully" Apr 16 00:20:21.148137 containerd[1477]: time="2026-04-16T00:20:21.147731915Z" level=info msg="RemovePodSandbox for \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\"" Apr 16 00:20:21.148137 containerd[1477]: time="2026-04-16T00:20:21.147760476Z" level=info msg="Forcibly stopping sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\"" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.217 [WARNING][5495] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"cf25c361-9e08-4e96-8a6d-6349a738a504", ResourceVersion:"1003", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"29a9bc48a35968d15dcce679a5063ea612bd81105e2111ab61b951cd60bdca9c", Pod:"coredns-7d764666f9-zjtd7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali52222d88012", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.217 [INFO][5495] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.217 [INFO][5495] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" iface="eth0" netns="" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.217 [INFO][5495] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.217 [INFO][5495] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.260 [INFO][5503] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.261 [INFO][5503] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.261 [INFO][5503] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.275 [WARNING][5503] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.275 [INFO][5503] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" HandleID="k8s-pod-network.4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--zjtd7-eth0" Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.278 [INFO][5503] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.285993 containerd[1477]: 2026-04-16 00:20:21.283 [INFO][5495] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731" Apr 16 00:20:21.287785 containerd[1477]: time="2026-04-16T00:20:21.286218887Z" level=info msg="TearDown network for sandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" successfully" Apr 16 00:20:21.293813 containerd[1477]: time="2026-04-16T00:20:21.293756893Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:21.293954 containerd[1477]: time="2026-04-16T00:20:21.293843615Z" level=info msg="RemovePodSandbox \"4133868130469d3d0fb88663856fd86e1548085880b8c8c2dc9656ddc5cea731\" returns successfully" Apr 16 00:20:21.295823 containerd[1477]: time="2026-04-16T00:20:21.295781418Z" level=info msg="StopPodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\"" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.354 [WARNING][5517] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"1c032d05-2ae2-4d00-a7b9-b6421d73e412", ResourceVersion:"1016", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3", Pod:"goldmane-9f7667bb8-xxdtw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.25.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"calid0009655d70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.354 [INFO][5517] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.354 [INFO][5517] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" iface="eth0" netns="" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.354 [INFO][5517] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.354 [INFO][5517] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.379 [INFO][5524] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.379 [INFO][5524] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.379 [INFO][5524] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.392 [WARNING][5524] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.392 [INFO][5524] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.396 [INFO][5524] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.401876 containerd[1477]: 2026-04-16 00:20:21.399 [INFO][5517] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.401876 containerd[1477]: time="2026-04-16T00:20:21.401865956Z" level=info msg="TearDown network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" successfully" Apr 16 00:20:21.401876 containerd[1477]: time="2026-04-16T00:20:21.401893677Z" level=info msg="StopPodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" returns successfully" Apr 16 00:20:21.403440 containerd[1477]: time="2026-04-16T00:20:21.402437569Z" level=info msg="RemovePodSandbox for \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\"" Apr 16 00:20:21.403440 containerd[1477]: time="2026-04-16T00:20:21.402507210Z" level=info msg="Forcibly stopping sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\"" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.458 [WARNING][5538] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"1c032d05-2ae2-4d00-a7b9-b6421d73e412", ResourceVersion:"1016", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"00698c9e50616f823ccf7a80344dbb6556557bd8e0eba6edde5488c2b4ac2df3", Pod:"goldmane-9f7667bb8-xxdtw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.25.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid0009655d70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.459 [INFO][5538] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.459 [INFO][5538] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" iface="eth0" netns="" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.459 [INFO][5538] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.459 [INFO][5538] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.482 [INFO][5545] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.483 [INFO][5545] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.483 [INFO][5545] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.496 [WARNING][5545] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.496 [INFO][5545] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" HandleID="k8s-pod-network.c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Workload="ci--4081--3--6--n--42941c021f-k8s-goldmane--9f7667bb8--xxdtw-eth0" Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.498 [INFO][5545] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.502349 containerd[1477]: 2026-04-16 00:20:21.500 [INFO][5538] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058" Apr 16 00:20:21.504815 containerd[1477]: time="2026-04-16T00:20:21.502693098Z" level=info msg="TearDown network for sandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" successfully" Apr 16 00:20:21.506794 containerd[1477]: time="2026-04-16T00:20:21.506729947Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:21.507164 containerd[1477]: time="2026-04-16T00:20:21.507140836Z" level=info msg="RemovePodSandbox \"c542c52da32c3b5eaee8355f3b550cadcb9f6ca9601345f1f984afc786e35058\" returns successfully" Apr 16 00:20:21.507991 containerd[1477]: time="2026-04-16T00:20:21.507951494Z" level=info msg="StopPodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\"" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.558 [WARNING][5559] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.558 [INFO][5559] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.558 [INFO][5559] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" iface="eth0" netns="" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.558 [INFO][5559] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.558 [INFO][5559] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.581 [INFO][5566] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.581 [INFO][5566] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.581 [INFO][5566] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.602 [WARNING][5566] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.602 [INFO][5566] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.608 [INFO][5566] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.616672 containerd[1477]: 2026-04-16 00:20:21.612 [INFO][5559] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.618551 containerd[1477]: time="2026-04-16T00:20:21.616650930Z" level=info msg="TearDown network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" successfully" Apr 16 00:20:21.618551 containerd[1477]: time="2026-04-16T00:20:21.617568830Z" level=info msg="StopPodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" returns successfully" Apr 16 00:20:21.618551 containerd[1477]: time="2026-04-16T00:20:21.618344007Z" level=info msg="RemovePodSandbox for \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\"" Apr 16 00:20:21.618551 containerd[1477]: time="2026-04-16T00:20:21.618377208Z" level=info msg="Forcibly stopping sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\"" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.658 [WARNING][5580] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" WorkloadEndpoint="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.658 [INFO][5580] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.658 [INFO][5580] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" iface="eth0" netns="" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.658 [INFO][5580] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.658 [INFO][5580] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.683 [INFO][5587] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.683 [INFO][5587] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.683 [INFO][5587] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.696 [WARNING][5587] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.696 [INFO][5587] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" HandleID="k8s-pod-network.d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Workload="ci--4081--3--6--n--42941c021f-k8s-whisker--5b9458fc5c--d2f82-eth0" Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.699 [INFO][5587] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.706427 containerd[1477]: 2026-04-16 00:20:21.703 [INFO][5580] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6" Apr 16 00:20:21.707313 containerd[1477]: time="2026-04-16T00:20:21.707100523Z" level=info msg="TearDown network for sandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" successfully" Apr 16 00:20:21.712871 containerd[1477]: time="2026-04-16T00:20:21.712820129Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:21.713027 containerd[1477]: time="2026-04-16T00:20:21.712908411Z" level=info msg="RemovePodSandbox \"d18ed41af768a5c0bafef5544761b2d0d39ec8de9ad880fb6c2b081f67f28ce6\" returns successfully" Apr 16 00:20:21.713539 containerd[1477]: time="2026-04-16T00:20:21.713513584Z" level=info msg="StopPodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\"" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.782 [WARNING][5605] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76", ResourceVersion:"1049", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52", Pod:"csi-node-driver-bjzcx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.25.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali973e535c87f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.782 [INFO][5605] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.782 [INFO][5605] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" iface="eth0" netns="" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.782 [INFO][5605] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.782 [INFO][5605] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.844 [INFO][5612] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.845 [INFO][5612] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.845 [INFO][5612] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.872 [WARNING][5612] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.872 [INFO][5612] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.876 [INFO][5612] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:21.883691 containerd[1477]: 2026-04-16 00:20:21.880 [INFO][5605] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:21.884316 containerd[1477]: time="2026-04-16T00:20:21.883722016Z" level=info msg="TearDown network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" successfully" Apr 16 00:20:21.884316 containerd[1477]: time="2026-04-16T00:20:21.883753376Z" level=info msg="StopPodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" returns successfully" Apr 16 00:20:21.885097 containerd[1477]: time="2026-04-16T00:20:21.885054085Z" level=info msg="RemovePodSandbox for \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\"" Apr 16 00:20:21.885217 containerd[1477]: time="2026-04-16T00:20:21.885107006Z" level=info msg="Forcibly stopping sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\"" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.945 [WARNING][5632] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6e5a5131-3036-4d73-9c8f-e82ea8bfcf76", ResourceVersion:"1049", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"ab9321f0275c5de361d7143a5ab4b6873ee638bf5b5c7d0cfe3c95f5af9b3b52", Pod:"csi-node-driver-bjzcx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.25.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali973e535c87f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.946 [INFO][5632] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.947 [INFO][5632] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" iface="eth0" netns="" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.947 [INFO][5632] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.947 [INFO][5632] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.982 [INFO][5639] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.982 [INFO][5639] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.982 [INFO][5639] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.997 [WARNING][5639] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:21.997 [INFO][5639] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" HandleID="k8s-pod-network.846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Workload="ci--4081--3--6--n--42941c021f-k8s-csi--node--driver--bjzcx-eth0" Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:22.002 [INFO][5639] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:22.009354 containerd[1477]: 2026-04-16 00:20:22.006 [INFO][5632] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23" Apr 16 00:20:22.010630 containerd[1477]: time="2026-04-16T00:20:22.010416242Z" level=info msg="TearDown network for sandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" successfully" Apr 16 00:20:22.023229 containerd[1477]: time="2026-04-16T00:20:22.022581783Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:22.023229 containerd[1477]: time="2026-04-16T00:20:22.022745547Z" level=info msg="RemovePodSandbox \"846a18e1a76ad902fd0026a51ec702f4a4a6977fa93827c0beaa65bf017f4c23\" returns successfully" Apr 16 00:20:22.023943 containerd[1477]: time="2026-04-16T00:20:22.023748449Z" level=info msg="StopPodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\"" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.105 [WARNING][5653] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"7b041080-6884-4ac4-b0b2-bb667ac35753", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3", Pod:"calico-apiserver-7dd9588bc7-f6npc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4a3c2c51308", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.105 [INFO][5653] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.105 [INFO][5653] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" iface="eth0" netns="" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.105 [INFO][5653] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.105 [INFO][5653] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.136 [INFO][5660] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.136 [INFO][5660] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.136 [INFO][5660] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.151 [WARNING][5660] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.151 [INFO][5660] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.155 [INFO][5660] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:22.162887 containerd[1477]: 2026-04-16 00:20:22.158 [INFO][5653] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.164483 containerd[1477]: time="2026-04-16T00:20:22.164440190Z" level=info msg="TearDown network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" successfully" Apr 16 00:20:22.164483 containerd[1477]: time="2026-04-16T00:20:22.164483991Z" level=info msg="StopPodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" returns successfully" Apr 16 00:20:22.165362 containerd[1477]: time="2026-04-16T00:20:22.165324809Z" level=info msg="RemovePodSandbox for \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\"" Apr 16 00:20:22.165362 containerd[1477]: time="2026-04-16T00:20:22.165364970Z" level=info msg="Forcibly stopping sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\"" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.225 [WARNING][5674] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0", GenerateName:"calico-apiserver-7dd9588bc7-", Namespace:"calico-system", SelfLink:"", UID:"7b041080-6884-4ac4-b0b2-bb667ac35753", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7dd9588bc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"37aeee562bb8784ba43d21e09d5a02491aa4edf28df2a87fcf50057d1a55b0f3", Pod:"calico-apiserver-7dd9588bc7-f6npc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4a3c2c51308", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.225 [INFO][5674] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.225 [INFO][5674] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" iface="eth0" netns="" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.225 [INFO][5674] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.225 [INFO][5674] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.269 [INFO][5681] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.269 [INFO][5681] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.270 [INFO][5681] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.288 [WARNING][5681] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.288 [INFO][5681] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" HandleID="k8s-pod-network.da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Workload="ci--4081--3--6--n--42941c021f-k8s-calico--apiserver--7dd9588bc7--f6npc-eth0" Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.291 [INFO][5681] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:22.302951 containerd[1477]: 2026-04-16 00:20:22.297 [INFO][5674] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45" Apr 16 00:20:22.303518 containerd[1477]: time="2026-04-16T00:20:22.303046046Z" level=info msg="TearDown network for sandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" successfully" Apr 16 00:20:22.315355 containerd[1477]: time="2026-04-16T00:20:22.315280189Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:22.315355 containerd[1477]: time="2026-04-16T00:20:22.315372231Z" level=info msg="RemovePodSandbox \"da53aa732dca7dc5cf980cbf6caf6ff91a4f8b446d16053cb74437c0c5757c45\" returns successfully" Apr 16 00:20:22.316592 containerd[1477]: time="2026-04-16T00:20:22.316450974Z" level=info msg="StopPodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\"" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.399 [WARNING][5696] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"60276bc4-09cf-4ae6-9503-beaf347fe010", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e", Pod:"coredns-7d764666f9-xph97", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali61e670d94ff", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.399 [INFO][5696] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.399 [INFO][5696] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" iface="eth0" netns="" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.399 [INFO][5696] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.399 [INFO][5696] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.443 [INFO][5703] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.443 [INFO][5703] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.443 [INFO][5703] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.460 [WARNING][5703] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.460 [INFO][5703] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.464 [INFO][5703] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:22.474544 containerd[1477]: 2026-04-16 00:20:22.466 [INFO][5696] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.474544 containerd[1477]: time="2026-04-16T00:20:22.474417607Z" level=info msg="TearDown network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" successfully" Apr 16 00:20:22.474544 containerd[1477]: time="2026-04-16T00:20:22.474540129Z" level=info msg="StopPodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" returns successfully" Apr 16 00:20:22.475335 containerd[1477]: time="2026-04-16T00:20:22.475276345Z" level=info msg="RemovePodSandbox for \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\"" Apr 16 00:20:22.475335 containerd[1477]: time="2026-04-16T00:20:22.475309546Z" level=info msg="Forcibly stopping sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\"" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.535 [WARNING][5718] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"60276bc4-09cf-4ae6-9503-beaf347fe010", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 0, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-42941c021f", ContainerID:"ee76753e4f2c40d8c8abb1a5a1fc7311255754f94e5464e95882fa02883f5d1e", Pod:"coredns-7d764666f9-xph97", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali61e670d94ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.535 [INFO][5718] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.535 [INFO][5718] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" iface="eth0" netns="" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.535 [INFO][5718] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.535 [INFO][5718] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.568 [INFO][5725] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.568 [INFO][5725] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.568 [INFO][5725] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.583 [WARNING][5725] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.583 [INFO][5725] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" HandleID="k8s-pod-network.c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Workload="ci--4081--3--6--n--42941c021f-k8s-coredns--7d764666f9--xph97-eth0" Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.587 [INFO][5725] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 00:20:22.594032 containerd[1477]: 2026-04-16 00:20:22.590 [INFO][5718] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9" Apr 16 00:20:22.594032 containerd[1477]: time="2026-04-16T00:20:22.592785749Z" level=info msg="TearDown network for sandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" successfully" Apr 16 00:20:22.603987 containerd[1477]: time="2026-04-16T00:20:22.603728624Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 16 00:20:22.604147 containerd[1477]: time="2026-04-16T00:20:22.604036310Z" level=info msg="RemovePodSandbox \"c6c92049ad4ff7ed7d4e34b595cbdabafb50cadb3cf980a83993fa40704e67b9\" returns successfully" Apr 16 00:20:22.705934 containerd[1477]: time="2026-04-16T00:20:22.705871777Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:22.707426 containerd[1477]: time="2026-04-16T00:20:22.707380170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Apr 16 00:20:22.708287 containerd[1477]: time="2026-04-16T00:20:22.707953182Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:22.710632 containerd[1477]: time="2026-04-16T00:20:22.710354793Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 00:20:22.711567 containerd[1477]: time="2026-04-16T00:20:22.711175291Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.461535719s" Apr 16 00:20:22.711567 containerd[1477]: time="2026-04-16T00:20:22.711213892Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Apr 16 00:20:22.726828 containerd[1477]: time="2026-04-16T00:20:22.726432459Z" 
level=info msg="CreateContainer within sandbox \"b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 16 00:20:22.743169 containerd[1477]: time="2026-04-16T00:20:22.743112417Z" level=info msg="CreateContainer within sandbox \"b12f6f6c090d04e18633c3efab06cee797ef3c0cf83e7f11f16fdac09c8e5953\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"fca278c5fb0e2e00d0371d3e93e4a365e1b9d47bd709631347539a62b353e3e9\"" Apr 16 00:20:22.743834 containerd[1477]: time="2026-04-16T00:20:22.743652389Z" level=info msg="StartContainer for \"fca278c5fb0e2e00d0371d3e93e4a365e1b9d47bd709631347539a62b353e3e9\"" Apr 16 00:20:22.784909 systemd[1]: Started cri-containerd-fca278c5fb0e2e00d0371d3e93e4a365e1b9d47bd709631347539a62b353e3e9.scope - libcontainer container fca278c5fb0e2e00d0371d3e93e4a365e1b9d47bd709631347539a62b353e3e9. Apr 16 00:20:22.823700 containerd[1477]: time="2026-04-16T00:20:22.823107215Z" level=info msg="StartContainer for \"fca278c5fb0e2e00d0371d3e93e4a365e1b9d47bd709631347539a62b353e3e9\" returns successfully" Apr 16 00:20:22.893320 kubelet[2576]: I0416 00:20:22.891957 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-bjzcx" podStartSLOduration=29.83364086 podStartE2EDuration="40.891943773s" podCreationTimestamp="2026-04-16 00:19:42 +0000 UTC" firstStartedPulling="2026-04-16 00:20:07.764000426 +0000 UTC m=+47.521874151" lastFinishedPulling="2026-04-16 00:20:18.822303299 +0000 UTC m=+58.580177064" observedRunningTime="2026-04-16 00:20:19.865171907 +0000 UTC m=+59.623045632" watchObservedRunningTime="2026-04-16 00:20:22.891943773 +0000 UTC m=+62.649817538" Apr 16 00:20:23.942845 kubelet[2576]: I0416 00:20:23.942634 2576 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-64786d454d-bl4hx" podStartSLOduration=31.159549193 
podStartE2EDuration="41.942589461s" podCreationTimestamp="2026-04-16 00:19:42 +0000 UTC" firstStartedPulling="2026-04-16 00:20:11.929278608 +0000 UTC m=+51.687152373" lastFinishedPulling="2026-04-16 00:20:22.712318876 +0000 UTC m=+62.470192641" observedRunningTime="2026-04-16 00:20:22.894844315 +0000 UTC m=+62.652718160" watchObservedRunningTime="2026-04-16 00:20:23.942589461 +0000 UTC m=+63.700463226" Apr 16 00:20:36.679399 kubelet[2576]: I0416 00:20:36.678697 2576 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:20:44.814633 systemd[1]: run-containerd-runc-k8s.io-2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619-runc.1m89oH.mount: Deactivated successfully. Apr 16 00:20:45.520886 kubelet[2576]: I0416 00:20:45.520307 2576 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 16 00:21:26.166278 systemd[1]: run-containerd-runc-k8s.io-2ea4ff7f33ab4d4483cdef06d020b713a39b3c92383a633b24ef3a352ab14619-runc.QoVGh3.mount: Deactivated successfully. Apr 16 00:21:34.391220 systemd[1]: run-containerd-runc-k8s.io-fca278c5fb0e2e00d0371d3e93e4a365e1b9d47bd709631347539a62b353e3e9-runc.PVOHdC.mount: Deactivated successfully. Apr 16 00:21:45.959727 systemd[1]: Started sshd@7-78.46.194.74:22-4.175.71.9:54638.service - OpenSSH per-connection server daemon (4.175.71.9:54638). Apr 16 00:21:46.082444 sshd[6166]: Accepted publickey for core from 4.175.71.9 port 54638 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:21:46.086051 sshd[6166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:21:46.092152 systemd-logind[1452]: New session 8 of user core. Apr 16 00:21:46.097841 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 16 00:21:46.288768 sshd[6166]: pam_unix(sshd:session): session closed for user core Apr 16 00:21:46.294394 systemd[1]: sshd@7-78.46.194.74:22-4.175.71.9:54638.service: Deactivated successfully. 
Apr 16 00:21:46.297338 systemd[1]: session-8.scope: Deactivated successfully. Apr 16 00:21:46.299738 systemd-logind[1452]: Session 8 logged out. Waiting for processes to exit. Apr 16 00:21:46.300952 systemd-logind[1452]: Removed session 8. Apr 16 00:21:51.328135 systemd[1]: Started sshd@8-78.46.194.74:22-4.175.71.9:54650.service - OpenSSH per-connection server daemon (4.175.71.9:54650). Apr 16 00:21:51.463654 sshd[6181]: Accepted publickey for core from 4.175.71.9 port 54650 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:21:51.465387 sshd[6181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:21:51.473233 systemd-logind[1452]: New session 9 of user core. Apr 16 00:21:51.481103 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 16 00:21:51.670911 sshd[6181]: pam_unix(sshd:session): session closed for user core Apr 16 00:21:51.679188 systemd[1]: sshd@8-78.46.194.74:22-4.175.71.9:54650.service: Deactivated successfully. Apr 16 00:21:51.684353 systemd[1]: session-9.scope: Deactivated successfully. Apr 16 00:21:51.686084 systemd-logind[1452]: Session 9 logged out. Waiting for processes to exit. Apr 16 00:21:51.687789 systemd-logind[1452]: Removed session 9. Apr 16 00:21:56.704105 systemd[1]: Started sshd@9-78.46.194.74:22-4.175.71.9:45298.service - OpenSSH per-connection server daemon (4.175.71.9:45298). Apr 16 00:21:56.839043 sshd[6213]: Accepted publickey for core from 4.175.71.9 port 45298 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:21:56.846152 sshd[6213]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:21:56.854366 systemd-logind[1452]: New session 10 of user core. Apr 16 00:21:56.861754 systemd[1]: Started session-10.scope - Session 10 of User core. 
Apr 16 00:21:57.066774 sshd[6213]: pam_unix(sshd:session): session closed for user core Apr 16 00:21:57.072855 systemd[1]: sshd@9-78.46.194.74:22-4.175.71.9:45298.service: Deactivated successfully. Apr 16 00:21:57.076050 systemd[1]: session-10.scope: Deactivated successfully. Apr 16 00:21:57.078552 systemd-logind[1452]: Session 10 logged out. Waiting for processes to exit. Apr 16 00:21:57.080147 systemd-logind[1452]: Removed session 10. Apr 16 00:22:02.096810 systemd[1]: Started sshd@10-78.46.194.74:22-4.175.71.9:45300.service - OpenSSH per-connection server daemon (4.175.71.9:45300). Apr 16 00:22:02.229659 sshd[6258]: Accepted publickey for core from 4.175.71.9 port 45300 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:22:02.230939 sshd[6258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:22:02.238019 systemd-logind[1452]: New session 11 of user core. Apr 16 00:22:02.246906 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 16 00:22:02.452450 sshd[6258]: pam_unix(sshd:session): session closed for user core Apr 16 00:22:02.458883 systemd[1]: sshd@10-78.46.194.74:22-4.175.71.9:45300.service: Deactivated successfully. Apr 16 00:22:02.463232 systemd[1]: session-11.scope: Deactivated successfully. Apr 16 00:22:02.465276 systemd-logind[1452]: Session 11 logged out. Waiting for processes to exit. Apr 16 00:22:02.468480 systemd-logind[1452]: Removed session 11. Apr 16 00:22:02.492971 systemd[1]: Started sshd@11-78.46.194.74:22-4.175.71.9:45304.service - OpenSSH per-connection server daemon (4.175.71.9:45304). Apr 16 00:22:02.617674 sshd[6281]: Accepted publickey for core from 4.175.71.9 port 45304 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M Apr 16 00:22:02.620110 sshd[6281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 00:22:02.627470 systemd-logind[1452]: New session 12 of user core. 
Apr 16 00:22:02.633992 systemd[1]: Started session-12.scope - Session 12 of User core.
Apr 16 00:22:02.896528 sshd[6281]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:02.904062 systemd[1]: sshd@11-78.46.194.74:22-4.175.71.9:45304.service: Deactivated successfully.
Apr 16 00:22:02.908569 systemd[1]: session-12.scope: Deactivated successfully.
Apr 16 00:22:02.916339 systemd-logind[1452]: Session 12 logged out. Waiting for processes to exit.
Apr 16 00:22:02.941141 systemd[1]: Started sshd@12-78.46.194.74:22-4.175.71.9:45320.service - OpenSSH per-connection server daemon (4.175.71.9:45320).
Apr 16 00:22:02.943122 systemd-logind[1452]: Removed session 12.
Apr 16 00:22:03.072665 sshd[6292]: Accepted publickey for core from 4.175.71.9 port 45320 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:03.074907 sshd[6292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:03.083873 systemd-logind[1452]: New session 13 of user core.
Apr 16 00:22:03.087031 systemd[1]: Started session-13.scope - Session 13 of User core.
Apr 16 00:22:03.283481 sshd[6292]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:03.291197 systemd[1]: sshd@12-78.46.194.74:22-4.175.71.9:45320.service: Deactivated successfully.
Apr 16 00:22:03.294564 systemd[1]: session-13.scope: Deactivated successfully.
Apr 16 00:22:03.296375 systemd-logind[1452]: Session 13 logged out. Waiting for processes to exit.
Apr 16 00:22:03.298076 systemd-logind[1452]: Removed session 13.
Apr 16 00:22:08.316968 systemd[1]: Started sshd@13-78.46.194.74:22-4.175.71.9:37896.service - OpenSSH per-connection server daemon (4.175.71.9:37896).
Apr 16 00:22:08.432641 sshd[6304]: Accepted publickey for core from 4.175.71.9 port 37896 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:08.434932 sshd[6304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:08.440585 systemd-logind[1452]: New session 14 of user core.
Apr 16 00:22:08.448200 systemd[1]: Started session-14.scope - Session 14 of User core.
Apr 16 00:22:08.628577 sshd[6304]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:08.634746 systemd[1]: sshd@13-78.46.194.74:22-4.175.71.9:37896.service: Deactivated successfully.
Apr 16 00:22:08.639311 systemd[1]: session-14.scope: Deactivated successfully.
Apr 16 00:22:08.640758 systemd-logind[1452]: Session 14 logged out. Waiting for processes to exit.
Apr 16 00:22:08.641993 systemd-logind[1452]: Removed session 14.
Apr 16 00:22:13.669184 systemd[1]: Started sshd@14-78.46.194.74:22-4.175.71.9:37908.service - OpenSSH per-connection server daemon (4.175.71.9:37908).
Apr 16 00:22:13.787262 sshd[6317]: Accepted publickey for core from 4.175.71.9 port 37908 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:13.791095 sshd[6317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:13.797659 systemd-logind[1452]: New session 15 of user core.
Apr 16 00:22:13.807945 systemd[1]: Started session-15.scope - Session 15 of User core.
Apr 16 00:22:14.011165 sshd[6317]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:14.020863 systemd[1]: sshd@14-78.46.194.74:22-4.175.71.9:37908.service: Deactivated successfully.
Apr 16 00:22:14.026065 systemd[1]: session-15.scope: Deactivated successfully.
Apr 16 00:22:14.027769 systemd-logind[1452]: Session 15 logged out. Waiting for processes to exit.
Apr 16 00:22:14.050163 systemd[1]: Started sshd@15-78.46.194.74:22-4.175.71.9:37910.service - OpenSSH per-connection server daemon (4.175.71.9:37910).
Apr 16 00:22:14.051589 systemd-logind[1452]: Removed session 15.
Apr 16 00:22:14.186724 sshd[6330]: Accepted publickey for core from 4.175.71.9 port 37910 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:14.188008 sshd[6330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:14.196720 systemd-logind[1452]: New session 16 of user core.
Apr 16 00:22:14.203960 systemd[1]: Started session-16.scope - Session 16 of User core.
Apr 16 00:22:14.573703 sshd[6330]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:14.579539 systemd[1]: sshd@15-78.46.194.74:22-4.175.71.9:37910.service: Deactivated successfully.
Apr 16 00:22:14.583027 systemd[1]: session-16.scope: Deactivated successfully.
Apr 16 00:22:14.585923 systemd-logind[1452]: Session 16 logged out. Waiting for processes to exit.
Apr 16 00:22:14.607153 systemd[1]: Started sshd@16-78.46.194.74:22-4.175.71.9:37922.service - OpenSSH per-connection server daemon (4.175.71.9:37922).
Apr 16 00:22:14.610841 systemd-logind[1452]: Removed session 16.
Apr 16 00:22:14.728677 sshd[6341]: Accepted publickey for core from 4.175.71.9 port 37922 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:14.729886 sshd[6341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:14.736909 systemd-logind[1452]: New session 17 of user core.
Apr 16 00:22:14.742892 systemd[1]: Started session-17.scope - Session 17 of User core.
Apr 16 00:22:15.747716 sshd[6341]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:15.753902 systemd[1]: sshd@16-78.46.194.74:22-4.175.71.9:37922.service: Deactivated successfully.
Apr 16 00:22:15.757989 systemd[1]: session-17.scope: Deactivated successfully.
Apr 16 00:22:15.761765 systemd-logind[1452]: Session 17 logged out. Waiting for processes to exit.
Apr 16 00:22:15.777951 systemd-logind[1452]: Removed session 17.
Apr 16 00:22:15.789039 systemd[1]: Started sshd@17-78.46.194.74:22-4.175.71.9:54898.service - OpenSSH per-connection server daemon (4.175.71.9:54898).
Apr 16 00:22:15.915118 sshd[6377]: Accepted publickey for core from 4.175.71.9 port 54898 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:15.918276 sshd[6377]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:15.931324 systemd-logind[1452]: New session 18 of user core.
Apr 16 00:22:15.937893 systemd[1]: Started session-18.scope - Session 18 of User core.
Apr 16 00:22:16.271490 sshd[6377]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:16.278056 systemd[1]: sshd@17-78.46.194.74:22-4.175.71.9:54898.service: Deactivated successfully.
Apr 16 00:22:16.284288 systemd[1]: session-18.scope: Deactivated successfully.
Apr 16 00:22:16.288155 systemd-logind[1452]: Session 18 logged out. Waiting for processes to exit.
Apr 16 00:22:16.310262 systemd[1]: Started sshd@18-78.46.194.74:22-4.175.71.9:54902.service - OpenSSH per-connection server daemon (4.175.71.9:54902).
Apr 16 00:22:16.311960 systemd-logind[1452]: Removed session 18.
Apr 16 00:22:16.442095 sshd[6395]: Accepted publickey for core from 4.175.71.9 port 54902 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:16.445323 sshd[6395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:16.451434 systemd-logind[1452]: New session 19 of user core.
Apr 16 00:22:16.456827 systemd[1]: Started session-19.scope - Session 19 of User core.
Apr 16 00:22:16.645149 sshd[6395]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:16.650210 systemd[1]: sshd@18-78.46.194.74:22-4.175.71.9:54902.service: Deactivated successfully.
Apr 16 00:22:16.654501 systemd[1]: session-19.scope: Deactivated successfully.
Apr 16 00:22:16.657245 systemd-logind[1452]: Session 19 logged out. Waiting for processes to exit.
Apr 16 00:22:16.659279 systemd-logind[1452]: Removed session 19.
Apr 16 00:22:21.682088 systemd[1]: Started sshd@19-78.46.194.74:22-4.175.71.9:54916.service - OpenSSH per-connection server daemon (4.175.71.9:54916).
Apr 16 00:22:21.809101 sshd[6414]: Accepted publickey for core from 4.175.71.9 port 54916 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:21.812848 sshd[6414]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:21.821657 systemd-logind[1452]: New session 20 of user core.
Apr 16 00:22:21.831971 systemd[1]: Started session-20.scope - Session 20 of User core.
Apr 16 00:22:22.163025 sshd[6414]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:22.170971 systemd-logind[1452]: Session 20 logged out. Waiting for processes to exit.
Apr 16 00:22:22.171900 systemd[1]: sshd@19-78.46.194.74:22-4.175.71.9:54916.service: Deactivated successfully.
Apr 16 00:22:22.174756 systemd[1]: session-20.scope: Deactivated successfully.
Apr 16 00:22:22.177040 systemd-logind[1452]: Removed session 20.
Apr 16 00:22:27.195332 systemd[1]: Started sshd@20-78.46.194.74:22-4.175.71.9:57396.service - OpenSSH per-connection server daemon (4.175.71.9:57396).
Apr 16 00:22:27.319679 sshd[6466]: Accepted publickey for core from 4.175.71.9 port 57396 ssh2: RSA SHA256:es51nA5SMoytRkY/yLSoOOH2KLr0mt1MIHk0lTLGO0M
Apr 16 00:22:27.322797 sshd[6466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 00:22:27.328419 systemd-logind[1452]: New session 21 of user core.
Apr 16 00:22:27.333837 systemd[1]: Started session-21.scope - Session 21 of User core.
Apr 16 00:22:27.504565 sshd[6466]: pam_unix(sshd:session): session closed for user core
Apr 16 00:22:27.510199 systemd-logind[1452]: Session 21 logged out. Waiting for processes to exit.
Apr 16 00:22:27.511005 systemd[1]: sshd@20-78.46.194.74:22-4.175.71.9:57396.service: Deactivated successfully.
Apr 16 00:22:27.516168 systemd[1]: session-21.scope: Deactivated successfully.
Apr 16 00:22:27.517906 systemd-logind[1452]: Removed session 21.
Apr 16 00:22:27.917069 systemd[1]: Started sshd@21-78.46.194.74:22-123.245.84.35:38535.service - OpenSSH per-connection server daemon (123.245.84.35:38535).
Apr 16 00:22:28.336187 systemd[1]: Started sshd@22-78.46.194.74:22-144.123.77.12:48871.service - OpenSSH per-connection server daemon (144.123.77.12:48871).
Apr 16 00:22:28.946843 sshd[6479]: Connection closed by 123.245.84.35 port 38535
Apr 16 00:22:28.948960 systemd[1]: sshd@21-78.46.194.74:22-123.245.84.35:38535.service: Deactivated successfully.
Apr 16 00:22:42.002726 systemd[1]: cri-containerd-85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a.scope: Deactivated successfully.
Apr 16 00:22:42.006120 systemd[1]: cri-containerd-85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a.scope: Consumed 17.918s CPU time.
Apr 16 00:22:42.033490 containerd[1477]: time="2026-04-16T00:22:42.031844560Z" level=info msg="shim disconnected" id=85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a namespace=k8s.io
Apr 16 00:22:42.033490 containerd[1477]: time="2026-04-16T00:22:42.031927562Z" level=warning msg="cleaning up after shim disconnected" id=85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a namespace=k8s.io
Apr 16 00:22:42.033490 containerd[1477]: time="2026-04-16T00:22:42.031935962Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:22:42.033123 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a-rootfs.mount: Deactivated successfully.
Apr 16 00:22:42.390744 kubelet[2576]: I0416 00:22:42.390432 2576 scope.go:122] "RemoveContainer" containerID="85476f835616919ed68da93e0f3cc2a0db8dc5f23f6e16672476e68685618a1a"
Apr 16 00:22:42.395021 containerd[1477]: time="2026-04-16T00:22:42.394959127Z" level=info msg="CreateContainer within sandbox \"1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Apr 16 00:22:42.414981 containerd[1477]: time="2026-04-16T00:22:42.414903216Z" level=info msg="CreateContainer within sandbox \"1acf50b3fb716e85622da341e438eee949ae218f340edc6a637a418bf38e3555\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"1d1bc8e521e5583ef7bd25e7d4511f16e3cecd61bdf5144febc2d65a30c6401f\""
Apr 16 00:22:42.417386 containerd[1477]: time="2026-04-16T00:22:42.415766475Z" level=info msg="StartContainer for \"1d1bc8e521e5583ef7bd25e7d4511f16e3cecd61bdf5144febc2d65a30c6401f\""
Apr 16 00:22:42.422209 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount874577290.mount: Deactivated successfully.
Apr 16 00:22:42.430150 kubelet[2576]: E0416 00:22:42.429977 2576 controller.go:251] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:54216->10.0.0.2:2379: read: connection timed out"
Apr 16 00:22:42.432980 systemd[1]: cri-containerd-0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c.scope: Deactivated successfully.
Apr 16 00:22:42.433271 systemd[1]: cri-containerd-0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c.scope: Consumed 3.629s CPU time, 15.8M memory peak, 0B memory swap peak.
Apr 16 00:22:42.484808 systemd[1]: Started cri-containerd-1d1bc8e521e5583ef7bd25e7d4511f16e3cecd61bdf5144febc2d65a30c6401f.scope - libcontainer container 1d1bc8e521e5583ef7bd25e7d4511f16e3cecd61bdf5144febc2d65a30c6401f.
Apr 16 00:22:42.498796 containerd[1477]: time="2026-04-16T00:22:42.497250428Z" level=info msg="shim disconnected" id=0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c namespace=k8s.io
Apr 16 00:22:42.498796 containerd[1477]: time="2026-04-16T00:22:42.497304629Z" level=warning msg="cleaning up after shim disconnected" id=0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c namespace=k8s.io
Apr 16 00:22:42.498796 containerd[1477]: time="2026-04-16T00:22:42.497312789Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 16 00:22:42.538375 containerd[1477]: time="2026-04-16T00:22:42.538064226Z" level=info msg="StartContainer for \"1d1bc8e521e5583ef7bd25e7d4511f16e3cecd61bdf5144febc2d65a30c6401f\" returns successfully"
Apr 16 00:22:43.033789 systemd[1]: run-containerd-runc-k8s.io-1d1bc8e521e5583ef7bd25e7d4511f16e3cecd61bdf5144febc2d65a30c6401f-runc.72r0SV.mount: Deactivated successfully.
Apr 16 00:22:43.034007 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c-rootfs.mount: Deactivated successfully.
Apr 16 00:22:43.316548 sshd[6481]: Connection closed by 144.123.77.12 port 48871 [preauth]
Apr 16 00:22:43.319665 systemd[1]: sshd@22-78.46.194.74:22-144.123.77.12:48871.service: Deactivated successfully.
Apr 16 00:22:43.398133 kubelet[2576]: I0416 00:22:43.398078 2576 scope.go:122] "RemoveContainer" containerID="0eb14f04854d02543ee0a1304f51c00c924ef307180d6ae4785116d7ada7094c"
Apr 16 00:22:43.400741 containerd[1477]: time="2026-04-16T00:22:43.400567408Z" level=info msg="CreateContainer within sandbox \"0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Apr 16 00:22:43.417859 containerd[1477]: time="2026-04-16T00:22:43.417791111Z" level=info msg="CreateContainer within sandbox \"0a43f610222e62207ccb773daa2be4b9c442e390a43e999200db7d2f61c8ede9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"55aa379a7094cc2bc6e4419b71b41d83c06537d26d116cf8e71d2eb3031d25fc\""
Apr 16 00:22:43.418802 containerd[1477]: time="2026-04-16T00:22:43.418407925Z" level=info msg="StartContainer for \"55aa379a7094cc2bc6e4419b71b41d83c06537d26d116cf8e71d2eb3031d25fc\""
Apr 16 00:22:43.462152 systemd[1]: Started cri-containerd-55aa379a7094cc2bc6e4419b71b41d83c06537d26d116cf8e71d2eb3031d25fc.scope - libcontainer container 55aa379a7094cc2bc6e4419b71b41d83c06537d26d116cf8e71d2eb3031d25fc.
Apr 16 00:22:43.502681 containerd[1477]: time="2026-04-16T00:22:43.502557958Z" level=info msg="StartContainer for \"55aa379a7094cc2bc6e4419b71b41d83c06537d26d116cf8e71d2eb3031d25fc\" returns successfully"