Mar 7 00:52:29.875952 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 7 00:52:29.875977 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Mar 6 22:59:59 -00 2026
Mar 7 00:52:29.875987 kernel: KASLR enabled
Mar 7 00:52:29.875993 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 7 00:52:29.875998 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x138595418 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Mar 7 00:52:29.876004 kernel: random: crng init done
Mar 7 00:52:29.876011 kernel: ACPI: Early table checksum verification disabled
Mar 7 00:52:29.876017 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Mar 7 00:52:29.876024 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Mar 7 00:52:29.876031 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876039 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876044 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876050 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876056 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876064 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876072 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876078 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876085 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 00:52:29.876091 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 7 00:52:29.876097 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 7 00:52:29.876104 kernel: NUMA: Failed to initialise from firmware
Mar 7 00:52:29.876110 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Mar 7 00:52:29.876117 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff]
Mar 7 00:52:29.876123 kernel: Zone ranges:
Mar 7 00:52:29.876130 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Mar 7 00:52:29.876138 kernel: DMA32 empty
Mar 7 00:52:29.876145 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Mar 7 00:52:29.876151 kernel: Movable zone start for each node
Mar 7 00:52:29.876157 kernel: Early memory node ranges
Mar 7 00:52:29.876163 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff]
Mar 7 00:52:29.876170 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Mar 7 00:52:29.876176 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Mar 7 00:52:29.876182 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Mar 7 00:52:29.876189 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Mar 7 00:52:29.876195 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Mar 7 00:52:29.876201 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Mar 7 00:52:29.876208 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Mar 7 00:52:29.876216 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 7 00:52:29.876222 kernel: psci: probing for conduit method from ACPI.
Mar 7 00:52:29.876228 kernel: psci: PSCIv1.1 detected in firmware.
Mar 7 00:52:29.876238 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 7 00:52:29.876244 kernel: psci: Trusted OS migration not required
Mar 7 00:52:29.876251 kernel: psci: SMC Calling Convention v1.1
Mar 7 00:52:29.876259 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 7 00:52:29.876266 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Mar 7 00:52:29.876273 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Mar 7 00:52:29.876280 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 7 00:52:29.876286 kernel: Detected PIPT I-cache on CPU0
Mar 7 00:52:29.876293 kernel: CPU features: detected: GIC system register CPU interface
Mar 7 00:52:29.876300 kernel: CPU features: detected: Hardware dirty bit management
Mar 7 00:52:29.876307 kernel: CPU features: detected: Spectre-v4
Mar 7 00:52:29.876313 kernel: CPU features: detected: Spectre-BHB
Mar 7 00:52:29.876320 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 7 00:52:29.876328 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 7 00:52:29.876335 kernel: CPU features: detected: ARM erratum 1418040
Mar 7 00:52:29.876342 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 7 00:52:29.876349 kernel: alternatives: applying boot alternatives
Mar 7 00:52:29.876357 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6
Mar 7 00:52:29.876364 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 00:52:29.876371 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 00:52:29.876381 kernel: Fallback order for Node 0: 0
Mar 7 00:52:29.876388 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Mar 7 00:52:29.876395 kernel: Policy zone: Normal
Mar 7 00:52:29.876402 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 00:52:29.876410 kernel: software IO TLB: area num 2.
Mar 7 00:52:29.876417 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Mar 7 00:52:29.876424 kernel: Memory: 3882816K/4096000K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 213184K reserved, 0K cma-reserved)
Mar 7 00:52:29.876431 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 00:52:29.876438 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 00:52:29.876445 kernel: rcu: RCU event tracing is enabled.
Mar 7 00:52:29.876452 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 00:52:29.876459 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 00:52:29.876466 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 00:52:29.876473 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 00:52:29.876479 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 00:52:29.876486 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 7 00:52:29.876495 kernel: GICv3: 256 SPIs implemented
Mar 7 00:52:29.876501 kernel: GICv3: 0 Extended SPIs implemented
Mar 7 00:52:29.876508 kernel: Root IRQ handler: gic_handle_irq
Mar 7 00:52:29.876515 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 7 00:52:29.876522 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 7 00:52:29.876529 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 7 00:52:29.876536 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Mar 7 00:52:29.876544 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Mar 7 00:52:29.876551 kernel: GICv3: using LPI property table @0x00000001000e0000
Mar 7 00:52:29.876558 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Mar 7 00:52:29.876565 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 00:52:29.876608 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 7 00:52:29.876616 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 7 00:52:29.876623 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 7 00:52:29.876630 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 7 00:52:29.876637 kernel: Console: colour dummy device 80x25
Mar 7 00:52:29.876644 kernel: ACPI: Core revision 20230628
Mar 7 00:52:29.876651 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 7 00:52:29.876659 kernel: pid_max: default: 32768 minimum: 301
Mar 7 00:52:29.876666 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 7 00:52:29.876673 kernel: landlock: Up and running.
Mar 7 00:52:29.878834 kernel: SELinux: Initializing.
Mar 7 00:52:29.878844 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 00:52:29.878852 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 00:52:29.878859 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 00:52:29.878867 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 00:52:29.878874 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 00:52:29.878882 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 00:52:29.878889 kernel: Platform MSI: ITS@0x8080000 domain created
Mar 7 00:52:29.878896 kernel: PCI/MSI: ITS@0x8080000 domain created
Mar 7 00:52:29.878905 kernel: Remapping and enabling EFI services.
Mar 7 00:52:29.878913 kernel: smp: Bringing up secondary CPUs ...
Mar 7 00:52:29.878920 kernel: Detected PIPT I-cache on CPU1
Mar 7 00:52:29.878927 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 7 00:52:29.878934 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Mar 7 00:52:29.878941 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 7 00:52:29.878948 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 7 00:52:29.878955 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 00:52:29.878962 kernel: SMP: Total of 2 processors activated.
Mar 7 00:52:29.878969 kernel: CPU features: detected: 32-bit EL0 Support
Mar 7 00:52:29.878978 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 7 00:52:29.878985 kernel: CPU features: detected: Common not Private translations
Mar 7 00:52:29.878997 kernel: CPU features: detected: CRC32 instructions
Mar 7 00:52:29.879006 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 7 00:52:29.879014 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 7 00:52:29.879021 kernel: CPU features: detected: LSE atomic instructions
Mar 7 00:52:29.879028 kernel: CPU features: detected: Privileged Access Never
Mar 7 00:52:29.879036 kernel: CPU features: detected: RAS Extension Support
Mar 7 00:52:29.879045 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 7 00:52:29.879052 kernel: CPU: All CPU(s) started at EL1
Mar 7 00:52:29.879060 kernel: alternatives: applying system-wide alternatives
Mar 7 00:52:29.879067 kernel: devtmpfs: initialized
Mar 7 00:52:29.879075 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 00:52:29.879082 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 00:52:29.879090 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 00:52:29.879097 kernel: SMBIOS 3.0.0 present.
Mar 7 00:52:29.879106 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Mar 7 00:52:29.879114 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 00:52:29.879121 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 7 00:52:29.879129 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 7 00:52:29.879136 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 7 00:52:29.879144 kernel: audit: initializing netlink subsys (disabled)
Mar 7 00:52:29.879151 kernel: audit: type=2000 audit(0.017:1): state=initialized audit_enabled=0 res=1
Mar 7 00:52:29.879159 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 00:52:29.879166 kernel: cpuidle: using governor menu
Mar 7 00:52:29.879175 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 7 00:52:29.879183 kernel: ASID allocator initialised with 32768 entries
Mar 7 00:52:29.879190 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 00:52:29.879198 kernel: Serial: AMBA PL011 UART driver
Mar 7 00:52:29.879205 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 7 00:52:29.879213 kernel: Modules: 0 pages in range for non-PLT usage
Mar 7 00:52:29.879220 kernel: Modules: 509008 pages in range for PLT usage
Mar 7 00:52:29.879228 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 00:52:29.879235 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 00:52:29.879244 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 7 00:52:29.879251 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 7 00:52:29.879259 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 00:52:29.879266 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 00:52:29.879274 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 7 00:52:29.879281 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 7 00:52:29.879288 kernel: ACPI: Added _OSI(Module Device)
Mar 7 00:52:29.879296 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 00:52:29.879303 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 00:52:29.879313 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 00:52:29.879321 kernel: ACPI: Interpreter enabled
Mar 7 00:52:29.879328 kernel: ACPI: Using GIC for interrupt routing
Mar 7 00:52:29.879335 kernel: ACPI: MCFG table detected, 1 entries
Mar 7 00:52:29.879343 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 7 00:52:29.879351 kernel: printk: console [ttyAMA0] enabled
Mar 7 00:52:29.879358 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 7 00:52:29.879523 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 7 00:52:29.879626 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 7 00:52:29.879725 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 7 00:52:29.879790 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 7 00:52:29.879853 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 7 00:52:29.879863 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 7 00:52:29.879870 kernel: PCI host bridge to bus 0000:00
Mar 7 00:52:29.879948 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 7 00:52:29.880006 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 7 00:52:29.880069 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 7 00:52:29.880125 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 7 00:52:29.880202 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Mar 7 00:52:29.880276 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Mar 7 00:52:29.880343 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Mar 7 00:52:29.880410 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Mar 7 00:52:29.880486 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.880552 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Mar 7 00:52:29.880640 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.880730 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Mar 7 00:52:29.880805 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.880871 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Mar 7 00:52:29.880947 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.881012 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Mar 7 00:52:29.881086 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.881151 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Mar 7 00:52:29.881227 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.881293 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Mar 7 00:52:29.881367 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.881435 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Mar 7 00:52:29.881507 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.881605 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Mar 7 00:52:29.881707 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Mar 7 00:52:29.881781 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Mar 7 00:52:29.881859 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Mar 7 00:52:29.881926 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Mar 7 00:52:29.882001 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Mar 7 00:52:29.882069 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Mar 7 00:52:29.882137 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 7 00:52:29.882205 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 7 00:52:29.882279 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 7 00:52:29.882353 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Mar 7 00:52:29.882428 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Mar 7 00:52:29.882497 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Mar 7 00:52:29.882566 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 7 00:52:29.882667 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Mar 7 00:52:29.887871 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 7 00:52:29.887973 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 7 00:52:29.888043 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
Mar 7 00:52:29.888112 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 7 00:52:29.888187 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Mar 7 00:52:29.888256 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Mar 7 00:52:29.888323 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 7 00:52:29.888403 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Mar 7 00:52:29.888471 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Mar 7 00:52:29.888539 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Mar 7 00:52:29.888652 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 7 00:52:29.888751 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Mar 7 00:52:29.888821 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Mar 7 00:52:29.888887 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Mar 7 00:52:29.888962 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Mar 7 00:52:29.889028 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Mar 7 00:52:29.889093 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Mar 7 00:52:29.889164 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 7 00:52:29.889229 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Mar 7 00:52:29.889303 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Mar 7 00:52:29.889372 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 7 00:52:29.889439 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Mar 7 00:52:29.889508 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Mar 7 00:52:29.889592 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 7 00:52:29.889665 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Mar 7 00:52:29.889773 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Mar 7 00:52:29.889846 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 7 00:52:29.889913 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Mar 7 00:52:29.889977 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Mar 7 00:52:29.890050 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 7 00:52:29.890115 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Mar 7 00:52:29.890180 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Mar 7 00:52:29.890249 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 7 00:52:29.890315 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Mar 7 00:52:29.890381 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Mar 7 00:52:29.890452 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 7 00:52:29.890518 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Mar 7 00:52:29.890604 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Mar 7 00:52:29.890674 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Mar 7 00:52:29.890779 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 7 00:52:29.890846 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Mar 7 00:52:29.890911 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 7 00:52:29.890977 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Mar 7 00:52:29.891046 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 7 00:52:29.891111 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Mar 7 00:52:29.891176 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 7 00:52:29.891241 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Mar 7 00:52:29.891305 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 7 00:52:29.891370 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Mar 7 00:52:29.891435 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 7 00:52:29.891505 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Mar 7 00:52:29.891599 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 7 00:52:29.891705 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Mar 7 00:52:29.891790 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 7 00:52:29.891859 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Mar 7 00:52:29.891926 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 7 00:52:29.891997 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Mar 7 00:52:29.892068 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Mar 7 00:52:29.892134 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Mar 7 00:52:29.892199 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Mar 7 00:52:29.892265 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Mar 7 00:52:29.892331 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Mar 7 00:52:29.892397 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Mar 7 00:52:29.892462 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Mar 7 00:52:29.892527 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Mar 7 00:52:29.892612 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Mar 7 00:52:29.892694 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Mar 7 00:52:29.892762 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Mar 7 00:52:29.892829 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Mar 7 00:52:29.892894 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Mar 7 00:52:29.892960 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Mar 7 00:52:29.893024 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Mar 7 00:52:29.893089 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Mar 7 00:52:29.893159 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Mar 7 00:52:29.893224 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Mar 7 00:52:29.893288 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Mar 7 00:52:29.893358 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Mar 7 00:52:29.893431 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Mar 7 00:52:29.893501 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 7 00:52:29.893600 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Mar 7 00:52:29.893709 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 7 00:52:29.893787 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 7 00:52:29.893853 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Mar 7 00:52:29.893917 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 7 00:52:29.893989 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Mar 7 00:52:29.894060 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 7 00:52:29.894125 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 7 00:52:29.894189 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Mar 7 00:52:29.894254 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 7 00:52:29.894328 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 7 00:52:29.894429 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Mar 7 00:52:29.894513 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 7 00:52:29.894618 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 7 00:52:29.894782 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Mar 7 00:52:29.894864 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 7 00:52:29.894935 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Mar 7 00:52:29.895000 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 7 00:52:29.895065 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 7 00:52:29.895128 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Mar 7 00:52:29.895191 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 7 00:52:29.895263 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Mar 7 00:52:29.895333 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
Mar 7 00:52:29.895396 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 7 00:52:29.895459 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 7 00:52:29.895523 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Mar 7 00:52:29.895603 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 7 00:52:29.895687 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Mar 7 00:52:29.895760 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Mar 7 00:52:29.895824 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 7 00:52:29.895894 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 7 00:52:29.895962 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Mar 7 00:52:29.896026 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 7 00:52:29.896097 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Mar 7 00:52:29.896164 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Mar 7 00:52:29.896232 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Mar 7 00:52:29.896296 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 7 00:52:29.896361 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 7 00:52:29.896429 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Mar 7 00:52:29.896495 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 7 00:52:29.896560 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 7 00:52:29.896668 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 7 00:52:29.896755 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Mar 7 00:52:29.896826 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 7 00:52:29.896895 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 7 00:52:29.896960 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Mar 7 00:52:29.897030 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Mar 7 00:52:29.897094 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 7 00:52:29.897162 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Mar 7 00:52:29.897221 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 7 00:52:29.897279 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Mar 7 00:52:29.897350 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 7 00:52:29.897410 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Mar 7 00:52:29.897474 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 7 00:52:29.897542 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Mar 7 00:52:29.897617 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Mar 7 00:52:29.897689 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 7 00:52:29.897760 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Mar 7 00:52:29.897821 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Mar 7 00:52:29.897885 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 7 00:52:29.897953 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Mar 7 00:52:29.898013 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Mar 7 00:52:29.898087 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 7 00:52:29.898164 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Mar 7 00:52:29.898226 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Mar 7 00:52:29.898299 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 7 00:52:29.898373 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Mar 7 00:52:29.898435 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Mar 7 00:52:29.898498 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 7 00:52:29.898578 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Mar 7 00:52:29.898664 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Mar 7 00:52:29.899904 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 7 00:52:29.899989 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Mar 7 00:52:29.900050 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Mar 7 00:52:29.900110 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 7 00:52:29.900177 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Mar 7 00:52:29.900238 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Mar 7 00:52:29.900305 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 7 00:52:29.900315 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Mar 7 00:52:29.900324 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Mar 7 00:52:29.900333 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Mar 7 00:52:29.900343 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Mar 7 00:52:29.900351 kernel: iommu: Default domain type: Translated
Mar 7 00:52:29.900360 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 7 00:52:29.900368 kernel: efivars: Registered efivars operations
Mar 7 00:52:29.900375 kernel: vgaarb: loaded
Mar 7 00:52:29.900385 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 7 00:52:29.900393 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 00:52:29.900401 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 00:52:29.900410 kernel: pnp: PnP ACPI init
Mar 7 00:52:29.900485 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Mar 7 00:52:29.900498 kernel: pnp: PnP ACPI: found 1 devices
Mar 7 00:52:29.900506 kernel: NET: Registered PF_INET protocol family
Mar 7 00:52:29.900514 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 00:52:29.900524 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 00:52:29.900532 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 00:52:29.900540 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 00:52:29.900548 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 00:52:29.900556 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 00:52:29.900564 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 00:52:29.900589 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 00:52:29.900599 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 00:52:29.900693 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Mar 7 00:52:29.900710 kernel: PCI: CLS 0 bytes, default 64
Mar 7 00:52:29.900725 kernel: kvm [1]: HYP mode not available
Mar 7 00:52:29.900736 kernel: Initialise system trusted keyrings
Mar 7 00:52:29.900745 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 7 00:52:29.900753 kernel: Key type asymmetric registered
Mar 7 00:52:29.900760 kernel: Asymmetric key parser 'x509' registered
Mar 7 00:52:29.900768 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 7 00:52:29.900776 kernel: io scheduler mq-deadline registered
Mar 7 00:52:29.900784 kernel: io scheduler kyber registered
Mar 7 00:52:29.900794 kernel: io scheduler bfq registered
Mar 7 00:52:29.900803 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Mar 7 00:52:29.900880 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Mar 7 00:52:29.900948 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Mar 7 00:52:29.901013 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 7 00:52:29.901082 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Mar 7 00:52:29.901148 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Mar 7 00:52:29.901215 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 7 00:52:29.901281 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Mar 7 00:52:29.901345 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52
Mar 7 00:52:29.901409 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 7 00:52:29.901476
kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Mar 7 00:52:29.901541 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Mar 7 00:52:29.901630 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 7 00:52:29.904816 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Mar 7 00:52:29.904911 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Mar 7 00:52:29.904980 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 7 00:52:29.905049 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Mar 7 00:52:29.905115 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Mar 7 00:52:29.905192 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 7 00:52:29.905261 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Mar 7 00:52:29.905326 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Mar 7 00:52:29.905391 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 7 00:52:29.905459 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Mar 7 00:52:29.905524 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Mar 7 00:52:29.905610 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 7 00:52:29.905623 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Mar 7 00:52:29.905703 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Mar 7 00:52:29.905774 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Mar 7 00:52:29.905839 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- 
IbPresDis- LLActRep+ Mar 7 00:52:29.905851 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 7 00:52:29.905862 kernel: ACPI: button: Power Button [PWRB] Mar 7 00:52:29.905871 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 7 00:52:29.905945 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Mar 7 00:52:29.906018 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Mar 7 00:52:29.906029 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 7 00:52:29.906040 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 7 00:52:29.906110 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Mar 7 00:52:29.906121 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Mar 7 00:52:29.906129 kernel: thunder_xcv, ver 1.0 Mar 7 00:52:29.906139 kernel: thunder_bgx, ver 1.0 Mar 7 00:52:29.906147 kernel: nicpf, ver 1.0 Mar 7 00:52:29.906155 kernel: nicvf, ver 1.0 Mar 7 00:52:29.906241 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 7 00:52:29.906305 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-07T00:52:29 UTC (1772844749) Mar 7 00:52:29.906315 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 7 00:52:29.906324 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Mar 7 00:52:29.906332 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 7 00:52:29.906342 kernel: watchdog: Hard watchdog permanently disabled Mar 7 00:52:29.906350 kernel: NET: Registered PF_INET6 protocol family Mar 7 00:52:29.906358 kernel: Segment Routing with IPv6 Mar 7 00:52:29.906366 kernel: In-situ OAM (IOAM) with IPv6 Mar 7 00:52:29.906374 kernel: NET: Registered PF_PACKET protocol family Mar 7 00:52:29.906382 kernel: Key type dns_resolver registered Mar 7 00:52:29.906390 kernel: registered taskstats version 1 Mar 7 00:52:29.906397 kernel: Loading compiled-in X.509 certificates Mar 7 00:52:29.906405 kernel: Loaded X.509 cert 
'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: e62b4e4ebcb406beff1271ecc7444548c4ab67e9' Mar 7 00:52:29.906415 kernel: Key type .fscrypt registered Mar 7 00:52:29.906422 kernel: Key type fscrypt-provisioning registered Mar 7 00:52:29.906430 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 7 00:52:29.906437 kernel: ima: Allocated hash algorithm: sha1 Mar 7 00:52:29.906445 kernel: ima: No architecture policies found Mar 7 00:52:29.906453 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 7 00:52:29.906461 kernel: clk: Disabling unused clocks Mar 7 00:52:29.906469 kernel: Freeing unused kernel memory: 39424K Mar 7 00:52:29.906477 kernel: Run /init as init process Mar 7 00:52:29.906486 kernel: with arguments: Mar 7 00:52:29.906494 kernel: /init Mar 7 00:52:29.906501 kernel: with environment: Mar 7 00:52:29.906509 kernel: HOME=/ Mar 7 00:52:29.906516 kernel: TERM=linux Mar 7 00:52:29.906526 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 7 00:52:29.906536 systemd[1]: Detected virtualization kvm. Mar 7 00:52:29.906544 systemd[1]: Detected architecture arm64. Mar 7 00:52:29.906553 systemd[1]: Running in initrd. Mar 7 00:52:29.906561 systemd[1]: No hostname configured, using default hostname. Mar 7 00:52:29.906579 systemd[1]: Hostname set to . Mar 7 00:52:29.906590 systemd[1]: Initializing machine ID from VM UUID. Mar 7 00:52:29.906598 systemd[1]: Queued start job for default target initrd.target. Mar 7 00:52:29.906607 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Mar 7 00:52:29.906615 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 7 00:52:29.906624 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 7 00:52:29.906635 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 7 00:52:29.906643 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 7 00:52:29.906651 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 7 00:52:29.906661 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 7 00:52:29.906670 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 7 00:52:29.907377 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 7 00:52:29.907391 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 7 00:52:29.907406 systemd[1]: Reached target paths.target - Path Units. Mar 7 00:52:29.907415 systemd[1]: Reached target slices.target - Slice Units. Mar 7 00:52:29.907423 systemd[1]: Reached target swap.target - Swaps. Mar 7 00:52:29.907431 systemd[1]: Reached target timers.target - Timer Units. Mar 7 00:52:29.907440 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 7 00:52:29.907448 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 7 00:52:29.907458 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 7 00:52:29.907466 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 7 00:52:29.907475 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 7 00:52:29.907485 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. 
Mar 7 00:52:29.907493 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 7 00:52:29.907502 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 00:52:29.907510 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 7 00:52:29.907518 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 7 00:52:29.907527 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 7 00:52:29.907536 systemd[1]: Starting systemd-fsck-usr.service... Mar 7 00:52:29.907544 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 7 00:52:29.907554 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 7 00:52:29.907562 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:52:29.907642 systemd-journald[237]: Collecting audit messages is disabled. Mar 7 00:52:29.907665 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 7 00:52:29.907742 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 00:52:29.907753 systemd[1]: Finished systemd-fsck-usr.service. Mar 7 00:52:29.907762 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 7 00:52:29.907772 systemd-journald[237]: Journal started Mar 7 00:52:29.907796 systemd-journald[237]: Runtime Journal (/run/log/journal/766e6fdcf12b4089a377569d8a02740c) is 8.0M, max 76.6M, 68.6M free. Mar 7 00:52:29.896730 systemd-modules-load[238]: Inserted module 'overlay' Mar 7 00:52:29.908858 systemd[1]: Started systemd-journald.service - Journal Service. Mar 7 00:52:29.912040 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 00:52:29.912927 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Mar 7 00:52:29.919703 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 7 00:52:29.920954 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 00:52:29.923277 systemd-modules-load[238]: Inserted module 'br_netfilter' Mar 7 00:52:29.923784 kernel: Bridge firewalling registered Mar 7 00:52:29.925403 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 7 00:52:29.927856 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 7 00:52:29.929125 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 7 00:52:29.938528 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 7 00:52:29.946477 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 7 00:52:29.952481 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 7 00:52:29.953421 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 7 00:52:29.959926 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 00:52:29.965600 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 7 00:52:29.973881 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 7 00:52:29.986214 dracut-cmdline[278]: dracut-dracut-053 Mar 7 00:52:29.992299 systemd-resolved[270]: Positive Trust Anchors: Mar 7 00:52:29.992316 systemd-resolved[270]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 7 00:52:29.992347 systemd-resolved[270]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 00:52:30.000211 dracut-cmdline[278]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9d22c40559a0d209dc0fcc2dfdd5ddf9671e6da0cc59463f610ba522f01325a6 Mar 7 00:52:29.998131 systemd-resolved[270]: Defaulting to hostname 'linux'. Mar 7 00:52:29.999861 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 00:52:30.000499 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 00:52:30.083720 kernel: SCSI subsystem initialized Mar 7 00:52:30.088746 kernel: Loading iSCSI transport class v2.0-870. Mar 7 00:52:30.096726 kernel: iscsi: registered transport (tcp) Mar 7 00:52:30.110704 kernel: iscsi: registered transport (qla4xxx) Mar 7 00:52:30.110752 kernel: QLogic iSCSI HBA Driver Mar 7 00:52:30.160533 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 7 00:52:30.169946 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 7 00:52:30.189983 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. 
Duplicate IMA measurements will not be recorded in the IMA log. Mar 7 00:52:30.190068 kernel: device-mapper: uevent: version 1.0.3 Mar 7 00:52:30.190746 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 7 00:52:30.240764 kernel: raid6: neonx8 gen() 15725 MB/s Mar 7 00:52:30.257744 kernel: raid6: neonx4 gen() 15584 MB/s Mar 7 00:52:30.274727 kernel: raid6: neonx2 gen() 13186 MB/s Mar 7 00:52:30.291771 kernel: raid6: neonx1 gen() 10439 MB/s Mar 7 00:52:30.308749 kernel: raid6: int64x8 gen() 6931 MB/s Mar 7 00:52:30.325759 kernel: raid6: int64x4 gen() 7305 MB/s Mar 7 00:52:30.342746 kernel: raid6: int64x2 gen() 6098 MB/s Mar 7 00:52:30.359754 kernel: raid6: int64x1 gen() 5034 MB/s Mar 7 00:52:30.359820 kernel: raid6: using algorithm neonx8 gen() 15725 MB/s Mar 7 00:52:30.376746 kernel: raid6: .... xor() 11917 MB/s, rmw enabled Mar 7 00:52:30.376836 kernel: raid6: using neon recovery algorithm Mar 7 00:52:30.381829 kernel: xor: measuring software checksum speed Mar 7 00:52:30.381915 kernel: 8regs : 19769 MB/sec Mar 7 00:52:30.381971 kernel: 32regs : 19669 MB/sec Mar 7 00:52:30.382001 kernel: arm64_neon : 27052 MB/sec Mar 7 00:52:30.382718 kernel: xor: using function: arm64_neon (27052 MB/sec) Mar 7 00:52:30.431767 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 7 00:52:30.446560 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 7 00:52:30.454012 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 7 00:52:30.481515 systemd-udevd[459]: Using default interface naming scheme 'v255'. Mar 7 00:52:30.485754 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 7 00:52:30.493027 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 7 00:52:30.507507 dracut-pre-trigger[462]: rd.md=0: removing MD RAID activation Mar 7 00:52:30.545184 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 7 00:52:30.550919 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 7 00:52:30.601423 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 00:52:30.608930 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 7 00:52:30.634990 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 7 00:52:30.637244 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 7 00:52:30.639671 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 7 00:52:30.640846 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 7 00:52:30.649141 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 7 00:52:30.668340 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 7 00:52:30.721335 kernel: scsi host0: Virtio SCSI HBA Mar 7 00:52:30.725673 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 7 00:52:30.725758 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 7 00:52:30.732819 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 7 00:52:30.732877 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 7 00:52:30.728503 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 00:52:30.729792 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 00:52:30.731336 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 7 00:52:30.740655 kernel: ACPI: bus type USB registered Mar 7 00:52:30.740760 kernel: usbcore: registered new interface driver usbfs Mar 7 00:52:30.740802 kernel: usbcore: registered new interface driver hub Mar 7 00:52:30.740850 kernel: usbcore: registered new device driver usb Mar 7 00:52:30.734138 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:52:30.741896 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:52:30.752307 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 00:52:30.764082 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 00:52:30.772207 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 7 00:52:30.772434 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 7 00:52:30.774869 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 7 00:52:30.775020 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 7 00:52:30.775927 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 7 00:52:30.778708 kernel: sr 0:0:0:0: Power-on or device reset occurred Mar 7 00:52:30.778888 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 7 00:52:30.779017 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Mar 7 00:52:30.779105 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 7 00:52:30.779116 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Mar 7 00:52:30.779196 kernel: hub 1-0:1.0: USB hub found Mar 7 00:52:30.779303 kernel: hub 1-0:1.0: 4 ports detected Mar 7 00:52:30.781020 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Mar 7 00:52:30.781157 kernel: hub 2-0:1.0: USB hub found Mar 7 00:52:30.781702 kernel: hub 2-0:1.0: 4 ports detected Mar 7 00:52:30.799491 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 7 00:52:30.805734 kernel: sd 0:0:0:1: Power-on or device reset occurred Mar 7 00:52:30.807767 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Mar 7 00:52:30.807982 kernel: sd 0:0:0:1: [sda] Write Protect is off Mar 7 00:52:30.808774 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Mar 7 00:52:30.808941 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 7 00:52:30.813545 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 7 00:52:30.813624 kernel: GPT:17805311 != 80003071 Mar 7 00:52:30.813637 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 7 00:52:30.813647 kernel: GPT:17805311 != 80003071 Mar 7 00:52:30.813665 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 7 00:52:30.813685 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 00:52:30.815727 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Mar 7 00:52:30.857168 kernel: BTRFS: device fsid 237c8587-8110-47ef-99f9-37e4ed4d3b31 devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (508) Mar 7 00:52:30.858705 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (517) Mar 7 00:52:30.862996 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Mar 7 00:52:30.872621 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 7 00:52:30.879456 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 7 00:52:30.880223 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Mar 7 00:52:30.885592 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 7 00:52:30.897957 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 7 00:52:30.911705 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 00:52:30.912849 disk-uuid[578]: Primary Header is updated. Mar 7 00:52:30.912849 disk-uuid[578]: Secondary Entries is updated. Mar 7 00:52:30.912849 disk-uuid[578]: Secondary Header is updated. Mar 7 00:52:31.020782 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 7 00:52:31.158164 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Mar 7 00:52:31.158225 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 7 00:52:31.159376 kernel: usbcore: registered new interface driver usbhid Mar 7 00:52:31.159413 kernel: usbhid: USB HID core driver Mar 7 00:52:31.265737 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Mar 7 00:52:31.395721 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Mar 7 00:52:31.448304 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Mar 7 00:52:31.933494 disk-uuid[579]: The operation has completed successfully. Mar 7 00:52:31.934651 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 00:52:31.989097 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 7 00:52:31.989948 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 7 00:52:32.011931 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Mar 7 00:52:32.017693 sh[597]: Success Mar 7 00:52:32.031704 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 7 00:52:32.091413 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 7 00:52:32.093827 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 7 00:52:32.104868 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 7 00:52:32.120907 kernel: BTRFS info (device dm-0): first mount of filesystem 237c8587-8110-47ef-99f9-37e4ed4d3b31 Mar 7 00:52:32.120968 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 7 00:52:32.120986 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 7 00:52:32.121011 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 7 00:52:32.121712 kernel: BTRFS info (device dm-0): using free space tree Mar 7 00:52:32.127768 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 7 00:52:32.130424 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 7 00:52:32.131932 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 7 00:52:32.143999 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 7 00:52:32.149877 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 7 00:52:32.162945 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e Mar 7 00:52:32.162995 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 7 00:52:32.163769 kernel: BTRFS info (device sda6): using free space tree Mar 7 00:52:32.168690 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 7 00:52:32.168749 kernel: BTRFS info (device sda6): auto enabling async discard Mar 7 00:52:32.181720 kernel: BTRFS info (device sda6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e Mar 7 00:52:32.181418 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 7 00:52:32.188939 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 7 00:52:32.197962 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 7 00:52:32.278584 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 7 00:52:32.289902 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 7 00:52:32.296660 ignition[701]: Ignition 2.19.0 Mar 7 00:52:32.296670 ignition[701]: Stage: fetch-offline Mar 7 00:52:32.296720 ignition[701]: no configs at "/usr/lib/ignition/base.d" Mar 7 00:52:32.298912 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Mar 7 00:52:32.296728 ignition[701]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 7 00:52:32.296885 ignition[701]: parsed url from cmdline: "" Mar 7 00:52:32.296888 ignition[701]: no config URL provided Mar 7 00:52:32.296892 ignition[701]: reading system config file "/usr/lib/ignition/user.ign" Mar 7 00:52:32.296899 ignition[701]: no config at "/usr/lib/ignition/user.ign" Mar 7 00:52:32.296904 ignition[701]: failed to fetch config: resource requires networking Mar 7 00:52:32.297066 ignition[701]: Ignition finished successfully Mar 7 00:52:32.318801 systemd-networkd[783]: lo: Link UP Mar 7 00:52:32.318817 systemd-networkd[783]: lo: Gained carrier Mar 7 00:52:32.320996 systemd-networkd[783]: Enumeration completed Mar 7 00:52:32.322034 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 7 00:52:32.322034 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:32.322038 systemd-networkd[783]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 00:52:32.323425 systemd[1]: Reached target network.target - Network. Mar 7 00:52:32.323949 systemd-networkd[783]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:32.323952 systemd-networkd[783]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 00:52:32.324473 systemd-networkd[783]: eth0: Link UP Mar 7 00:52:32.324476 systemd-networkd[783]: eth0: Gained carrier Mar 7 00:52:32.324482 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 7 00:52:32.330997 systemd-networkd[783]: eth1: Link UP
Mar 7 00:52:32.331001 systemd-networkd[783]: eth1: Gained carrier
Mar 7 00:52:32.331011 systemd-networkd[783]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 00:52:32.334263 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 00:52:32.346986 ignition[786]: Ignition 2.19.0
Mar 7 00:52:32.347579 ignition[786]: Stage: fetch
Mar 7 00:52:32.347784 ignition[786]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:52:32.347795 ignition[786]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 00:52:32.347898 ignition[786]: parsed url from cmdline: ""
Mar 7 00:52:32.347902 ignition[786]: no config URL provided
Mar 7 00:52:32.347907 ignition[786]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 00:52:32.347914 ignition[786]: no config at "/usr/lib/ignition/user.ign"
Mar 7 00:52:32.347932 ignition[786]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 7 00:52:32.349749 ignition[786]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 7 00:52:32.370789 systemd-networkd[783]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 7 00:52:32.382821 systemd-networkd[783]: eth0: DHCPv4 address 116.202.17.139/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 7 00:52:32.550650 ignition[786]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 7 00:52:32.555822 ignition[786]: GET result: OK
Mar 7 00:52:32.555930 ignition[786]: parsing config with SHA512: 20b077620c098abc0be1fa6556373bb633d76240c895946cb4a37afc4f580c634f3037145859d938bdbc4fee013eab78e67c1313dfadfb5a761a6802e18e33eb
Mar 7 00:52:32.560168 unknown[786]: fetched base config from "system"
Mar 7 00:52:32.560178 unknown[786]: fetched base config from "system"
Mar 7 00:52:32.560601 ignition[786]: fetch: fetch complete
Mar 7 00:52:32.560183 unknown[786]: fetched user config from "hetzner"
Mar 7 00:52:32.560608 ignition[786]: fetch: fetch passed
Mar 7 00:52:32.560656 ignition[786]: Ignition finished successfully
Mar 7 00:52:32.564244 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 00:52:32.571904 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 00:52:32.585724 ignition[794]: Ignition 2.19.0
Mar 7 00:52:32.585749 ignition[794]: Stage: kargs
Mar 7 00:52:32.585960 ignition[794]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:52:32.585972 ignition[794]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 00:52:32.589743 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 00:52:32.587386 ignition[794]: kargs: kargs passed
Mar 7 00:52:32.587459 ignition[794]: Ignition finished successfully
Mar 7 00:52:32.596151 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 00:52:32.607772 ignition[800]: Ignition 2.19.0
Mar 7 00:52:32.607786 ignition[800]: Stage: disks
Mar 7 00:52:32.608006 ignition[800]: no configs at "/usr/lib/ignition/base.d"
Mar 7 00:52:32.608016 ignition[800]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 00:52:32.609849 ignition[800]: disks: disks passed
Mar 7 00:52:32.609919 ignition[800]: Ignition finished successfully
Mar 7 00:52:32.613666 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 00:52:32.615744 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 00:52:32.617216 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 00:52:32.617949 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 00:52:32.619359 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 00:52:32.620732 systemd[1]: Reached target basic.target - Basic System.
Mar 7 00:52:32.627971 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 00:52:32.647533 systemd-fsck[808]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 7 00:52:32.652755 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 00:52:32.658872 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 00:52:32.700691 kernel: EXT4-fs (sda9): mounted filesystem 596a8ea8-9d3d-4d06-a56e-9d3ebd3cb76d r/w with ordered data mode. Quota mode: none.
Mar 7 00:52:32.699977 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 00:52:32.701933 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:52:32.713840 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 00:52:32.718164 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 00:52:32.721869 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 7 00:52:32.724043 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 00:52:32.726696 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (816)
Mar 7 00:52:32.725232 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 00:52:32.730293 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:52:32.730424 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:52:32.731807 kernel: BTRFS info (device sda6): using free space tree
Mar 7 00:52:32.731306 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 00:52:32.735467 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 00:52:32.738063 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 00:52:32.738083 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 00:52:32.739776 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 00:52:32.799702 initrd-setup-root[843]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 00:52:32.802247 coreos-metadata[818]: Mar 07 00:52:32.802 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 7 00:52:32.806607 coreos-metadata[818]: Mar 07 00:52:32.804 INFO Fetch successful
Mar 7 00:52:32.806607 coreos-metadata[818]: Mar 07 00:52:32.804 INFO wrote hostname ci-4081-3-6-n-f47b87f6f2 to /sysroot/etc/hostname
Mar 7 00:52:32.808643 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 00:52:32.812480 initrd-setup-root[851]: cut: /sysroot/etc/group: No such file or directory
Mar 7 00:52:32.817518 initrd-setup-root[858]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 00:52:32.821588 initrd-setup-root[865]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 00:52:32.933758 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 00:52:32.939829 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 00:52:32.944719 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 00:52:32.951711 kernel: BTRFS info (device sda6): last unmount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:52:32.983362 ignition[933]: INFO : Ignition 2.19.0
Mar 7 00:52:32.983362 ignition[933]: INFO : Stage: mount
Mar 7 00:52:32.984480 ignition[933]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:52:32.984480 ignition[933]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 00:52:32.988226 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 00:52:32.992614 ignition[933]: INFO : mount: mount passed
Mar 7 00:52:32.992614 ignition[933]: INFO : Ignition finished successfully
Mar 7 00:52:32.993176 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 00:52:32.998888 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 00:52:33.120208 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 00:52:33.128006 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 00:52:33.136696 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (944)
Mar 7 00:52:33.138755 kernel: BTRFS info (device sda6): first mount of filesystem 6e876a94-9f11-430e-8016-2af72863cd2e
Mar 7 00:52:33.138790 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 7 00:52:33.138802 kernel: BTRFS info (device sda6): using free space tree
Mar 7 00:52:33.142689 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 00:52:33.142725 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 00:52:33.145410 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 00:52:33.169792 ignition[960]: INFO : Ignition 2.19.0
Mar 7 00:52:33.170485 ignition[960]: INFO : Stage: files
Mar 7 00:52:33.171099 ignition[960]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:52:33.171663 ignition[960]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 00:52:33.173324 ignition[960]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 00:52:33.174284 ignition[960]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 00:52:33.175043 ignition[960]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 00:52:33.178774 ignition[960]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 00:52:33.181060 ignition[960]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 00:52:33.181060 ignition[960]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 00:52:33.181060 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 00:52:33.181060 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 7 00:52:33.179230 unknown[960]: wrote ssh authorized keys file for user: core
Mar 7 00:52:33.279758 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 00:52:33.367707 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 00:52:33.376926 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 00:52:33.376926 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 00:52:33.376926 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 7 00:52:33.376926 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 7 00:52:33.376926 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 7 00:52:33.376926 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1
Mar 7 00:52:33.600223 systemd-networkd[783]: eth0: Gained IPv6LL
Mar 7 00:52:33.647808 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 00:52:33.874463 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 7 00:52:33.874463 ignition[960]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 00:52:33.877264 ignition[960]: INFO : files: files passed
Mar 7 00:52:33.877264 ignition[960]: INFO : Ignition finished successfully
Mar 7 00:52:33.878477 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 00:52:33.886889 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 00:52:33.890907 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 00:52:33.895925 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 00:52:33.896060 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 00:52:33.912031 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:52:33.912031 initrd-setup-root-after-ignition[990]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:52:33.915031 initrd-setup-root-after-ignition[994]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 00:52:33.917906 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 00:52:33.919561 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 00:52:33.925946 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 00:52:33.955980 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 00:52:33.956176 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 00:52:33.958404 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 00:52:33.959762 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 00:52:33.960389 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 00:52:33.965979 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 00:52:33.980102 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 00:52:33.987982 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 00:52:33.999576 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 00:52:34.001154 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:52:34.003827 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 00:52:34.005303 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 00:52:34.005487 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 00:52:34.007408 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 00:52:34.008498 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 00:52:34.009369 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 00:52:34.010327 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 00:52:34.011387 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 00:52:34.012395 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 00:52:34.013366 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 00:52:34.014573 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 00:52:34.015668 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 00:52:34.016644 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 00:52:34.017410 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 00:52:34.017602 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 00:52:34.018883 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 00:52:34.019955 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:52:34.020905 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 00:52:34.022749 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:52:34.023470 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 00:52:34.023694 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 00:52:34.025126 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 7 00:52:34.025286 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 00:52:34.026437 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 7 00:52:34.026599 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 7 00:52:34.027591 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 7 00:52:34.027764 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 00:52:34.040770 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 7 00:52:34.042275 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 00:52:34.042525 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 00:52:34.047095 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 7 00:52:34.047570 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 7 00:52:34.047719 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 00:52:34.048658 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 7 00:52:34.050892 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 00:52:34.062957 ignition[1014]: INFO : Ignition 2.19.0
Mar 7 00:52:34.062957 ignition[1014]: INFO : Stage: umount
Mar 7 00:52:34.062957 ignition[1014]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 00:52:34.062957 ignition[1014]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 00:52:34.062957 ignition[1014]: INFO : umount: umount passed
Mar 7 00:52:34.062957 ignition[1014]: INFO : Ignition finished successfully
Mar 7 00:52:34.067638 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 7 00:52:34.068339 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 00:52:34.070611 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 00:52:34.070713 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 00:52:34.072299 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 00:52:34.072380 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 00:52:34.073774 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 00:52:34.073852 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 00:52:34.074704 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 00:52:34.074750 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 00:52:34.076326 systemd[1]: Stopped target network.target - Network.
Mar 7 00:52:34.078260 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 00:52:34.078330 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 00:52:34.081484 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 00:52:34.082521 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 00:52:34.089776 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:52:34.095015 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 00:52:34.095492 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 00:52:34.096938 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 00:52:34.096990 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 00:52:34.099227 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 00:52:34.099264 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 00:52:34.100764 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 00:52:34.100839 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 00:52:34.102296 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 00:52:34.102355 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 00:52:34.106267 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 00:52:34.107656 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 00:52:34.110708 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 00:52:34.112064 systemd-networkd[783]: eth1: DHCPv6 lease lost
Mar 7 00:52:34.117792 systemd-networkd[783]: eth0: DHCPv6 lease lost
Mar 7 00:52:34.120916 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 00:52:34.121187 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 00:52:34.126419 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 00:52:34.127266 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 00:52:34.128439 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 00:52:34.128493 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:52:34.134897 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 00:52:34.135380 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 00:52:34.135436 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 00:52:34.136658 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 00:52:34.136910 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 00:52:34.140081 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 00:52:34.140133 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 00:52:34.141845 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 00:52:34.141894 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 00:52:34.146274 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 00:52:34.148268 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 00:52:34.148372 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 00:52:34.156615 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 00:52:34.158051 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 00:52:34.159239 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 00:52:34.159347 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 00:52:34.160442 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 00:52:34.160607 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 00:52:34.162388 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 00:52:34.162446 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:52:34.163852 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 00:52:34.163882 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:52:34.164844 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 00:52:34.164888 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 00:52:34.166270 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 00:52:34.166312 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 00:52:34.167771 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 00:52:34.167819 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 00:52:34.180025 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 00:52:34.182162 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 00:52:34.182273 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 00:52:34.183340 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 00:52:34.183383 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 00:52:34.190166 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 00:52:34.190279 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 00:52:34.191816 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 00:52:34.202851 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 00:52:34.210251 systemd[1]: Switching root.
Mar 7 00:52:34.240762 systemd-journald[237]: Journal stopped
Mar 7 00:52:35.156887 systemd-journald[237]: Received SIGTERM from PID 1 (systemd).
Mar 7 00:52:35.156963 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 00:52:35.156984 kernel: SELinux: policy capability open_perms=1
Mar 7 00:52:35.156994 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 00:52:35.157014 kernel: SELinux: policy capability always_check_network=0
Mar 7 00:52:35.157024 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 00:52:35.157033 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 00:52:35.157043 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 00:52:35.157052 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 00:52:35.157062 kernel: audit: type=1403 audit(1772844754.407:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 00:52:35.157072 systemd[1]: Successfully loaded SELinux policy in 34.385ms.
Mar 7 00:52:35.157092 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 11.005ms.
Mar 7 00:52:35.157103 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 00:52:35.157116 systemd[1]: Detected virtualization kvm.
Mar 7 00:52:35.157127 systemd[1]: Detected architecture arm64.
Mar 7 00:52:35.157137 systemd[1]: Detected first boot.
Mar 7 00:52:35.157148 systemd[1]: Hostname set to .
Mar 7 00:52:35.157158 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 00:52:35.157168 zram_generator::config[1059]: No configuration found.
Mar 7 00:52:35.157183 systemd[1]: Populated /etc with preset unit settings.
Mar 7 00:52:35.157193 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 00:52:35.157205 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 00:52:35.157216 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:52:35.157226 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 00:52:35.157237 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 00:52:35.157247 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 00:52:35.157258 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 00:52:35.157269 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 00:52:35.157279 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 00:52:35.157291 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 00:52:35.157302 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 00:52:35.157312 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 00:52:35.157323 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 00:52:35.157333 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 00:52:35.157343 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 00:52:35.157354 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 00:52:35.157364 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 00:52:35.157375 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 7 00:52:35.157387 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 00:52:35.157397 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 00:52:35.157408 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 7 00:52:35.157418 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 7 00:52:35.157429 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 7 00:52:35.157440 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 00:52:35.157451 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 00:52:35.157462 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 00:52:35.157472 systemd[1]: Reached target swap.target - Swaps.
Mar 7 00:52:35.157483 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 7 00:52:35.157493 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 7 00:52:35.157504 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 00:52:35.157514 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 00:52:35.157525 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 00:52:35.157574 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 7 00:52:35.157589 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 7 00:52:35.157602 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 7 00:52:35.157613 systemd[1]: Mounting media.mount - External Media Directory...
Mar 7 00:52:35.157624 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 7 00:52:35.157634 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 7 00:52:35.157644 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 7 00:52:35.157655 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 7 00:52:35.157666 systemd[1]: Reached target machines.target - Containers.
Mar 7 00:52:35.158709 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 7 00:52:35.158740 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 7 00:52:35.158752 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 00:52:35.158763 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 7 00:52:35.158774 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 7 00:52:35.158784 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 7 00:52:35.158795 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 7 00:52:35.158810 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 7 00:52:35.158824 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 7 00:52:35.158836 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 7 00:52:35.158847 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 7 00:52:35.158874 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 7 00:52:35.158887 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 7 00:52:35.158898 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 7 00:52:35.158909 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 00:52:35.158922 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 00:52:35.158933 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 7 00:52:35.158946 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 7 00:52:35.158956 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 7 00:52:35.158967 systemd[1]: verity-setup.service: Deactivated successfully. Mar 7 00:52:35.158978 systemd[1]: Stopped verity-setup.service. Mar 7 00:52:35.158989 kernel: ACPI: bus type drm_connector registered Mar 7 00:52:35.158999 kernel: fuse: init (API version 7.39) Mar 7 00:52:35.159009 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 7 00:52:35.159021 kernel: loop: module loaded Mar 7 00:52:35.159032 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 7 00:52:35.159043 systemd[1]: Mounted media.mount - External Media Directory. Mar 7 00:52:35.159054 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 7 00:52:35.159064 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 7 00:52:35.159075 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 7 00:52:35.159087 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 00:52:35.159098 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 7 00:52:35.159108 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 7 00:52:35.159119 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 00:52:35.159129 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 00:52:35.159154 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 00:52:35.159166 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 00:52:35.159177 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 00:52:35.159189 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 00:52:35.159202 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 7 00:52:35.159243 systemd-journald[1129]: Collecting audit messages is disabled. 
Mar 7 00:52:35.159272 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 7 00:52:35.159285 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 00:52:35.159296 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 00:52:35.159307 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 7 00:52:35.159318 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 7 00:52:35.159330 systemd-journald[1129]: Journal started Mar 7 00:52:35.159352 systemd-journald[1129]: Runtime Journal (/run/log/journal/766e6fdcf12b4089a377569d8a02740c) is 8.0M, max 76.6M, 68.6M free. Mar 7 00:52:34.894587 systemd[1]: Queued start job for default target multi-user.target. Mar 7 00:52:34.913489 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 7 00:52:34.914249 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 7 00:52:35.161763 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 7 00:52:35.162772 systemd[1]: Started systemd-journald.service - Journal Service. Mar 7 00:52:35.176481 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 7 00:52:35.187815 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 7 00:52:35.192853 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 7 00:52:35.195795 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 7 00:52:35.195843 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 7 00:52:35.198518 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Mar 7 00:52:35.201485 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
Mar 7 00:52:35.204805 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 7 00:52:35.205419 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 00:52:35.210042 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 7 00:52:35.215930 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 7 00:52:35.216603 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 00:52:35.218831 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 7 00:52:35.219446 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 7 00:52:35.222987 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 7 00:52:35.226892 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 7 00:52:35.229614 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 7 00:52:35.231027 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 7 00:52:35.233271 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 7 00:52:35.240353 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 7 00:52:35.243408 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 7 00:52:35.254785 systemd-journald[1129]: Time spent on flushing to /var/log/journal/766e6fdcf12b4089a377569d8a02740c is 72.954ms for 1121 entries. Mar 7 00:52:35.254785 systemd-journald[1129]: System Journal (/var/log/journal/766e6fdcf12b4089a377569d8a02740c) is 8.0M, max 584.8M, 576.8M free. Mar 7 00:52:35.359516 systemd-journald[1129]: Received client request to flush runtime journal. 
Mar 7 00:52:35.360103 kernel: loop0: detected capacity change from 0 to 114328 Mar 7 00:52:35.360150 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 7 00:52:35.360186 kernel: loop1: detected capacity change from 0 to 200864 Mar 7 00:52:35.267471 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 7 00:52:35.268890 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 7 00:52:35.277641 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Mar 7 00:52:35.289388 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 00:52:35.293463 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 7 00:52:35.333255 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 7 00:52:35.345187 udevadm[1184]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 7 00:52:35.364212 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 7 00:52:35.368201 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 7 00:52:35.371297 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Mar 7 00:52:35.373140 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 7 00:52:35.382665 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 7 00:52:35.399790 kernel: loop2: detected capacity change from 0 to 8 Mar 7 00:52:35.406376 systemd-tmpfiles[1194]: ACLs are not supported, ignoring. Mar 7 00:52:35.406395 systemd-tmpfiles[1194]: ACLs are not supported, ignoring. Mar 7 00:52:35.411605 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 7 00:52:35.419737 kernel: loop3: detected capacity change from 0 to 114432 Mar 7 00:52:35.457757 kernel: loop4: detected capacity change from 0 to 114328 Mar 7 00:52:35.469399 kernel: loop5: detected capacity change from 0 to 200864 Mar 7 00:52:35.488992 kernel: loop6: detected capacity change from 0 to 8 Mar 7 00:52:35.490819 kernel: loop7: detected capacity change from 0 to 114432 Mar 7 00:52:35.513556 (sd-merge)[1200]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Mar 7 00:52:35.514398 (sd-merge)[1200]: Merged extensions into '/usr'. Mar 7 00:52:35.520806 systemd[1]: Reloading requested from client PID 1172 ('systemd-sysext') (unit systemd-sysext.service)... Mar 7 00:52:35.520823 systemd[1]: Reloading... Mar 7 00:52:35.638710 zram_generator::config[1226]: No configuration found. Mar 7 00:52:35.757141 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:52:35.803768 ldconfig[1167]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 7 00:52:35.808646 systemd[1]: Reloading finished in 287 ms. Mar 7 00:52:35.832658 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 7 00:52:35.840725 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 7 00:52:35.848427 systemd[1]: Starting ensure-sysext.service... Mar 7 00:52:35.851887 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 7 00:52:35.860928 systemd[1]: Reloading requested from client PID 1263 ('systemctl') (unit ensure-sysext.service)... Mar 7 00:52:35.860950 systemd[1]: Reloading... Mar 7 00:52:35.894575 systemd-tmpfiles[1264]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Mar 7 00:52:35.895200 systemd-tmpfiles[1264]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 7 00:52:35.896065 systemd-tmpfiles[1264]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 7 00:52:35.896364 systemd-tmpfiles[1264]: ACLs are not supported, ignoring. Mar 7 00:52:35.896478 systemd-tmpfiles[1264]: ACLs are not supported, ignoring. Mar 7 00:52:35.904288 systemd-tmpfiles[1264]: Detected autofs mount point /boot during canonicalization of boot. Mar 7 00:52:35.905870 systemd-tmpfiles[1264]: Skipping /boot Mar 7 00:52:35.927012 systemd-tmpfiles[1264]: Detected autofs mount point /boot during canonicalization of boot. Mar 7 00:52:35.927024 systemd-tmpfiles[1264]: Skipping /boot Mar 7 00:52:35.969713 zram_generator::config[1290]: No configuration found. Mar 7 00:52:36.078109 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:52:36.124810 systemd[1]: Reloading finished in 263 ms. Mar 7 00:52:36.141728 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 7 00:52:36.143873 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 7 00:52:36.160937 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 7 00:52:36.176070 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 7 00:52:36.183909 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 7 00:52:36.193997 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 00:52:36.198890 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Mar 7 00:52:36.203430 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 7 00:52:36.215398 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 7 00:52:36.218550 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 00:52:36.222962 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 00:52:36.226941 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 00:52:36.229756 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 00:52:36.230355 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 00:52:36.233996 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 00:52:36.234146 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 00:52:36.237059 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 00:52:36.239711 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 7 00:52:36.241336 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 00:52:36.244744 systemd[1]: Finished ensure-sysext.service. Mar 7 00:52:36.254055 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 7 00:52:36.256144 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 00:52:36.256354 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Mar 7 00:52:36.258675 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 00:52:36.275141 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 7 00:52:36.277497 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 7 00:52:36.279358 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 00:52:36.280340 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 00:52:36.281637 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 00:52:36.281790 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 00:52:36.285591 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 7 00:52:36.285635 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 7 00:52:36.296277 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 00:52:36.297726 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 00:52:36.298975 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 7 00:52:36.308895 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 7 00:52:36.321098 systemd-udevd[1337]: Using default interface naming scheme 'v255'. Mar 7 00:52:36.325010 augenrules[1365]: No rules Mar 7 00:52:36.328132 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 7 00:52:36.329757 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Mar 7 00:52:36.331661 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Mar 7 00:52:36.354412 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 7 00:52:36.369963 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 7 00:52:36.423373 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 7 00:52:36.424177 systemd[1]: Reached target time-set.target - System Time Set. Mar 7 00:52:36.450308 systemd-resolved[1334]: Positive Trust Anchors: Mar 7 00:52:36.450328 systemd-resolved[1334]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 7 00:52:36.450361 systemd-resolved[1334]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 00:52:36.460281 systemd-resolved[1334]: Using system hostname 'ci-4081-3-6-n-f47b87f6f2'. Mar 7 00:52:36.467125 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 7 00:52:36.490277 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 00:52:36.493355 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 00:52:36.505440 systemd-networkd[1382]: lo: Link UP Mar 7 00:52:36.505870 systemd-networkd[1382]: lo: Gained carrier Mar 7 00:52:36.508031 systemd-networkd[1382]: Enumeration completed Mar 7 00:52:36.508496 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 7 00:52:36.509412 systemd[1]: Reached target network.target - Network. 
Mar 7 00:52:36.511010 systemd-networkd[1382]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:36.511017 systemd-networkd[1382]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 00:52:36.512471 systemd-networkd[1382]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:36.512587 systemd-networkd[1382]: eth0: Link UP Mar 7 00:52:36.512643 systemd-networkd[1382]: eth0: Gained carrier Mar 7 00:52:36.512719 systemd-networkd[1382]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:36.518145 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 7 00:52:36.555969 systemd-networkd[1382]: eth0: DHCPv4 address 116.202.17.139/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 7 00:52:36.556925 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Mar 7 00:52:36.562512 systemd-networkd[1382]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:36.562536 systemd-networkd[1382]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 00:52:36.563089 systemd-networkd[1382]: eth1: Link UP Mar 7 00:52:36.563096 systemd-networkd[1382]: eth1: Gained carrier Mar 7 00:52:36.563111 systemd-networkd[1382]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 00:52:36.563333 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Mar 7 00:52:36.574713 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. 
Mar 7 00:52:36.599740 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1395) Mar 7 00:52:36.599836 kernel: mousedev: PS/2 mouse device common for all mice Mar 7 00:52:36.627951 systemd-networkd[1382]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Mar 7 00:52:36.628247 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Mar 7 00:52:36.628628 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Mar 7 00:52:36.638003 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Mar 7 00:52:36.638134 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 00:52:36.647048 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 00:52:36.651604 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 00:52:36.654930 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 00:52:36.657567 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 00:52:36.657613 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 7 00:52:36.660144 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 00:52:36.660344 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 00:52:36.679029 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 7 00:52:36.695061 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Mar 7 00:52:36.696393 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 00:52:36.698738 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 00:52:36.700596 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 00:52:36.703388 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 00:52:36.703582 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 00:52:36.704907 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 7 00:52:36.719651 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 7 00:52:36.750732 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Mar 7 00:52:36.750799 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 7 00:52:36.750835 kernel: [drm] features: -context_init Mar 7 00:52:36.751786 kernel: [drm] number of scanouts: 1 Mar 7 00:52:36.751840 kernel: [drm] number of cap sets: 0 Mar 7 00:52:36.756702 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Mar 7 00:52:36.764418 kernel: Console: switching to colour frame buffer device 160x50 Mar 7 00:52:36.763117 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:52:36.767735 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 7 00:52:36.779588 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 00:52:36.779858 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 00:52:36.786932 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 00:52:36.853235 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 7 00:52:36.907210 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 7 00:52:36.916055 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 7 00:52:36.928703 lvm[1442]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 00:52:36.955429 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 7 00:52:36.957731 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 7 00:52:36.958318 systemd[1]: Reached target sysinit.target - System Initialization. Mar 7 00:52:36.959023 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 7 00:52:36.959825 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 7 00:52:36.960791 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 7 00:52:36.961411 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 7 00:52:36.962145 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 7 00:52:36.962797 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 7 00:52:36.962822 systemd[1]: Reached target paths.target - Path Units. Mar 7 00:52:36.963278 systemd[1]: Reached target timers.target - Timer Units. Mar 7 00:52:36.965040 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 7 00:52:36.967078 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 7 00:52:36.976110 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 7 00:52:36.979701 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... 
Mar 7 00:52:36.981816 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 7 00:52:36.983204 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 00:52:36.983825 systemd[1]: Reached target basic.target - Basic System. Mar 7 00:52:36.984342 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 7 00:52:36.984377 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 00:52:36.993370 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 00:52:36.995566 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 7 00:52:36.997896 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 00:52:37.004725 lvm[1446]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 00:52:37.008873 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 7 00:52:37.012261 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 00:52:37.014873 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 00:52:37.019016 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 7 00:52:37.021080 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 00:52:37.023948 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 7 00:52:37.026848 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 00:52:37.028855 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 7 00:52:37.032153 jq[1450]: false Mar 7 00:52:37.039315 systemd[1]: Starting systemd-logind.service - User Login Management... 
Mar 7 00:52:37.041041 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 7 00:52:37.042348 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 7 00:52:37.046504 systemd[1]: Starting update-engine.service - Update Engine... Mar 7 00:52:37.050834 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 7 00:52:37.054179 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 7 00:52:37.054370 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 7 00:52:37.086841 coreos-metadata[1448]: Mar 07 00:52:37.084 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 7 00:52:37.098799 coreos-metadata[1448]: Mar 07 00:52:37.091 INFO Fetch successful Mar 7 00:52:37.098799 coreos-metadata[1448]: Mar 07 00:52:37.091 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Mar 7 00:52:37.098799 coreos-metadata[1448]: Mar 07 00:52:37.093 INFO Fetch successful Mar 7 00:52:37.101322 jq[1461]: true Mar 7 00:52:37.110471 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 7 00:52:37.112171 systemd[1]: motdgen.service: Deactivated successfully. Mar 7 00:52:37.112315 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 7 00:52:37.115700 (ntainerd)[1477]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 7 00:52:37.121807 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 7 00:52:37.122838 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Mar 7 00:52:37.130916 dbus-daemon[1449]: [system] SELinux support is enabled Mar 7 00:52:37.131084 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 7 00:52:37.134901 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 7 00:52:37.134924 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 7 00:52:37.137080 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 7 00:52:37.137105 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 7 00:52:37.145723 extend-filesystems[1451]: Found loop4 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found loop5 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found loop6 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found loop7 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda1 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda2 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda3 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found usr Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda4 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda6 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda7 Mar 7 00:52:37.159279 extend-filesystems[1451]: Found sda9 Mar 7 00:52:37.159279 extend-filesystems[1451]: Checking size of /dev/sda9 Mar 7 00:52:37.207814 tar[1466]: linux-arm64/LICENSE Mar 7 00:52:37.207814 tar[1466]: linux-arm64/helm Mar 7 00:52:37.202502 systemd[1]: Started update-engine.service - Update Engine. 
Mar 7 00:52:37.208208 update_engine[1460]: I20260307 00:52:37.179463 1460 main.cc:92] Flatcar Update Engine starting Mar 7 00:52:37.208343 extend-filesystems[1451]: Resized partition /dev/sda9 Mar 7 00:52:37.215018 update_engine[1460]: I20260307 00:52:37.209285 1460 update_check_scheduler.cc:74] Next update check in 3m38s Mar 7 00:52:37.209049 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 7 00:52:37.215132 extend-filesystems[1495]: resize2fs 1.47.1 (20-May-2024) Mar 7 00:52:37.218914 jq[1484]: true Mar 7 00:52:37.224703 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Mar 7 00:52:37.255223 systemd-logind[1459]: New seat seat0. Mar 7 00:52:37.259544 systemd-logind[1459]: Watching system buttons on /dev/input/event0 (Power Button) Mar 7 00:52:37.259574 systemd-logind[1459]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Mar 7 00:52:37.259886 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 7 00:52:37.262085 systemd[1]: Started systemd-logind.service - User Login Management. Mar 7 00:52:37.265482 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 7 00:52:37.318638 bash[1517]: Updated "/home/core/.ssh/authorized_keys" Mar 7 00:52:37.324309 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 7 00:52:37.342361 systemd[1]: Starting sshkeys.service... Mar 7 00:52:37.351715 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1388) Mar 7 00:52:37.356167 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 7 00:52:37.365194 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Mar 7 00:52:37.395876 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Mar 7 00:52:37.396773 coreos-metadata[1528]: Mar 07 00:52:37.396 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 7 00:52:37.398487 coreos-metadata[1528]: Mar 07 00:52:37.398 INFO Fetch successful
Mar 7 00:52:37.417106 extend-filesystems[1495]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 7 00:52:37.417106 extend-filesystems[1495]: old_desc_blocks = 1, new_desc_blocks = 5
Mar 7 00:52:37.417106 extend-filesystems[1495]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Mar 7 00:52:37.428536 extend-filesystems[1451]: Resized filesystem in /dev/sda9
Mar 7 00:52:37.428536 extend-filesystems[1451]: Found sr0
Mar 7 00:52:37.418192 unknown[1528]: wrote ssh authorized keys file for user: core
Mar 7 00:52:37.423031 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 7 00:52:37.423229 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 7 00:52:37.478752 update-ssh-keys[1535]: Updated "/home/core/.ssh/authorized_keys"
Mar 7 00:52:37.481822 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 7 00:52:37.490900 systemd[1]: Finished sshkeys.service.
Mar 7 00:52:37.494886 locksmithd[1496]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 7 00:52:37.571955 containerd[1477]: time="2026-03-07T00:52:37.570043720Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Mar 7 00:52:37.601485 containerd[1477]: time="2026-03-07T00:52:37.601220560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.603335 containerd[1477]: time="2026-03-07T00:52:37.603279880Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 7 00:52:37.603335 containerd[1477]: time="2026-03-07T00:52:37.603329680Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 7 00:52:37.603443 containerd[1477]: time="2026-03-07T00:52:37.603349240Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603561160Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603591400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603664560Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603698720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603884480Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603905400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603920040Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.603930280Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.604001560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.604201480Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 7 00:52:37.604711 containerd[1477]: time="2026-03-07T00:52:37.604301880Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 00:52:37.605023 containerd[1477]: time="2026-03-07T00:52:37.604316840Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 7 00:52:37.605023 containerd[1477]: time="2026-03-07T00:52:37.604394960Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 7 00:52:37.605023 containerd[1477]: time="2026-03-07T00:52:37.604437000Z" level=info msg="metadata content store policy set" policy=shared
Mar 7 00:52:37.610495 containerd[1477]: time="2026-03-07T00:52:37.610280640Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 7 00:52:37.610495 containerd[1477]: time="2026-03-07T00:52:37.610354720Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 7 00:52:37.610495 containerd[1477]: time="2026-03-07T00:52:37.610441800Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 7 00:52:37.610495 containerd[1477]: time="2026-03-07T00:52:37.610465440Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.610835800Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611050160Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611343520Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611461960Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611481680Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611499720Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611544600Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611568800Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611586520Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611605040Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611624720Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611648440Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611666840Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.611979 containerd[1477]: time="2026-03-07T00:52:37.611723480Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611753280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611771880Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611796240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611814880Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611834640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611851520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611868160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611886840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611904440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611924440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.612630 containerd[1477]: time="2026-03-07T00:52:37.611940560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.613177 containerd[1477]: time="2026-03-07T00:52:37.611957560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.613177 containerd[1477]: time="2026-03-07T00:52:37.612972440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.613177 containerd[1477]: time="2026-03-07T00:52:37.612992920Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 7 00:52:37.613177 containerd[1477]: time="2026-03-07T00:52:37.613020480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.613177 containerd[1477]: time="2026-03-07T00:52:37.613042080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.613177 containerd[1477]: time="2026-03-07T00:52:37.613055520Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 7 00:52:37.613358 containerd[1477]: time="2026-03-07T00:52:37.613341520Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 7 00:52:37.613604 containerd[1477]: time="2026-03-07T00:52:37.613583480Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 7 00:52:37.613692 containerd[1477]: time="2026-03-07T00:52:37.613662160Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 7 00:52:37.613750 containerd[1477]: time="2026-03-07T00:52:37.613735560Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 7 00:52:37.613800 containerd[1477]: time="2026-03-07T00:52:37.613787480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.614713 containerd[1477]: time="2026-03-07T00:52:37.613839960Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 7 00:52:37.614713 containerd[1477]: time="2026-03-07T00:52:37.613855240Z" level=info msg="NRI interface is disabled by configuration."
Mar 7 00:52:37.614713 containerd[1477]: time="2026-03-07T00:52:37.613868640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 7 00:52:37.614834 containerd[1477]: time="2026-03-07T00:52:37.614245080Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 7 00:52:37.614834 containerd[1477]: time="2026-03-07T00:52:37.614308280Z" level=info msg="Connect containerd service"
Mar 7 00:52:37.614834 containerd[1477]: time="2026-03-07T00:52:37.614339680Z" level=info msg="using legacy CRI server"
Mar 7 00:52:37.614834 containerd[1477]: time="2026-03-07T00:52:37.614348800Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 7 00:52:37.614834 containerd[1477]: time="2026-03-07T00:52:37.614467000Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615298120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615498800Z" level=info msg="Start subscribing containerd event"
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615561520Z" level=info msg="Start recovering state"
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615628280Z" level=info msg="Start event monitor"
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615639320Z" level=info msg="Start snapshots syncer"
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615648800Z" level=info msg="Start cni network conf syncer for default"
Mar 7 00:52:37.615869 containerd[1477]: time="2026-03-07T00:52:37.615657880Z" level=info msg="Start streaming server"
Mar 7 00:52:37.617729 containerd[1477]: time="2026-03-07T00:52:37.616511200Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 7 00:52:37.617729 containerd[1477]: time="2026-03-07T00:52:37.616589840Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 7 00:52:37.619727 containerd[1477]: time="2026-03-07T00:52:37.618773600Z" level=info msg="containerd successfully booted in 0.050322s"
Mar 7 00:52:37.618885 systemd[1]: Started containerd.service - containerd container runtime.
Mar 7 00:52:37.698790 systemd-networkd[1382]: eth0: Gained IPv6LL
Mar 7 00:52:37.699365 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection.
Mar 7 00:52:37.707416 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 7 00:52:37.709404 systemd[1]: Reached target network-online.target - Network is Online.
Mar 7 00:52:37.719007 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:52:37.722826 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 7 00:52:37.775382 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 7 00:52:37.974939 tar[1466]: linux-arm64/README.md
Mar 7 00:52:37.991741 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 7 00:52:38.272750 systemd-networkd[1382]: eth1: Gained IPv6LL
Mar 7 00:52:38.273213 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection.
Mar 7 00:52:38.573868 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:52:38.579434 (kubelet)[1562]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:52:38.925025 sshd_keygen[1483]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 7 00:52:38.948165 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 7 00:52:38.955600 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 7 00:52:38.963657 systemd[1]: issuegen.service: Deactivated successfully.
Mar 7 00:52:38.963901 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 7 00:52:38.973692 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 7 00:52:38.985725 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 7 00:52:38.998300 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 7 00:52:39.003117 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Mar 7 00:52:39.007075 systemd[1]: Reached target getty.target - Login Prompts.
Mar 7 00:52:39.009936 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 7 00:52:39.010881 systemd[1]: Startup finished in 756ms (kernel) + 4.719s (initrd) + 4.637s (userspace) = 10.114s.
Mar 7 00:52:39.064055 kubelet[1562]: E0307 00:52:39.063970 1562 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:52:39.068642 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:52:39.068906 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:52:49.095912 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 7 00:52:49.109033 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:52:49.226671 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:52:49.230635 (kubelet)[1597]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:52:49.284547 kubelet[1597]: E0307 00:52:49.284478 1597 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:52:49.288229 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:52:49.288587 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:52:59.346036 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 7 00:52:59.353008 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:52:59.470747 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:52:59.476155 (kubelet)[1613]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:52:59.515519 kubelet[1613]: E0307 00:52:59.515471 1613 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:52:59.518428 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:52:59.518570 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:53:08.466535 systemd-timesyncd[1348]: Contacted time server 144.76.66.156:123 (2.flatcar.pool.ntp.org).
Mar 7 00:53:08.466625 systemd-timesyncd[1348]: Initial clock synchronization to Sat 2026-03-07 00:53:08.861928 UTC.
Mar 7 00:53:09.600366 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 7 00:53:09.617102 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 00:53:09.732668 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 00:53:09.744318 (kubelet)[1628]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 00:53:09.787479 kubelet[1628]: E0307 00:53:09.787419 1628 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 00:53:09.790437 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 00:53:09.790585 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 00:53:12.637773 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 7 00:53:12.642981 systemd[1]: Started sshd@0-116.202.17.139:22-20.161.92.111:37060.service - OpenSSH per-connection server daemon (20.161.92.111:37060).
Mar 7 00:53:13.255631 sshd[1637]: Accepted publickey for core from 20.161.92.111 port 37060 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI
Mar 7 00:53:13.258366 sshd[1637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:53:13.268259 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 7 00:53:13.286932 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 7 00:53:13.290938 systemd-logind[1459]: New session 1 of user core.
Mar 7 00:53:13.298947 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 7 00:53:13.311178 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 7 00:53:13.316020 (systemd)[1641]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 7 00:53:13.424471 systemd[1641]: Queued start job for default target default.target.
Mar 7 00:53:13.433406 systemd[1641]: Created slice app.slice - User Application Slice.
Mar 7 00:53:13.433532 systemd[1641]: Reached target paths.target - Paths.
Mar 7 00:53:13.433566 systemd[1641]: Reached target timers.target - Timers.
Mar 7 00:53:13.436278 systemd[1641]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 7 00:53:13.451061 systemd[1641]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 7 00:53:13.451179 systemd[1641]: Reached target sockets.target - Sockets.
Mar 7 00:53:13.451192 systemd[1641]: Reached target basic.target - Basic System.
Mar 7 00:53:13.451233 systemd[1641]: Reached target default.target - Main User Target.
Mar 7 00:53:13.451260 systemd[1641]: Startup finished in 128ms.
Mar 7 00:53:13.451671 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 7 00:53:13.460448 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 7 00:53:13.907187 systemd[1]: Started sshd@1-116.202.17.139:22-20.161.92.111:37068.service - OpenSSH per-connection server daemon (20.161.92.111:37068).
Mar 7 00:53:14.518050 sshd[1652]: Accepted publickey for core from 20.161.92.111 port 37068 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI
Mar 7 00:53:14.520734 sshd[1652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:53:14.525271 systemd-logind[1459]: New session 2 of user core.
Mar 7 00:53:14.529951 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 7 00:53:14.949261 sshd[1652]: pam_unix(sshd:session): session closed for user core
Mar 7 00:53:14.953435 systemd[1]: sshd@1-116.202.17.139:22-20.161.92.111:37068.service: Deactivated successfully.
Mar 7 00:53:14.955184 systemd[1]: session-2.scope: Deactivated successfully.
Mar 7 00:53:14.957325 systemd-logind[1459]: Session 2 logged out. Waiting for processes to exit.
Mar 7 00:53:14.958476 systemd-logind[1459]: Removed session 2.
Mar 7 00:53:15.055973 systemd[1]: Started sshd@2-116.202.17.139:22-20.161.92.111:37084.service - OpenSSH per-connection server daemon (20.161.92.111:37084).
Mar 7 00:53:15.664750 sshd[1659]: Accepted publickey for core from 20.161.92.111 port 37084 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI
Mar 7 00:53:15.665976 sshd[1659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:53:15.673027 systemd-logind[1459]: New session 3 of user core.
Mar 7 00:53:15.680168 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 7 00:53:16.087058 sshd[1659]: pam_unix(sshd:session): session closed for user core
Mar 7 00:53:16.094642 systemd[1]: sshd@2-116.202.17.139:22-20.161.92.111:37084.service: Deactivated successfully.
Mar 7 00:53:16.096499 systemd[1]: session-3.scope: Deactivated successfully.
Mar 7 00:53:16.097352 systemd-logind[1459]: Session 3 logged out. Waiting for processes to exit.
Mar 7 00:53:16.098746 systemd-logind[1459]: Removed session 3.
Mar 7 00:53:16.192539 systemd[1]: Started sshd@3-116.202.17.139:22-20.161.92.111:37096.service - OpenSSH per-connection server daemon (20.161.92.111:37096).
Mar 7 00:53:16.800063 sshd[1666]: Accepted publickey for core from 20.161.92.111 port 37096 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI
Mar 7 00:53:16.802353 sshd[1666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:53:16.808217 systemd-logind[1459]: New session 4 of user core.
Mar 7 00:53:16.812025 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 7 00:53:17.226894 sshd[1666]: pam_unix(sshd:session): session closed for user core
Mar 7 00:53:17.233771 systemd-logind[1459]: Session 4 logged out. Waiting for processes to exit.
Mar 7 00:53:17.234034 systemd[1]: sshd@3-116.202.17.139:22-20.161.92.111:37096.service: Deactivated successfully.
Mar 7 00:53:17.235577 systemd[1]: session-4.scope: Deactivated successfully.
Mar 7 00:53:17.237298 systemd-logind[1459]: Removed session 4.
Mar 7 00:53:17.338180 systemd[1]: Started sshd@4-116.202.17.139:22-20.161.92.111:37108.service - OpenSSH per-connection server daemon (20.161.92.111:37108).
Mar 7 00:53:17.933807 sshd[1673]: Accepted publickey for core from 20.161.92.111 port 37108 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI
Mar 7 00:53:17.936193 sshd[1673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:53:17.941891 systemd-logind[1459]: New session 5 of user core.
Mar 7 00:53:17.949021 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 7 00:53:18.272635 sudo[1676]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 7 00:53:18.273020 sudo[1676]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:53:18.294927 sudo[1676]: pam_unix(sudo:session): session closed for user root
Mar 7 00:53:18.390832 sshd[1673]: pam_unix(sshd:session): session closed for user core
Mar 7 00:53:18.397027 systemd[1]: sshd@4-116.202.17.139:22-20.161.92.111:37108.service: Deactivated successfully.
Mar 7 00:53:18.398719 systemd[1]: session-5.scope: Deactivated successfully.
Mar 7 00:53:18.399561 systemd-logind[1459]: Session 5 logged out. Waiting for processes to exit.
Mar 7 00:53:18.401139 systemd-logind[1459]: Removed session 5.
Mar 7 00:53:18.501073 systemd[1]: Started sshd@5-116.202.17.139:22-20.161.92.111:37122.service - OpenSSH per-connection server daemon (20.161.92.111:37122).
Mar 7 00:53:19.099552 sshd[1681]: Accepted publickey for core from 20.161.92.111 port 37122 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI
Mar 7 00:53:19.102128 sshd[1681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 00:53:19.107308 systemd-logind[1459]: New session 6 of user core.
Mar 7 00:53:19.118086 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 7 00:53:19.430349 sudo[1685]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 7 00:53:19.430672 sudo[1685]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:53:19.434749 sudo[1685]: pam_unix(sudo:session): session closed for user root
Mar 7 00:53:19.440932 sudo[1684]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 7 00:53:19.441362 sudo[1684]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 00:53:19.457807 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 7 00:53:19.461026 auditctl[1688]: No rules
Mar 7 00:53:19.461710 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 7 00:53:19.461912 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 7 00:53:19.465996 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 00:53:19.514494 augenrules[1706]: No rules
Mar 7 00:53:19.516102 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 00:53:19.518207 sudo[1684]: pam_unix(sudo:session): session closed for user root
Mar 7 00:53:19.613444 sshd[1681]: pam_unix(sshd:session): session closed for user core
Mar 7 00:53:19.619975 systemd[1]: sshd@5-116.202.17.139:22-20.161.92.111:37122.service: Deactivated successfully.
Mar 7 00:53:19.622231 systemd[1]: session-6.scope: Deactivated successfully.
Mar 7 00:53:19.623037 systemd-logind[1459]: Session 6 logged out. Waiting for processes to exit. Mar 7 00:53:19.624300 systemd-logind[1459]: Removed session 6. Mar 7 00:53:19.730322 systemd[1]: Started sshd@6-116.202.17.139:22-20.161.92.111:37134.service - OpenSSH per-connection server daemon (20.161.92.111:37134). Mar 7 00:53:19.846044 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 7 00:53:19.860080 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:19.974375 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:19.985774 (kubelet)[1724]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 00:53:20.033388 kubelet[1724]: E0307 00:53:20.033278 1724 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 00:53:20.036515 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 00:53:20.036801 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 00:53:20.319900 sshd[1714]: Accepted publickey for core from 20.161.92.111 port 37134 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:53:20.322485 sshd[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:53:20.328404 systemd-logind[1459]: New session 7 of user core. Mar 7 00:53:20.337965 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 7 00:53:20.649436 sudo[1732]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 7 00:53:20.649837 sudo[1732]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 00:53:20.962048 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 7 00:53:20.962143 (dockerd)[1747]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 7 00:53:21.201458 dockerd[1747]: time="2026-03-07T00:53:21.200885743Z" level=info msg="Starting up" Mar 7 00:53:21.288517 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2091998366-merged.mount: Deactivated successfully. Mar 7 00:53:21.308884 dockerd[1747]: time="2026-03-07T00:53:21.308563605Z" level=info msg="Loading containers: start." Mar 7 00:53:21.410732 kernel: Initializing XFRM netlink socket Mar 7 00:53:21.496436 systemd-networkd[1382]: docker0: Link UP Mar 7 00:53:21.509913 dockerd[1747]: time="2026-03-07T00:53:21.509850071Z" level=info msg="Loading containers: done." Mar 7 00:53:21.526485 dockerd[1747]: time="2026-03-07T00:53:21.526407546Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 7 00:53:21.526755 dockerd[1747]: time="2026-03-07T00:53:21.526529428Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Mar 7 00:53:21.526755 dockerd[1747]: time="2026-03-07T00:53:21.526634706Z" level=info msg="Daemon has completed initialization" Mar 7 00:53:21.571867 dockerd[1747]: time="2026-03-07T00:53:21.571605245Z" level=info msg="API listen on /run/docker.sock" Mar 7 00:53:21.572447 systemd[1]: Started docker.service - Docker Application Container Engine. 
Mar 7 00:53:22.079329 containerd[1477]: time="2026-03-07T00:53:22.079245092Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\"" Mar 7 00:53:22.750306 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2771389163.mount: Deactivated successfully. Mar 7 00:53:22.899297 update_engine[1460]: I20260307 00:53:22.899242 1460 update_attempter.cc:509] Updating boot flags... Mar 7 00:53:22.964716 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1908) Mar 7 00:53:23.034844 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1908) Mar 7 00:53:23.092830 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1908) Mar 7 00:53:23.953740 containerd[1477]: time="2026-03-07T00:53:23.953667592Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:23.956244 containerd[1477]: time="2026-03-07T00:53:23.956169122Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=24583350" Mar 7 00:53:23.958110 containerd[1477]: time="2026-03-07T00:53:23.957532439Z" level=info msg="ImageCreate event name:\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:23.963773 containerd[1477]: time="2026-03-07T00:53:23.963293273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:23.964540 containerd[1477]: time="2026-03-07T00:53:23.964232904Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\", repo tag 
\"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"24579851\" in 1.88494578s" Mar 7 00:53:23.964540 containerd[1477]: time="2026-03-07T00:53:23.964287477Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\"" Mar 7 00:53:23.965185 containerd[1477]: time="2026-03-07T00:53:23.965145248Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\"" Mar 7 00:53:25.409130 containerd[1477]: time="2026-03-07T00:53:25.409042025Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:25.412107 containerd[1477]: time="2026-03-07T00:53:25.412056239Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=19139661" Mar 7 00:53:25.413872 containerd[1477]: time="2026-03-07T00:53:25.413826192Z" level=info msg="ImageCreate event name:\"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:25.418706 containerd[1477]: time="2026-03-07T00:53:25.418579943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:25.421569 containerd[1477]: time="2026-03-07T00:53:25.421506168Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"20724045\" in 1.455769507s" Mar 7 00:53:25.421569 containerd[1477]: time="2026-03-07T00:53:25.421544510Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\"" Mar 7 00:53:25.422708 containerd[1477]: time="2026-03-07T00:53:25.422453707Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\"" Mar 7 00:53:26.539114 containerd[1477]: time="2026-03-07T00:53:26.539057062Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:26.541159 containerd[1477]: time="2026-03-07T00:53:26.541076284Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=14195564" Mar 7 00:53:26.542126 containerd[1477]: time="2026-03-07T00:53:26.542094580Z" level=info msg="ImageCreate event name:\"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:26.548416 containerd[1477]: time="2026-03-07T00:53:26.548386195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:26.549412 containerd[1477]: time="2026-03-07T00:53:26.549375543Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"15779966\" in 1.126875173s" Mar 7 00:53:26.549478 containerd[1477]: 
time="2026-03-07T00:53:26.549413456Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\"" Mar 7 00:53:26.549904 containerd[1477]: time="2026-03-07T00:53:26.549872951Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\"" Mar 7 00:53:27.537315 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3252609880.mount: Deactivated successfully. Mar 7 00:53:27.747921 containerd[1477]: time="2026-03-07T00:53:27.747867590Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:27.750247 containerd[1477]: time="2026-03-07T00:53:27.750213750Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=22697114" Mar 7 00:53:27.751712 containerd[1477]: time="2026-03-07T00:53:27.751543306Z" level=info msg="ImageCreate event name:\"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:27.755899 containerd[1477]: time="2026-03-07T00:53:27.755837535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:27.757998 containerd[1477]: time="2026-03-07T00:53:27.757219565Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"22696107\" in 1.207298341s" Mar 7 00:53:27.757998 containerd[1477]: time="2026-03-07T00:53:27.757804447Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" 
returns image reference \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\"" Mar 7 00:53:27.759158 containerd[1477]: time="2026-03-07T00:53:27.758822216Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Mar 7 00:53:28.371836 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3168061012.mount: Deactivated successfully. Mar 7 00:53:29.328730 containerd[1477]: time="2026-03-07T00:53:29.326717886Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:29.330104 containerd[1477]: time="2026-03-07T00:53:29.330077304Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498" Mar 7 00:53:29.331875 containerd[1477]: time="2026-03-07T00:53:29.331849387Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:29.337858 containerd[1477]: time="2026-03-07T00:53:29.337812283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:29.340736 containerd[1477]: time="2026-03-07T00:53:29.340698486Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.581793264s" Mar 7 00:53:29.340882 containerd[1477]: time="2026-03-07T00:53:29.340854298Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference 
\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Mar 7 00:53:29.341453 containerd[1477]: time="2026-03-07T00:53:29.341432068Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 7 00:53:29.969838 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2262941007.mount: Deactivated successfully. Mar 7 00:53:29.975984 containerd[1477]: time="2026-03-07T00:53:29.975902730Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:29.977920 containerd[1477]: time="2026-03-07T00:53:29.977862695Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729" Mar 7 00:53:29.979190 containerd[1477]: time="2026-03-07T00:53:29.979142981Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:29.981907 containerd[1477]: time="2026-03-07T00:53:29.981827777Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:29.983367 containerd[1477]: time="2026-03-07T00:53:29.983309952Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 641.674912ms" Mar 7 00:53:29.983367 containerd[1477]: time="2026-03-07T00:53:29.983349327Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Mar 7 00:53:29.984042 containerd[1477]: time="2026-03-07T00:53:29.983973816Z" 
level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\"" Mar 7 00:53:30.095826 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 7 00:53:30.102894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:30.240911 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:30.241701 (kubelet)[2044]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 00:53:30.289509 kubelet[2044]: E0307 00:53:30.289448 2044 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 00:53:30.292004 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 00:53:30.292157 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 00:53:30.613300 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3047604308.mount: Deactivated successfully. 
Mar 7 00:53:31.403173 containerd[1477]: time="2026-03-07T00:53:31.403118723Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:31.404730 containerd[1477]: time="2026-03-07T00:53:31.404663921Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21125601" Mar 7 00:53:31.406041 containerd[1477]: time="2026-03-07T00:53:31.405977553Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:31.411470 containerd[1477]: time="2026-03-07T00:53:31.411438424Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:31.415557 containerd[1477]: time="2026-03-07T00:53:31.415399896Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.431366785s" Mar 7 00:53:31.415557 containerd[1477]: time="2026-03-07T00:53:31.415479985Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\"" Mar 7 00:53:36.777572 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:36.794771 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:36.828374 systemd[1]: Reloading requested from client PID 2140 ('systemctl') (unit session-7.scope)... Mar 7 00:53:36.828530 systemd[1]: Reloading... 
Mar 7 00:53:36.964705 zram_generator::config[2186]: No configuration found. Mar 7 00:53:37.057210 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:53:37.126836 systemd[1]: Reloading finished in 297 ms. Mar 7 00:53:37.196911 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:37.197747 systemd[1]: kubelet.service: Deactivated successfully. Mar 7 00:53:37.198807 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:37.205000 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:37.329024 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:37.333823 (kubelet)[2230]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 00:53:37.378751 kubelet[2230]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 7 00:53:37.378751 kubelet[2230]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 7 00:53:37.378751 kubelet[2230]: I0307 00:53:37.378075 2230 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 7 00:53:38.503920 kubelet[2230]: I0307 00:53:38.503878 2230 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 7 00:53:38.505706 kubelet[2230]: I0307 00:53:38.504375 2230 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 00:53:38.505706 kubelet[2230]: I0307 00:53:38.504420 2230 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 00:53:38.505706 kubelet[2230]: I0307 00:53:38.504427 2230 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 7 00:53:38.505706 kubelet[2230]: I0307 00:53:38.504713 2230 server.go:956] "Client rotation is on, will bootstrap in background" Mar 7 00:53:38.515827 kubelet[2230]: I0307 00:53:38.515791 2230 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 00:53:38.516058 kubelet[2230]: E0307 00:53:38.515801 2230 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://116.202.17.139:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 7 00:53:38.519839 kubelet[2230]: E0307 00:53:38.519783 2230 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 00:53:38.519947 kubelet[2230]: I0307 00:53:38.519843 2230 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." 
Mar 7 00:53:38.521973 kubelet[2230]: I0307 00:53:38.521938 2230 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Mar 7 00:53:38.522322 kubelet[2230]: I0307 00:53:38.522237 2230 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 00:53:38.522410 kubelet[2230]: I0307 00:53:38.522271 2230 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-f47b87f6f2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} 
Mar 7 00:53:38.522586 kubelet[2230]: I0307 00:53:38.522414 2230 topology_manager.go:138] "Creating topology manager with none policy" Mar 7 00:53:38.522586 kubelet[2230]: I0307 00:53:38.522422 2230 container_manager_linux.go:306] "Creating device plugin manager" Mar 7 00:53:38.522586 kubelet[2230]: I0307 00:53:38.522522 2230 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 00:53:38.524808 kubelet[2230]: I0307 00:53:38.524786 2230 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:53:38.526511 kubelet[2230]: I0307 00:53:38.526473 2230 kubelet.go:475] "Attempting to sync node with API server" Mar 7 00:53:38.527174 kubelet[2230]: I0307 00:53:38.526507 2230 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 00:53:38.527174 kubelet[2230]: I0307 00:53:38.527026 2230 kubelet.go:387] "Adding apiserver pod source" Mar 7 00:53:38.527174 kubelet[2230]: I0307 00:53:38.527043 2230 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 00:53:38.527174 kubelet[2230]: E0307 00:53:38.527074 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://116.202.17.139:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-n-f47b87f6f2&limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 7 00:53:38.529728 kubelet[2230]: E0307 00:53:38.528218 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://116.202.17.139:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 7 00:53:38.529728 kubelet[2230]: I0307 00:53:38.528444 2230 kuberuntime_manager.go:291] "Container runtime initialized" 
containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 00:53:38.529728 kubelet[2230]: I0307 00:53:38.529152 2230 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 00:53:38.529728 kubelet[2230]: I0307 00:53:38.529191 2230 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 00:53:38.529728 kubelet[2230]: W0307 00:53:38.529233 2230 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 7 00:53:38.531743 kubelet[2230]: I0307 00:53:38.531715 2230 server.go:1262] "Started kubelet" Mar 7 00:53:38.537889 kubelet[2230]: I0307 00:53:38.537852 2230 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 7 00:53:38.538299 kubelet[2230]: E0307 00:53:38.536140 2230 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://116.202.17.139:6443/api/v1/namespaces/default/events\": dial tcp 116.202.17.139:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-f47b87f6f2.189a68f83278a7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-f47b87f6f2,UID:ci-4081-3-6-n-f47b87f6f2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-f47b87f6f2,},FirstTimestamp:2026-03-07 00:53:38.531657676 +0000 UTC m=+1.194207548,LastTimestamp:2026-03-07 00:53:38.531657676 +0000 UTC m=+1.194207548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-f47b87f6f2,}" Mar 7 00:53:38.540492 kubelet[2230]: I0307 00:53:38.540441 2230 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Mar 7 00:53:38.541722 kubelet[2230]: I0307 00:53:38.541666 2230 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 7 00:53:38.542033 kubelet[2230]: I0307 00:53:38.541964 2230 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 00:53:38.542159 kubelet[2230]: I0307 00:53:38.542142 2230 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 00:53:38.542929 kubelet[2230]: I0307 00:53:38.542908 2230 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 00:53:38.543173 kubelet[2230]: I0307 00:53:38.543142 2230 reconciler.go:29] "Reconciler: start to sync state" Mar 7 00:53:38.543740 kubelet[2230]: I0307 00:53:38.543025 2230 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 00:53:38.548528 kubelet[2230]: I0307 00:53:38.548472 2230 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 00:53:38.551901 kubelet[2230]: I0307 00:53:38.551873 2230 server.go:310] "Adding debug handlers to kubelet server" Mar 7 00:53:38.552222 kubelet[2230]: E0307 00:53:38.542092 2230 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" Mar 7 00:53:38.553794 kubelet[2230]: E0307 00:53:38.553770 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://116.202.17.139:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 7 00:53:38.553996 kubelet[2230]: E0307 00:53:38.553973 2230 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://116.202.17.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-f47b87f6f2?timeout=10s\": dial tcp 116.202.17.139:6443: connect: connection refused" interval="200ms" Mar 7 00:53:38.555530 kubelet[2230]: I0307 00:53:38.555511 2230 factory.go:223] Registration of the systemd container factory successfully Mar 7 00:53:38.555739 kubelet[2230]: I0307 00:53:38.555721 2230 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 00:53:38.557672 kubelet[2230]: I0307 00:53:38.557654 2230 factory.go:223] Registration of the containerd container factory successfully Mar 7 00:53:38.563434 kubelet[2230]: E0307 00:53:38.563409 2230 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 00:53:38.567238 kubelet[2230]: I0307 00:53:38.567165 2230 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 7 00:53:38.568276 kubelet[2230]: I0307 00:53:38.568204 2230 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 7 00:53:38.568276 kubelet[2230]: I0307 00:53:38.568229 2230 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 7 00:53:38.568276 kubelet[2230]: I0307 00:53:38.568262 2230 kubelet.go:2428] "Starting kubelet main sync loop" Mar 7 00:53:38.568508 kubelet[2230]: E0307 00:53:38.568303 2230 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 00:53:38.576805 kubelet[2230]: E0307 00:53:38.576483 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://116.202.17.139:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 7 00:53:38.591322 kubelet[2230]: I0307 00:53:38.591285 2230 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 7 00:53:38.591322 kubelet[2230]: I0307 00:53:38.591313 2230 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 7 00:53:38.591471 kubelet[2230]: I0307 00:53:38.591341 2230 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:53:38.594409 kubelet[2230]: I0307 00:53:38.594362 2230 policy_none.go:49] "None policy: Start" Mar 7 00:53:38.594409 kubelet[2230]: I0307 00:53:38.594390 2230 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 00:53:38.594409 kubelet[2230]: I0307 00:53:38.594403 2230 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 00:53:38.595919 kubelet[2230]: I0307 00:53:38.595883 2230 policy_none.go:47] "Start" Mar 7 00:53:38.601709 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 7 00:53:38.622628 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 7 00:53:38.627613 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 7 00:53:38.638171 kubelet[2230]: E0307 00:53:38.638139 2230 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:53:38.639691 kubelet[2230]: I0307 00:53:38.639444 2230 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 00:53:38.639691 kubelet[2230]: I0307 00:53:38.639467 2230 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:53:38.640226 kubelet[2230]: I0307 00:53:38.640087 2230 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 00:53:38.642724 kubelet[2230]: E0307 00:53:38.642576 2230 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 00:53:38.642724 kubelet[2230]: E0307 00:53:38.642621 2230 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-f47b87f6f2\" not found" Mar 7 00:53:38.682631 systemd[1]: Created slice kubepods-burstable-pod2c0cf808c36cbeb24ba6fbbf8a1bd5ee.slice - libcontainer container kubepods-burstable-pod2c0cf808c36cbeb24ba6fbbf8a1bd5ee.slice. Mar 7 00:53:38.703561 kubelet[2230]: E0307 00:53:38.703320 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.710286 systemd[1]: Created slice kubepods-burstable-pod446506b153dee196cc2ac840c330d9c9.slice - libcontainer container kubepods-burstable-pod446506b153dee196cc2ac840c330d9c9.slice. 
Mar 7 00:53:38.713821 kubelet[2230]: E0307 00:53:38.713791 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.716781 systemd[1]: Created slice kubepods-burstable-podf830d6ea2b7057ea12074408549eb5dd.slice - libcontainer container kubepods-burstable-podf830d6ea2b7057ea12074408549eb5dd.slice. Mar 7 00:53:38.720939 kubelet[2230]: E0307 00:53:38.720866 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.742316 kubelet[2230]: I0307 00:53:38.741806 2230 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.742316 kubelet[2230]: E0307 00:53:38.742268 2230 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://116.202.17.139:6443/api/v1/nodes\": dial tcp 116.202.17.139:6443: connect: connection refused" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.744887 kubelet[2230]: I0307 00:53:38.744859 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745029 kubelet[2230]: I0307 00:53:38.745011 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745114 
kubelet[2230]: I0307 00:53:38.745098 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745748 kubelet[2230]: I0307 00:53:38.745191 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f830d6ea2b7057ea12074408549eb5dd-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-f47b87f6f2\" (UID: \"f830d6ea2b7057ea12074408549eb5dd\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745748 kubelet[2230]: I0307 00:53:38.745234 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2c0cf808c36cbeb24ba6fbbf8a1bd5ee-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" (UID: \"2c0cf808c36cbeb24ba6fbbf8a1bd5ee\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745748 kubelet[2230]: I0307 00:53:38.745252 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2c0cf808c36cbeb24ba6fbbf8a1bd5ee-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" (UID: \"2c0cf808c36cbeb24ba6fbbf8a1bd5ee\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745748 kubelet[2230]: I0307 00:53:38.745272 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: 
\"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745748 kubelet[2230]: I0307 00:53:38.745290 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2c0cf808c36cbeb24ba6fbbf8a1bd5ee-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" (UID: \"2c0cf808c36cbeb24ba6fbbf8a1bd5ee\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.745933 kubelet[2230]: I0307 00:53:38.745307 2230 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.754838 kubelet[2230]: E0307 00:53:38.754715 2230 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://116.202.17.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-f47b87f6f2?timeout=10s\": dial tcp 116.202.17.139:6443: connect: connection refused" interval="400ms" Mar 7 00:53:38.945239 kubelet[2230]: I0307 00:53:38.945204 2230 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:38.945622 kubelet[2230]: E0307 00:53:38.945594 2230 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://116.202.17.139:6443/api/v1/nodes\": dial tcp 116.202.17.139:6443: connect: connection refused" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:39.007839 containerd[1477]: time="2026-03-07T00:53:39.007700039Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-f47b87f6f2,Uid:2c0cf808c36cbeb24ba6fbbf8a1bd5ee,Namespace:kube-system,Attempt:0,}" Mar 7 00:53:39.018214 containerd[1477]: time="2026-03-07T00:53:39.018155603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-f47b87f6f2,Uid:446506b153dee196cc2ac840c330d9c9,Namespace:kube-system,Attempt:0,}" Mar 7 00:53:39.024541 containerd[1477]: time="2026-03-07T00:53:39.024394254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-f47b87f6f2,Uid:f830d6ea2b7057ea12074408549eb5dd,Namespace:kube-system,Attempt:0,}" Mar 7 00:53:39.156221 kubelet[2230]: E0307 00:53:39.156143 2230 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://116.202.17.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-f47b87f6f2?timeout=10s\": dial tcp 116.202.17.139:6443: connect: connection refused" interval="800ms" Mar 7 00:53:39.348721 kubelet[2230]: I0307 00:53:39.348638 2230 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:39.349201 kubelet[2230]: E0307 00:53:39.349156 2230 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://116.202.17.139:6443/api/v1/nodes\": dial tcp 116.202.17.139:6443: connect: connection refused" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:39.542439 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1683217059.mount: Deactivated successfully. 
Mar 7 00:53:39.552916 containerd[1477]: time="2026-03-07T00:53:39.552794251Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:53:39.554553 containerd[1477]: time="2026-03-07T00:53:39.554512997Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Mar 7 00:53:39.555715 containerd[1477]: time="2026-03-07T00:53:39.555216229Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:53:39.558702 containerd[1477]: time="2026-03-07T00:53:39.557328489Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:53:39.558702 containerd[1477]: time="2026-03-07T00:53:39.558425916Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:53:39.559453 containerd[1477]: time="2026-03-07T00:53:39.559426616Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 00:53:39.560021 containerd[1477]: time="2026-03-07T00:53:39.559984158Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 00:53:39.562631 containerd[1477]: time="2026-03-07T00:53:39.562576289Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 00:53:39.564068 
containerd[1477]: time="2026-03-07T00:53:39.564037484Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 556.223062ms" Mar 7 00:53:39.565238 containerd[1477]: time="2026-03-07T00:53:39.565184876Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 540.716274ms" Mar 7 00:53:39.569648 containerd[1477]: time="2026-03-07T00:53:39.569470330Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 551.197862ms" Mar 7 00:53:39.576422 kubelet[2230]: E0307 00:53:39.576384 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://116.202.17.139:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 7 00:53:39.613772 kubelet[2230]: E0307 00:53:39.610916 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://116.202.17.139:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Mar 7 00:53:39.643536 kubelet[2230]: E0307 00:53:39.643475 2230 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://116.202.17.139:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-n-f47b87f6f2&limit=500&resourceVersion=0\": dial tcp 116.202.17.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 7 00:53:39.693714 containerd[1477]: time="2026-03-07T00:53:39.692715705Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:53:39.693714 containerd[1477]: time="2026-03-07T00:53:39.692774638Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:53:39.693714 containerd[1477]: time="2026-03-07T00:53:39.692790412Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:39.697225 containerd[1477]: time="2026-03-07T00:53:39.694844460Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:39.702065 containerd[1477]: time="2026-03-07T00:53:39.700699086Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:53:39.702065 containerd[1477]: time="2026-03-07T00:53:39.700758499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:53:39.702065 containerd[1477]: time="2026-03-07T00:53:39.700778958Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:39.702065 containerd[1477]: time="2026-03-07T00:53:39.700867557Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:39.704424 containerd[1477]: time="2026-03-07T00:53:39.704337879Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:53:39.704535 containerd[1477]: time="2026-03-07T00:53:39.704398133Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:53:39.704535 containerd[1477]: time="2026-03-07T00:53:39.704414227Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:39.704535 containerd[1477]: time="2026-03-07T00:53:39.704488654Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:39.732856 systemd[1]: Started cri-containerd-502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc.scope - libcontainer container 502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc. Mar 7 00:53:39.743851 systemd[1]: Started cri-containerd-402cfa0d120170feecf0f41e041b7683238811cc61526fe12dc98208c74424d5.scope - libcontainer container 402cfa0d120170feecf0f41e041b7683238811cc61526fe12dc98208c74424d5. Mar 7 00:53:39.746420 systemd[1]: Started cri-containerd-e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7.scope - libcontainer container e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7. 
Mar 7 00:53:39.794085 containerd[1477]: time="2026-03-07T00:53:39.794036199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-f47b87f6f2,Uid:446506b153dee196cc2ac840c330d9c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc\"" Mar 7 00:53:39.801906 containerd[1477]: time="2026-03-07T00:53:39.801864640Z" level=info msg="CreateContainer within sandbox \"502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 7 00:53:39.808090 containerd[1477]: time="2026-03-07T00:53:39.808047562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-f47b87f6f2,Uid:2c0cf808c36cbeb24ba6fbbf8a1bd5ee,Namespace:kube-system,Attempt:0,} returns sandbox id \"402cfa0d120170feecf0f41e041b7683238811cc61526fe12dc98208c74424d5\"" Mar 7 00:53:39.814537 containerd[1477]: time="2026-03-07T00:53:39.814457567Z" level=info msg="CreateContainer within sandbox \"402cfa0d120170feecf0f41e041b7683238811cc61526fe12dc98208c74424d5\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 7 00:53:39.819242 containerd[1477]: time="2026-03-07T00:53:39.819192346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-f47b87f6f2,Uid:f830d6ea2b7057ea12074408549eb5dd,Namespace:kube-system,Attempt:0,} returns sandbox id \"e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7\"" Mar 7 00:53:39.824580 containerd[1477]: time="2026-03-07T00:53:39.824542959Z" level=info msg="CreateContainer within sandbox \"e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 7 00:53:39.825045 containerd[1477]: time="2026-03-07T00:53:39.824946362Z" level=info msg="CreateContainer within sandbox 
\"502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925\"" Mar 7 00:53:39.826973 containerd[1477]: time="2026-03-07T00:53:39.826948042Z" level=info msg="StartContainer for \"62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925\"" Mar 7 00:53:39.840540 containerd[1477]: time="2026-03-07T00:53:39.840408830Z" level=info msg="CreateContainer within sandbox \"402cfa0d120170feecf0f41e041b7683238811cc61526fe12dc98208c74424d5\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f8fcffb42f89217518d6641d057f342d5dcd1f05f6db2eb89ef6facc2aa98b34\"" Mar 7 00:53:39.841752 containerd[1477]: time="2026-03-07T00:53:39.841126996Z" level=info msg="StartContainer for \"f8fcffb42f89217518d6641d057f342d5dcd1f05f6db2eb89ef6facc2aa98b34\"" Mar 7 00:53:39.843396 containerd[1477]: time="2026-03-07T00:53:39.843367050Z" level=info msg="CreateContainer within sandbox \"e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d\"" Mar 7 00:53:39.844032 containerd[1477]: time="2026-03-07T00:53:39.844008627Z" level=info msg="StartContainer for \"163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d\"" Mar 7 00:53:39.858860 systemd[1]: Started cri-containerd-62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925.scope - libcontainer container 62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925. Mar 7 00:53:39.895880 systemd[1]: Started cri-containerd-163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d.scope - libcontainer container 163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d. 
Mar 7 00:53:39.898342 systemd[1]: Started cri-containerd-f8fcffb42f89217518d6641d057f342d5dcd1f05f6db2eb89ef6facc2aa98b34.scope - libcontainer container f8fcffb42f89217518d6641d057f342d5dcd1f05f6db2eb89ef6facc2aa98b34. Mar 7 00:53:39.917370 containerd[1477]: time="2026-03-07T00:53:39.917315244Z" level=info msg="StartContainer for \"62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925\" returns successfully" Mar 7 00:53:39.957901 kubelet[2230]: E0307 00:53:39.957844 2230 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://116.202.17.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-f47b87f6f2?timeout=10s\": dial tcp 116.202.17.139:6443: connect: connection refused" interval="1.6s" Mar 7 00:53:39.958813 containerd[1477]: time="2026-03-07T00:53:39.958292942Z" level=info msg="StartContainer for \"f8fcffb42f89217518d6641d057f342d5dcd1f05f6db2eb89ef6facc2aa98b34\" returns successfully" Mar 7 00:53:39.968481 containerd[1477]: time="2026-03-07T00:53:39.968421012Z" level=info msg="StartContainer for \"163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d\" returns successfully" Mar 7 00:53:40.153043 kubelet[2230]: I0307 00:53:40.152671 2230 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:40.613669 kubelet[2230]: E0307 00:53:40.613479 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:40.615586 kubelet[2230]: E0307 00:53:40.615273 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:40.618705 kubelet[2230]: E0307 00:53:40.617752 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:41.619879 kubelet[2230]: E0307 00:53:41.619414 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:41.619879 kubelet[2230]: E0307 00:53:41.619753 2230 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.166361 kubelet[2230]: E0307 00:53:42.166327 2230 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-n-f47b87f6f2\" not found" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.298714 kubelet[2230]: I0307 00:53:42.297013 2230 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.298714 kubelet[2230]: E0307 00:53:42.297054 2230 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-n-f47b87f6f2\": node \"ci-4081-3-6-n-f47b87f6f2\" not found" Mar 7 00:53:42.343184 kubelet[2230]: I0307 00:53:42.342602 2230 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.353968 kubelet[2230]: E0307 00:53:42.353930 2230 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.354162 kubelet[2230]: I0307 00:53:42.354146 2230 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.358048 kubelet[2230]: E0307 00:53:42.357877 2230 kubelet.go:3222] "Failed creating a mirror pod" err="pods 
\"kube-scheduler-ci-4081-3-6-n-f47b87f6f2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.358048 kubelet[2230]: I0307 00:53:42.357910 2230 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.360728 kubelet[2230]: E0307 00:53:42.359786 2230 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.530035 kubelet[2230]: I0307 00:53:42.529844 2230 apiserver.go:52] "Watching apiserver" Mar 7 00:53:42.543945 kubelet[2230]: I0307 00:53:42.543833 2230 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 00:53:42.797338 kubelet[2230]: I0307 00:53:42.797168 2230 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:42.801105 kubelet[2230]: E0307 00:53:42.801053 2230 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:43.896445 kubelet[2230]: I0307 00:53:43.896168 2230 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:44.298525 systemd[1]: Reloading requested from client PID 2510 ('systemctl') (unit session-7.scope)... Mar 7 00:53:44.298544 systemd[1]: Reloading... Mar 7 00:53:44.386710 zram_generator::config[2550]: No configuration found. 
Mar 7 00:53:44.501957 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 00:53:44.603156 systemd[1]: Reloading finished in 304 ms. Mar 7 00:53:44.639913 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:44.656395 systemd[1]: kubelet.service: Deactivated successfully. Mar 7 00:53:44.657074 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:44.657257 systemd[1]: kubelet.service: Consumed 1.610s CPU time, 123.0M memory peak, 0B memory swap peak. Mar 7 00:53:44.667476 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 00:53:44.785750 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 00:53:44.792167 (kubelet)[2596]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 00:53:44.860806 kubelet[2596]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 7 00:53:44.860806 kubelet[2596]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 7 00:53:44.861571 kubelet[2596]: I0307 00:53:44.861152 2596 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 7 00:53:44.871284 kubelet[2596]: I0307 00:53:44.871239 2596 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 7 00:53:44.871284 kubelet[2596]: I0307 00:53:44.871270 2596 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 00:53:44.871284 kubelet[2596]: I0307 00:53:44.871293 2596 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 00:53:44.871460 kubelet[2596]: I0307 00:53:44.871302 2596 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 7 00:53:44.871577 kubelet[2596]: I0307 00:53:44.871543 2596 server.go:956] "Client rotation is on, will bootstrap in background" Mar 7 00:53:44.873143 kubelet[2596]: I0307 00:53:44.873123 2596 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 7 00:53:44.875673 kubelet[2596]: I0307 00:53:44.875520 2596 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 00:53:44.878807 kubelet[2596]: E0307 00:53:44.878467 2596 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 00:53:44.878807 kubelet[2596]: I0307 00:53:44.878522 2596 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Mar 7 00:53:44.881097 kubelet[2596]: I0307 00:53:44.881073 2596 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 7 00:53:44.881464 kubelet[2596]: I0307 00:53:44.881436 2596 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 00:53:44.881719 kubelet[2596]: I0307 00:53:44.881537 2596 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-f47b87f6f2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 00:53:44.881848 kubelet[2596]: I0307 00:53:44.881834 2596 topology_manager.go:138] "Creating topology manager with none policy" Mar 7 
00:53:44.882176 kubelet[2596]: I0307 00:53:44.881895 2596 container_manager_linux.go:306] "Creating device plugin manager" Mar 7 00:53:44.882176 kubelet[2596]: I0307 00:53:44.881925 2596 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 00:53:44.882176 kubelet[2596]: I0307 00:53:44.882133 2596 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:53:44.882425 kubelet[2596]: I0307 00:53:44.882411 2596 kubelet.go:475] "Attempting to sync node with API server" Mar 7 00:53:44.882509 kubelet[2596]: I0307 00:53:44.882498 2596 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 00:53:44.882580 kubelet[2596]: I0307 00:53:44.882571 2596 kubelet.go:387] "Adding apiserver pod source" Mar 7 00:53:44.882633 kubelet[2596]: I0307 00:53:44.882625 2596 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 00:53:44.885984 kubelet[2596]: I0307 00:53:44.885963 2596 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 00:53:44.887715 kubelet[2596]: I0307 00:53:44.886754 2596 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 00:53:44.887715 kubelet[2596]: I0307 00:53:44.886786 2596 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 00:53:44.890474 kubelet[2596]: I0307 00:53:44.890455 2596 server.go:1262] "Started kubelet" Mar 7 00:53:44.895110 kubelet[2596]: I0307 00:53:44.895077 2596 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 7 00:53:44.898103 kubelet[2596]: I0307 00:53:44.898065 2596 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 00:53:44.905605 kubelet[2596]: I0307 00:53:44.905568 2596 server.go:310] "Adding debug handlers to kubelet server" 
Mar 7 00:53:44.906313 kubelet[2596]: I0307 00:53:44.901839 2596 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 00:53:44.912291 kubelet[2596]: I0307 00:53:44.912257 2596 factory.go:223] Registration of the systemd container factory successfully Mar 7 00:53:44.912390 kubelet[2596]: I0307 00:53:44.912363 2596 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 00:53:44.923369 kubelet[2596]: I0307 00:53:44.898347 2596 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 00:53:44.923610 kubelet[2596]: I0307 00:53:44.923559 2596 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 00:53:44.923856 kubelet[2596]: I0307 00:53:44.923842 2596 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 00:53:44.923945 kubelet[2596]: I0307 00:53:44.903630 2596 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 7 00:53:44.927102 kubelet[2596]: I0307 00:53:44.903640 2596 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 00:53:44.927102 kubelet[2596]: E0307 00:53:44.903787 2596 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-f47b87f6f2\" not found" Mar 7 00:53:44.928425 kubelet[2596]: I0307 00:53:44.928202 2596 reconciler.go:29] "Reconciler: start to sync state" Mar 7 00:53:44.932359 kubelet[2596]: I0307 00:53:44.932327 2596 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 7 00:53:44.933546 kubelet[2596]: I0307 00:53:44.933526 2596 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 7 00:53:44.934014 kubelet[2596]: I0307 00:53:44.933676 2596 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 7 00:53:44.934014 kubelet[2596]: I0307 00:53:44.933736 2596 kubelet.go:2428] "Starting kubelet main sync loop" Mar 7 00:53:44.934014 kubelet[2596]: E0307 00:53:44.933777 2596 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 00:53:44.951480 kubelet[2596]: E0307 00:53:44.951334 2596 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 00:53:44.953959 kubelet[2596]: I0307 00:53:44.953937 2596 factory.go:223] Registration of the containerd container factory successfully Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009469 2596 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009495 2596 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009517 2596 state_mem.go:36] "Initialized new in-memory state store" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009648 2596 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009657 2596 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009673 2596 policy_none.go:49] "None policy: Start" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009709 2596 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009720 2596 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009815 2596 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" 
Mar 7 00:53:45.010527 kubelet[2596]: I0307 00:53:45.009822 2596 policy_none.go:47] "Start" Mar 7 00:53:45.015868 kubelet[2596]: E0307 00:53:45.015837 2596 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 00:53:45.016033 kubelet[2596]: I0307 00:53:45.016012 2596 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 00:53:45.016089 kubelet[2596]: I0307 00:53:45.016031 2596 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 00:53:45.016521 kubelet[2596]: I0307 00:53:45.016499 2596 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 00:53:45.017890 kubelet[2596]: E0307 00:53:45.017816 2596 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 00:53:45.035148 kubelet[2596]: I0307 00:53:45.034848 2596 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.035526 kubelet[2596]: I0307 00:53:45.035512 2596 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.036015 kubelet[2596]: I0307 00:53:45.036000 2596 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.047892 kubelet[2596]: E0307 00:53:45.047859 2596 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-f47b87f6f2\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.120088 kubelet[2596]: I0307 00:53:45.119963 2596 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.129950 kubelet[2596]: I0307 00:53:45.129405 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2c0cf808c36cbeb24ba6fbbf8a1bd5ee-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" (UID: \"2c0cf808c36cbeb24ba6fbbf8a1bd5ee\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.129950 kubelet[2596]: I0307 00:53:45.129480 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.129950 kubelet[2596]: I0307 00:53:45.129520 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.129950 kubelet[2596]: I0307 00:53:45.129552 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f830d6ea2b7057ea12074408549eb5dd-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-f47b87f6f2\" (UID: \"f830d6ea2b7057ea12074408549eb5dd\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.129950 kubelet[2596]: I0307 00:53:45.129590 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2c0cf808c36cbeb24ba6fbbf8a1bd5ee-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" (UID: \"2c0cf808c36cbeb24ba6fbbf8a1bd5ee\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.130342 
kubelet[2596]: I0307 00:53:45.129619 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2c0cf808c36cbeb24ba6fbbf8a1bd5ee-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" (UID: \"2c0cf808c36cbeb24ba6fbbf8a1bd5ee\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.130342 kubelet[2596]: I0307 00:53:45.129664 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.130342 kubelet[2596]: I0307 00:53:45.129712 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.130342 kubelet[2596]: I0307 00:53:45.129752 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/446506b153dee196cc2ac840c330d9c9-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-f47b87f6f2\" (UID: \"446506b153dee196cc2ac840c330d9c9\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.137099 kubelet[2596]: I0307 00:53:45.137037 2596 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.137560 kubelet[2596]: I0307 00:53:45.137527 2596 kubelet_node_status.go:78] "Successfully registered node" 
node="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.884667 kubelet[2596]: I0307 00:53:45.884283 2596 apiserver.go:52] "Watching apiserver" Mar 7 00:53:45.926311 kubelet[2596]: I0307 00:53:45.926256 2596 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 00:53:45.988488 kubelet[2596]: I0307 00:53:45.987913 2596 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:45.998488 kubelet[2596]: E0307 00:53:45.997899 2596 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-f47b87f6f2\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" Mar 7 00:53:46.032077 kubelet[2596]: I0307 00:53:46.031990 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-f47b87f6f2" podStartSLOduration=3.031972231 podStartE2EDuration="3.031972231s" podCreationTimestamp="2026-03-07 00:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:53:46.020511203 +0000 UTC m=+1.221284926" watchObservedRunningTime="2026-03-07 00:53:46.031972231 +0000 UTC m=+1.232745914" Mar 7 00:53:46.050085 kubelet[2596]: I0307 00:53:46.049814 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-f47b87f6f2" podStartSLOduration=1.049793755 podStartE2EDuration="1.049793755s" podCreationTimestamp="2026-03-07 00:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:53:46.032655168 +0000 UTC m=+1.233428851" watchObservedRunningTime="2026-03-07 00:53:46.049793755 +0000 UTC m=+1.250567479" Mar 7 00:53:46.050085 kubelet[2596]: I0307 00:53:46.049919 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-controller-manager-ci-4081-3-6-n-f47b87f6f2" podStartSLOduration=1.049912357 podStartE2EDuration="1.049912357s" podCreationTimestamp="2026-03-07 00:53:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:53:46.049493088 +0000 UTC m=+1.250266771" watchObservedRunningTime="2026-03-07 00:53:46.049912357 +0000 UTC m=+1.250686040" Mar 7 00:53:49.534498 kubelet[2596]: I0307 00:53:49.534419 2596 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 7 00:53:49.536734 containerd[1477]: time="2026-03-07T00:53:49.535533245Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 7 00:53:49.537273 kubelet[2596]: I0307 00:53:49.535871 2596 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 7 00:53:49.733919 systemd[1]: Created slice kubepods-besteffort-pod2db855e6_34d6_4e9d_80df_9be0c94df127.slice - libcontainer container kubepods-besteffort-pod2db855e6_34d6_4e9d_80df_9be0c94df127.slice. 
Mar 7 00:53:49.758477 kubelet[2596]: I0307 00:53:49.758224 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2db855e6-34d6-4e9d-80df-9be0c94df127-xtables-lock\") pod \"kube-proxy-hgvtl\" (UID: \"2db855e6-34d6-4e9d-80df-9be0c94df127\") " pod="kube-system/kube-proxy-hgvtl" Mar 7 00:53:49.758477 kubelet[2596]: I0307 00:53:49.758282 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2db855e6-34d6-4e9d-80df-9be0c94df127-kube-proxy\") pod \"kube-proxy-hgvtl\" (UID: \"2db855e6-34d6-4e9d-80df-9be0c94df127\") " pod="kube-system/kube-proxy-hgvtl" Mar 7 00:53:49.758477 kubelet[2596]: I0307 00:53:49.758318 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2db855e6-34d6-4e9d-80df-9be0c94df127-lib-modules\") pod \"kube-proxy-hgvtl\" (UID: \"2db855e6-34d6-4e9d-80df-9be0c94df127\") " pod="kube-system/kube-proxy-hgvtl" Mar 7 00:53:49.758477 kubelet[2596]: I0307 00:53:49.758350 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rngjp\" (UniqueName: \"kubernetes.io/projected/2db855e6-34d6-4e9d-80df-9be0c94df127-kube-api-access-rngjp\") pod \"kube-proxy-hgvtl\" (UID: \"2db855e6-34d6-4e9d-80df-9be0c94df127\") " pod="kube-system/kube-proxy-hgvtl" Mar 7 00:53:49.871306 kubelet[2596]: E0307 00:53:49.871228 2596 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 7 00:53:49.871306 kubelet[2596]: E0307 00:53:49.871261 2596 projected.go:196] Error preparing data for projected volume kube-api-access-rngjp for pod kube-system/kube-proxy-hgvtl: configmap "kube-root-ca.crt" not found Mar 7 00:53:49.871747 kubelet[2596]: E0307 00:53:49.871514 2596 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db855e6-34d6-4e9d-80df-9be0c94df127-kube-api-access-rngjp podName:2db855e6-34d6-4e9d-80df-9be0c94df127 nodeName:}" failed. No retries permitted until 2026-03-07 00:53:50.371487955 +0000 UTC m=+5.572261638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rngjp" (UniqueName: "kubernetes.io/projected/2db855e6-34d6-4e9d-80df-9be0c94df127-kube-api-access-rngjp") pod "kube-proxy-hgvtl" (UID: "2db855e6-34d6-4e9d-80df-9be0c94df127") : configmap "kube-root-ca.crt" not found Mar 7 00:53:50.644237 containerd[1477]: time="2026-03-07T00:53:50.644072639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hgvtl,Uid:2db855e6-34d6-4e9d-80df-9be0c94df127,Namespace:kube-system,Attempt:0,}" Mar 7 00:53:50.668573 containerd[1477]: time="2026-03-07T00:53:50.667833325Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:53:50.668573 containerd[1477]: time="2026-03-07T00:53:50.668310278Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:53:50.668573 containerd[1477]: time="2026-03-07T00:53:50.668324209Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:50.668573 containerd[1477]: time="2026-03-07T00:53:50.668528057Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:50.687119 systemd[1]: Started cri-containerd-538004254cdf563b05b86441e71deacf43fd05e200a6e80542db8b05d5af8ac3.scope - libcontainer container 538004254cdf563b05b86441e71deacf43fd05e200a6e80542db8b05d5af8ac3. 
Mar 7 00:53:50.736498 containerd[1477]: time="2026-03-07T00:53:50.736285533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hgvtl,Uid:2db855e6-34d6-4e9d-80df-9be0c94df127,Namespace:kube-system,Attempt:0,} returns sandbox id \"538004254cdf563b05b86441e71deacf43fd05e200a6e80542db8b05d5af8ac3\"" Mar 7 00:53:50.744185 containerd[1477]: time="2026-03-07T00:53:50.744110337Z" level=info msg="CreateContainer within sandbox \"538004254cdf563b05b86441e71deacf43fd05e200a6e80542db8b05d5af8ac3\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 7 00:53:50.770239 containerd[1477]: time="2026-03-07T00:53:50.770182887Z" level=info msg="CreateContainer within sandbox \"538004254cdf563b05b86441e71deacf43fd05e200a6e80542db8b05d5af8ac3\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e06de83210bf735a748cab87363d9f2470b1b29f0413c0560af2204b295614fb\"" Mar 7 00:53:50.774724 containerd[1477]: time="2026-03-07T00:53:50.771152485Z" level=info msg="StartContainer for \"e06de83210bf735a748cab87363d9f2470b1b29f0413c0560af2204b295614fb\"" Mar 7 00:53:50.801030 kubelet[2596]: E0307 00:53:50.800978 2596 status_manager.go:1018] "Failed to get status for pod" err="pods \"tigera-operator-5588576f44-k86lz\" is forbidden: User \"system:node:ci-4081-3-6-n-f47b87f6f2\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4081-3-6-n-f47b87f6f2' and this object" podUID="a7533f1c-7388-470f-80e0-8fff8344eb24" pod="tigera-operator/tigera-operator-5588576f44-k86lz" Mar 7 00:53:50.801373 kubelet[2596]: E0307 00:53:50.801070 2596 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:ci-4081-3-6-n-f47b87f6f2\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4081-3-6-n-f47b87f6f2' and this object" 
logger="UnhandledError" reflector="object-\"tigera-operator\"/\"kubernetes-services-endpoint\"" type="*v1.ConfigMap" Mar 7 00:53:50.801373 kubelet[2596]: E0307 00:53:50.801110 2596 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4081-3-6-n-f47b87f6f2\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4081-3-6-n-f47b87f6f2' and this object" logger="UnhandledError" reflector="object-\"tigera-operator\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Mar 7 00:53:50.809046 systemd[1]: Created slice kubepods-besteffort-poda7533f1c_7388_470f_80e0_8fff8344eb24.slice - libcontainer container kubepods-besteffort-poda7533f1c_7388_470f_80e0_8fff8344eb24.slice. Mar 7 00:53:50.820880 systemd[1]: Started cri-containerd-e06de83210bf735a748cab87363d9f2470b1b29f0413c0560af2204b295614fb.scope - libcontainer container e06de83210bf735a748cab87363d9f2470b1b29f0413c0560af2204b295614fb. 
Mar 7 00:53:50.867975 kubelet[2596]: I0307 00:53:50.867935 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq76n\" (UniqueName: \"kubernetes.io/projected/a7533f1c-7388-470f-80e0-8fff8344eb24-kube-api-access-zq76n\") pod \"tigera-operator-5588576f44-k86lz\" (UID: \"a7533f1c-7388-470f-80e0-8fff8344eb24\") " pod="tigera-operator/tigera-operator-5588576f44-k86lz" Mar 7 00:53:50.868130 kubelet[2596]: I0307 00:53:50.868118 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a7533f1c-7388-470f-80e0-8fff8344eb24-var-lib-calico\") pod \"tigera-operator-5588576f44-k86lz\" (UID: \"a7533f1c-7388-470f-80e0-8fff8344eb24\") " pod="tigera-operator/tigera-operator-5588576f44-k86lz" Mar 7 00:53:50.880878 containerd[1477]: time="2026-03-07T00:53:50.880640165Z" level=info msg="StartContainer for \"e06de83210bf735a748cab87363d9f2470b1b29f0413c0560af2204b295614fb\" returns successfully" Mar 7 00:53:51.017757 kubelet[2596]: I0307 00:53:51.017520 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hgvtl" podStartSLOduration=2.015674413 podStartE2EDuration="2.015674413s" podCreationTimestamp="2026-03-07 00:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:53:51.014000545 +0000 UTC m=+6.214774268" watchObservedRunningTime="2026-03-07 00:53:51.015674413 +0000 UTC m=+6.216448136" Mar 7 00:53:51.476527 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount289165873.mount: Deactivated successfully. 
Mar 7 00:53:51.717564 containerd[1477]: time="2026-03-07T00:53:51.717505649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-k86lz,Uid:a7533f1c-7388-470f-80e0-8fff8344eb24,Namespace:tigera-operator,Attempt:0,}" Mar 7 00:53:51.748576 containerd[1477]: time="2026-03-07T00:53:51.747954443Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:53:51.748576 containerd[1477]: time="2026-03-07T00:53:51.748422929Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:53:51.748576 containerd[1477]: time="2026-03-07T00:53:51.748492303Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:51.748870 containerd[1477]: time="2026-03-07T00:53:51.748634014Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:53:51.773866 systemd[1]: Started cri-containerd-94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1.scope - libcontainer container 94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1. Mar 7 00:53:51.812936 containerd[1477]: time="2026-03-07T00:53:51.812858361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-k86lz,Uid:a7533f1c-7388-470f-80e0-8fff8344eb24,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1\"" Mar 7 00:53:51.818171 containerd[1477]: time="2026-03-07T00:53:51.818086887Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 00:53:52.472624 systemd[1]: run-containerd-runc-k8s.io-94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1-runc.SA3C22.mount: Deactivated successfully. 
Mar 7 00:53:53.301149 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3316814512.mount: Deactivated successfully. Mar 7 00:53:53.716419 containerd[1477]: time="2026-03-07T00:53:53.716358160Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:53.717623 containerd[1477]: time="2026-03-07T00:53:53.717568373Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 7 00:53:53.719059 containerd[1477]: time="2026-03-07T00:53:53.719010150Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:53.722358 containerd[1477]: time="2026-03-07T00:53:53.722308235Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:53:53.723916 containerd[1477]: time="2026-03-07T00:53:53.723844958Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 1.905674888s" Mar 7 00:53:53.723916 containerd[1477]: time="2026-03-07T00:53:53.723887709Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 7 00:53:53.728594 containerd[1477]: time="2026-03-07T00:53:53.728566688Z" level=info msg="CreateContainer within sandbox \"94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 
00:53:53.744718 containerd[1477]: time="2026-03-07T00:53:53.744654631Z" level=info msg="CreateContainer within sandbox \"94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158\"" Mar 7 00:53:53.746024 containerd[1477]: time="2026-03-07T00:53:53.745883497Z" level=info msg="StartContainer for \"f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158\"" Mar 7 00:53:53.776965 systemd[1]: Started cri-containerd-f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158.scope - libcontainer container f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158. Mar 7 00:53:53.811313 containerd[1477]: time="2026-03-07T00:53:53.811038757Z" level=info msg="StartContainer for \"f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158\" returns successfully" Mar 7 00:53:54.028772 kubelet[2596]: I0307 00:53:54.028560 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-k86lz" podStartSLOduration=2.120443699 podStartE2EDuration="4.02854386s" podCreationTimestamp="2026-03-07 00:53:50 +0000 UTC" firstStartedPulling="2026-03-07 00:53:51.816552207 +0000 UTC m=+7.017325890" lastFinishedPulling="2026-03-07 00:53:53.724652368 +0000 UTC m=+8.925426051" observedRunningTime="2026-03-07 00:53:54.028057654 +0000 UTC m=+9.228831377" watchObservedRunningTime="2026-03-07 00:53:54.02854386 +0000 UTC m=+9.229317583" Mar 7 00:53:59.883811 sudo[1732]: pam_unix(sudo:session): session closed for user root Mar 7 00:53:59.981126 sshd[1714]: pam_unix(sshd:session): session closed for user core Mar 7 00:53:59.987245 systemd[1]: sshd@6-116.202.17.139:22-20.161.92.111:37134.service: Deactivated successfully. Mar 7 00:53:59.991793 systemd[1]: session-7.scope: Deactivated successfully. 
Mar 7 00:53:59.992096 systemd[1]: session-7.scope: Consumed 7.605s CPU time, 152.0M memory peak, 0B memory swap peak. Mar 7 00:53:59.994649 systemd-logind[1459]: Session 7 logged out. Waiting for processes to exit. Mar 7 00:53:59.997094 systemd-logind[1459]: Removed session 7. Mar 7 00:54:06.706016 systemd[1]: Created slice kubepods-besteffort-podc98f70c7_fe48_4d2a_8b8f_edccffa02027.slice - libcontainer container kubepods-besteffort-podc98f70c7_fe48_4d2a_8b8f_edccffa02027.slice. Mar 7 00:54:06.771116 kubelet[2596]: I0307 00:54:06.770987 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c98f70c7-fe48-4d2a-8b8f-edccffa02027-tigera-ca-bundle\") pod \"calico-typha-5dfd975ddc-qjwpx\" (UID: \"c98f70c7-fe48-4d2a-8b8f-edccffa02027\") " pod="calico-system/calico-typha-5dfd975ddc-qjwpx" Mar 7 00:54:06.771116 kubelet[2596]: I0307 00:54:06.771027 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c98f70c7-fe48-4d2a-8b8f-edccffa02027-typha-certs\") pod \"calico-typha-5dfd975ddc-qjwpx\" (UID: \"c98f70c7-fe48-4d2a-8b8f-edccffa02027\") " pod="calico-system/calico-typha-5dfd975ddc-qjwpx" Mar 7 00:54:06.771116 kubelet[2596]: I0307 00:54:06.771045 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qbz\" (UniqueName: \"kubernetes.io/projected/c98f70c7-fe48-4d2a-8b8f-edccffa02027-kube-api-access-g8qbz\") pod \"calico-typha-5dfd975ddc-qjwpx\" (UID: \"c98f70c7-fe48-4d2a-8b8f-edccffa02027\") " pod="calico-system/calico-typha-5dfd975ddc-qjwpx" Mar 7 00:54:06.872106 kubelet[2596]: I0307 00:54:06.871515 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-cni-bin-dir\") pod 
\"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.875903 kubelet[2596]: I0307 00:54:06.874797 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-cni-net-dir\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.875903 kubelet[2596]: I0307 00:54:06.875290 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/686e27f3-19b0-4a26-8415-55ead591442f-node-certs\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.875903 kubelet[2596]: I0307 00:54:06.875323 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-bpffs\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.875903 kubelet[2596]: I0307 00:54:06.875385 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/686e27f3-19b0-4a26-8415-55ead591442f-tigera-ca-bundle\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.875903 kubelet[2596]: I0307 00:54:06.875404 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-xtables-lock\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876116 kubelet[2596]: I0307 00:54:06.875421 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-lib-modules\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876116 kubelet[2596]: I0307 00:54:06.875440 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-policysync\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876116 kubelet[2596]: I0307 00:54:06.875459 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-cni-log-dir\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876116 kubelet[2596]: I0307 00:54:06.875480 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-sys-fs\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876116 kubelet[2596]: I0307 00:54:06.875495 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-var-lib-calico\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876240 kubelet[2596]: I0307 00:54:06.875513 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-var-run-calico\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876240 kubelet[2596]: I0307 00:54:06.875547 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vjd\" (UniqueName: \"kubernetes.io/projected/686e27f3-19b0-4a26-8415-55ead591442f-kube-api-access-w7vjd\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876240 kubelet[2596]: I0307 00:54:06.875569 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-nodeproc\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.876240 kubelet[2596]: I0307 00:54:06.875589 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/686e27f3-19b0-4a26-8415-55ead591442f-flexvol-driver-host\") pod \"calico-node-cptjl\" (UID: \"686e27f3-19b0-4a26-8415-55ead591442f\") " pod="calico-system/calico-node-cptjl"
Mar 7 00:54:06.883636 systemd[1]: Created slice kubepods-besteffort-pod686e27f3_19b0_4a26_8415_55ead591442f.slice - libcontainer container kubepods-besteffort-pod686e27f3_19b0_4a26_8415_55ead591442f.slice.
Mar 7 00:54:06.972347 kubelet[2596]: E0307 00:54:06.971662 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676"
Mar 7 00:54:06.979844 kubelet[2596]: E0307 00:54:06.979723 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 00:54:06.979844 kubelet[2596]: W0307 00:54:06.979764 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 00:54:06.979844 kubelet[2596]: E0307 00:54:06.979786 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 00:54:06.980371 kubelet[2596]: E0307 00:54:06.980327 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 00:54:06.980371 kubelet[2596]: W0307 00:54:06.980340 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 00:54:06.980371 kubelet[2596]: E0307 00:54:06.980351 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 7 00:54:07.016005 containerd[1477]: time="2026-03-07T00:54:07.015810570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dfd975ddc-qjwpx,Uid:c98f70c7-fe48-4d2a-8b8f-edccffa02027,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:07.034878 kubelet[2596]: E0307 00:54:07.034781 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.034878 kubelet[2596]: W0307 00:54:07.034805 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.034878 kubelet[2596]: E0307 00:54:07.034826 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.062875 containerd[1477]: time="2026-03-07T00:54:07.062764310Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:07.063076 containerd[1477]: time="2026-03-07T00:54:07.062834656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:07.063076 containerd[1477]: time="2026-03-07T00:54:07.062851182Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:07.063076 containerd[1477]: time="2026-03-07T00:54:07.062942096Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:07.068089 kubelet[2596]: E0307 00:54:07.068067 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.068357 kubelet[2596]: W0307 00:54:07.068223 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.068357 kubelet[2596]: E0307 00:54:07.068248 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.068976 kubelet[2596]: E0307 00:54:07.068769 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.068976 kubelet[2596]: W0307 00:54:07.068790 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.068976 kubelet[2596]: E0307 00:54:07.068832 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.070049 kubelet[2596]: E0307 00:54:07.069919 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.070049 kubelet[2596]: W0307 00:54:07.069933 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.070049 kubelet[2596]: E0307 00:54:07.069957 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.070753 kubelet[2596]: E0307 00:54:07.070562 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.070753 kubelet[2596]: W0307 00:54:07.070575 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.070753 kubelet[2596]: E0307 00:54:07.070595 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.071351 kubelet[2596]: E0307 00:54:07.071336 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.071488 kubelet[2596]: W0307 00:54:07.071428 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.071488 kubelet[2596]: E0307 00:54:07.071445 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.072001 kubelet[2596]: E0307 00:54:07.071986 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.072107 kubelet[2596]: W0307 00:54:07.072095 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.072188 kubelet[2596]: E0307 00:54:07.072176 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.072938 kubelet[2596]: E0307 00:54:07.072724 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.072938 kubelet[2596]: W0307 00:54:07.072739 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.072938 kubelet[2596]: E0307 00:54:07.072767 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.073562 kubelet[2596]: E0307 00:54:07.073154 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.073562 kubelet[2596]: W0307 00:54:07.073177 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.073811 kubelet[2596]: E0307 00:54:07.073189 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.073972 kubelet[2596]: E0307 00:54:07.073960 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.074050 kubelet[2596]: W0307 00:54:07.074038 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.074133 kubelet[2596]: E0307 00:54:07.074121 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.074384 kubelet[2596]: E0307 00:54:07.074372 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.074516 kubelet[2596]: W0307 00:54:07.074500 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.074592 kubelet[2596]: E0307 00:54:07.074581 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.075890 kubelet[2596]: E0307 00:54:07.075768 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.075890 kubelet[2596]: W0307 00:54:07.075783 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.075890 kubelet[2596]: E0307 00:54:07.075794 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.076188 kubelet[2596]: E0307 00:54:07.076065 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.076188 kubelet[2596]: W0307 00:54:07.076076 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.076188 kubelet[2596]: E0307 00:54:07.076085 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.076480 kubelet[2596]: E0307 00:54:07.076327 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.076480 kubelet[2596]: W0307 00:54:07.076338 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.076480 kubelet[2596]: E0307 00:54:07.076348 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.076816 kubelet[2596]: E0307 00:54:07.076645 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.076816 kubelet[2596]: W0307 00:54:07.076657 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.076816 kubelet[2596]: E0307 00:54:07.076667 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.077085 kubelet[2596]: E0307 00:54:07.077074 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.077184 kubelet[2596]: W0307 00:54:07.077132 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.077184 kubelet[2596]: E0307 00:54:07.077147 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.077752 kubelet[2596]: E0307 00:54:07.077739 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.077885 kubelet[2596]: W0307 00:54:07.077807 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.077885 kubelet[2596]: E0307 00:54:07.077821 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.078190 kubelet[2596]: E0307 00:54:07.078135 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.078190 kubelet[2596]: W0307 00:54:07.078146 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.078190 kubelet[2596]: E0307 00:54:07.078156 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.078878 kubelet[2596]: E0307 00:54:07.078780 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.078878 kubelet[2596]: W0307 00:54:07.078792 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.078878 kubelet[2596]: E0307 00:54:07.078801 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.079192 kubelet[2596]: E0307 00:54:07.079136 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.079192 kubelet[2596]: W0307 00:54:07.079147 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.079192 kubelet[2596]: E0307 00:54:07.079157 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.079742 kubelet[2596]: E0307 00:54:07.079460 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.079742 kubelet[2596]: W0307 00:54:07.079472 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.079742 kubelet[2596]: E0307 00:54:07.079482 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.080038 kubelet[2596]: E0307 00:54:07.079904 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.080038 kubelet[2596]: W0307 00:54:07.079916 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.080038 kubelet[2596]: E0307 00:54:07.079926 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.080038 kubelet[2596]: I0307 00:54:07.079951 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16bf6d64-1b62-4d2e-b193-67ace2ec0676-registration-dir\") pod \"csi-node-driver-2zr4s\" (UID: \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\") " pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:07.080334 kubelet[2596]: E0307 00:54:07.080216 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.080334 kubelet[2596]: W0307 00:54:07.080229 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.080334 kubelet[2596]: E0307 00:54:07.080259 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.080334 kubelet[2596]: I0307 00:54:07.080283 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2m4z\" (UniqueName: \"kubernetes.io/projected/16bf6d64-1b62-4d2e-b193-67ace2ec0676-kube-api-access-r2m4z\") pod \"csi-node-driver-2zr4s\" (UID: \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\") " pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:07.080865 kubelet[2596]: E0307 00:54:07.080755 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.080865 kubelet[2596]: W0307 00:54:07.080769 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.080865 kubelet[2596]: E0307 00:54:07.080780 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.080865 kubelet[2596]: I0307 00:54:07.080804 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16bf6d64-1b62-4d2e-b193-67ace2ec0676-socket-dir\") pod \"csi-node-driver-2zr4s\" (UID: \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\") " pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:07.081473 kubelet[2596]: E0307 00:54:07.081354 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.081473 kubelet[2596]: W0307 00:54:07.081367 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.081473 kubelet[2596]: E0307 00:54:07.081378 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.081834 kubelet[2596]: I0307 00:54:07.081399 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16bf6d64-1b62-4d2e-b193-67ace2ec0676-kubelet-dir\") pod \"csi-node-driver-2zr4s\" (UID: \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\") " pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:07.081989 kubelet[2596]: E0307 00:54:07.081907 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.081989 kubelet[2596]: W0307 00:54:07.081917 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.081989 kubelet[2596]: E0307 00:54:07.081937 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.082470 kubelet[2596]: E0307 00:54:07.082457 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.082561 kubelet[2596]: W0307 00:54:07.082530 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.082561 kubelet[2596]: E0307 00:54:07.082543 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.082952 kubelet[2596]: E0307 00:54:07.082920 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.082952 kubelet[2596]: W0307 00:54:07.082932 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.082952 kubelet[2596]: E0307 00:54:07.082941 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.083559 kubelet[2596]: E0307 00:54:07.083399 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.083559 kubelet[2596]: W0307 00:54:07.083411 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.083559 kubelet[2596]: E0307 00:54:07.083421 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.083559 kubelet[2596]: I0307 00:54:07.083446 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/16bf6d64-1b62-4d2e-b193-67ace2ec0676-varrun\") pod \"csi-node-driver-2zr4s\" (UID: \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\") " pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:07.084645 kubelet[2596]: E0307 00:54:07.084196 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.084645 kubelet[2596]: W0307 00:54:07.084210 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.084645 kubelet[2596]: E0307 00:54:07.084227 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.085569 kubelet[2596]: E0307 00:54:07.085365 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.085569 kubelet[2596]: W0307 00:54:07.085379 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.085569 kubelet[2596]: E0307 00:54:07.085390 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.087095 kubelet[2596]: E0307 00:54:07.086757 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.087095 kubelet[2596]: W0307 00:54:07.086770 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.087095 kubelet[2596]: E0307 00:54:07.086781 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.087797 kubelet[2596]: E0307 00:54:07.087542 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.087797 kubelet[2596]: W0307 00:54:07.087555 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.087797 kubelet[2596]: E0307 00:54:07.087565 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.089747 kubelet[2596]: E0307 00:54:07.088876 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.089747 kubelet[2596]: W0307 00:54:07.088889 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.089747 kubelet[2596]: E0307 00:54:07.088903 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.090737 kubelet[2596]: E0307 00:54:07.090138 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.090737 kubelet[2596]: W0307 00:54:07.090152 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.090737 kubelet[2596]: E0307 00:54:07.090163 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.091361 kubelet[2596]: E0307 00:54:07.090911 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.091361 kubelet[2596]: W0307 00:54:07.090933 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.091361 kubelet[2596]: E0307 00:54:07.090946 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.095927 systemd[1]: Started cri-containerd-4d595f5b86839c2c1ca845e1f081a9843445898b19da01ed8a4c3fb9f8ebced1.scope - libcontainer container 4d595f5b86839c2c1ca845e1f081a9843445898b19da01ed8a4c3fb9f8ebced1. Mar 7 00:54:07.141709 containerd[1477]: time="2026-03-07T00:54:07.141205305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dfd975ddc-qjwpx,Uid:c98f70c7-fe48-4d2a-8b8f-edccffa02027,Namespace:calico-system,Attempt:0,} returns sandbox id \"4d595f5b86839c2c1ca845e1f081a9843445898b19da01ed8a4c3fb9f8ebced1\"" Mar 7 00:54:07.143392 containerd[1477]: time="2026-03-07T00:54:07.143348829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 7 00:54:07.187090 kubelet[2596]: E0307 00:54:07.186861 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.187090 kubelet[2596]: W0307 00:54:07.186890 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.187090 kubelet[2596]: E0307 00:54:07.186929 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating 
Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.187632 kubelet[2596]: E0307 00:54:07.187496 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.187632 kubelet[2596]: W0307 00:54:07.187535 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.187632 kubelet[2596]: E0307 00:54:07.187568 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.187982 kubelet[2596]: E0307 00:54:07.187888 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.187982 kubelet[2596]: W0307 00:54:07.187910 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.187982 kubelet[2596]: E0307 00:54:07.187928 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:07.188416 kubelet[2596]: E0307 00:54:07.188391 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.188416 kubelet[2596]: W0307 00:54:07.188406 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.188416 kubelet[2596]: E0307 00:54:07.188418 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:07.188636 kubelet[2596]: E0307 00:54:07.188608 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:07.188636 kubelet[2596]: W0307 00:54:07.188624 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:07.188636 kubelet[2596]: E0307 00:54:07.188635 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 00:54:07.207100 containerd[1477]: time="2026-03-07T00:54:07.207071461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cptjl,Uid:686e27f3-19b0-4a26-8415-55ead591442f,Namespace:calico-system,Attempt:0,}"
Mar 7 00:54:07.210703 kubelet[2596]: E0307 00:54:07.210638 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 00:54:07.210785 kubelet[2596]: W0307 00:54:07.210672 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 00:54:07.210785 kubelet[2596]: E0307 00:54:07.210762 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 00:54:07.233953 containerd[1477]: time="2026-03-07T00:54:07.233530790Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 00:54:07.233953 containerd[1477]: time="2026-03-07T00:54:07.233627866Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 00:54:07.233953 containerd[1477]: time="2026-03-07T00:54:07.233654476Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:54:07.233953 containerd[1477]: time="2026-03-07T00:54:07.233891725Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 00:54:07.254868 systemd[1]: Started cri-containerd-806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4.scope - libcontainer container 806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4.
Mar 7 00:54:07.282985 containerd[1477]: time="2026-03-07T00:54:07.282191810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cptjl,Uid:686e27f3-19b0-4a26-8415-55ead591442f,Namespace:calico-system,Attempt:0,} returns sandbox id \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\""
Mar 7 00:54:08.515958 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3142210759.mount: Deactivated successfully.
Mar 7 00:54:08.936028 kubelet[2596]: E0307 00:54:08.934493 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676"
Mar 7 00:54:08.963763 containerd[1477]: time="2026-03-07T00:54:08.963714954Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:08.965444 containerd[1477]: time="2026-03-07T00:54:08.965020185Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174"
Mar 7 00:54:08.966926 containerd[1477]: time="2026-03-07T00:54:08.966876696Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:08.970061 containerd[1477]: time="2026-03-07T00:54:08.970001744Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 00:54:08.971182 containerd[1477]: time="2026-03-07T00:54:08.971036838Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.827655717s"
Mar 7 00:54:08.971182 containerd[1477]: time="2026-03-07T00:54:08.971079614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\""
Mar 7 00:54:08.973603 containerd[1477]: time="2026-03-07T00:54:08.973444308Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 7 00:54:08.988977 containerd[1477]: time="2026-03-07T00:54:08.988935663Z" level=info msg="CreateContainer within sandbox \"4d595f5b86839c2c1ca845e1f081a9843445898b19da01ed8a4c3fb9f8ebced1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 7 00:54:09.016355 containerd[1477]: time="2026-03-07T00:54:09.016203708Z" level=info msg="CreateContainer within sandbox \"4d595f5b86839c2c1ca845e1f081a9843445898b19da01ed8a4c3fb9f8ebced1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5893993a9d056b40dca661e6aaf440ba3dd427dac19d18761327222d853cc300\""
Mar 7 00:54:09.018181 containerd[1477]: time="2026-03-07T00:54:09.017941673Z" level=info msg="StartContainer for \"5893993a9d056b40dca661e6aaf440ba3dd427dac19d18761327222d853cc300\""
Mar 7 00:54:09.056900 systemd[1]: Started cri-containerd-5893993a9d056b40dca661e6aaf440ba3dd427dac19d18761327222d853cc300.scope - libcontainer container 5893993a9d056b40dca661e6aaf440ba3dd427dac19d18761327222d853cc300.
Mar 7 00:54:09.093360 containerd[1477]: time="2026-03-07T00:54:09.093270851Z" level=info msg="StartContainer for \"5893993a9d056b40dca661e6aaf440ba3dd427dac19d18761327222d853cc300\" returns successfully"
Mar 7 00:54:10.097361 kubelet[2596]: E0307 00:54:10.097215 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 00:54:10.097361 kubelet[2596]: W0307 00:54:10.097240 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 00:54:10.097361 kubelet[2596]: E0307 00:54:10.097278 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 00:54:10.098998 kubelet[2596]: E0307 00:54:10.098768 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 00:54:10.098998 kubelet[2596]: W0307 00:54:10.098789 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 00:54:10.098998 kubelet[2596]: E0307 00:54:10.098808 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:10.132125 kubelet[2596]: E0307 00:54:10.132105 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.132125 kubelet[2596]: W0307 00:54:10.132118 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.132179 kubelet[2596]: E0307 00:54:10.132132 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:10.132289 kubelet[2596]: E0307 00:54:10.132266 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.132289 kubelet[2596]: W0307 00:54:10.132278 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.132358 kubelet[2596]: E0307 00:54:10.132291 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:10.133886 kubelet[2596]: E0307 00:54:10.133855 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.133953 kubelet[2596]: W0307 00:54:10.133880 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.133953 kubelet[2596]: E0307 00:54:10.133906 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:10.134441 kubelet[2596]: E0307 00:54:10.134419 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.134441 kubelet[2596]: W0307 00:54:10.134436 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.134518 kubelet[2596]: E0307 00:54:10.134448 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:10.135357 kubelet[2596]: E0307 00:54:10.135329 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.135357 kubelet[2596]: W0307 00:54:10.135347 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.135442 kubelet[2596]: E0307 00:54:10.135366 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:10.135604 kubelet[2596]: E0307 00:54:10.135585 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.135604 kubelet[2596]: W0307 00:54:10.135599 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.136740 kubelet[2596]: E0307 00:54:10.135610 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 00:54:10.138290 kubelet[2596]: E0307 00:54:10.138267 2596 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 00:54:10.138290 kubelet[2596]: W0307 00:54:10.138284 2596 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 00:54:10.138370 kubelet[2596]: E0307 00:54:10.138296 2596 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 00:54:10.497664 containerd[1477]: time="2026-03-07T00:54:10.496352567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:10.498856 containerd[1477]: time="2026-03-07T00:54:10.498427424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 7 00:54:10.499818 containerd[1477]: time="2026-03-07T00:54:10.499768914Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:10.502949 containerd[1477]: time="2026-03-07T00:54:10.502898044Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:10.503895 containerd[1477]: time="2026-03-07T00:54:10.503752811Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.530226834s" Mar 7 00:54:10.503895 containerd[1477]: time="2026-03-07T00:54:10.503785382Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 7 00:54:10.509442 containerd[1477]: time="2026-03-07T00:54:10.509389704Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 7 00:54:10.527005 containerd[1477]: time="2026-03-07T00:54:10.526884256Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8\"" Mar 7 00:54:10.528016 containerd[1477]: time="2026-03-07T00:54:10.527862945Z" level=info msg="StartContainer for \"28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8\"" Mar 7 00:54:10.565876 systemd[1]: Started cri-containerd-28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8.scope - libcontainer container 28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8. Mar 7 00:54:10.602618 containerd[1477]: time="2026-03-07T00:54:10.602573625Z" level=info msg="StartContainer for \"28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8\" returns successfully" Mar 7 00:54:10.619110 systemd[1]: cri-containerd-28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8.scope: Deactivated successfully. 
Mar 7 00:54:10.642988 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8-rootfs.mount: Deactivated successfully. Mar 7 00:54:10.738175 containerd[1477]: time="2026-03-07T00:54:10.738058186Z" level=info msg="shim disconnected" id=28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8 namespace=k8s.io Mar 7 00:54:10.738175 containerd[1477]: time="2026-03-07T00:54:10.738148657Z" level=warning msg="cleaning up after shim disconnected" id=28d5c57e7f6b8791133a17ba2b3d925d164ddfce680faa358a4d50cea821bda8 namespace=k8s.io Mar 7 00:54:10.738175 containerd[1477]: time="2026-03-07T00:54:10.738169343Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:54:10.938594 kubelet[2596]: E0307 00:54:10.938539 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:11.068955 kubelet[2596]: I0307 00:54:11.068926 2596 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:54:11.071350 containerd[1477]: time="2026-03-07T00:54:11.070627378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 7 00:54:11.101033 kubelet[2596]: I0307 00:54:11.100537 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5dfd975ddc-qjwpx" podStartSLOduration=3.2708289649999998 podStartE2EDuration="5.100522788s" podCreationTimestamp="2026-03-07 00:54:06 +0000 UTC" firstStartedPulling="2026-03-07 00:54:07.142662852 +0000 UTC m=+22.343436495" lastFinishedPulling="2026-03-07 00:54:08.972356635 +0000 UTC m=+24.173130318" observedRunningTime="2026-03-07 00:54:10.090886575 +0000 UTC m=+25.291660218" watchObservedRunningTime="2026-03-07 
00:54:11.100522788 +0000 UTC m=+26.301296471" Mar 7 00:54:12.667425 kubelet[2596]: I0307 00:54:12.665896 2596 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:54:12.937051 kubelet[2596]: E0307 00:54:12.935151 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:14.936537 kubelet[2596]: E0307 00:54:14.936471 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:16.935024 kubelet[2596]: E0307 00:54:16.934094 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:17.256072 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1807332578.mount: Deactivated successfully. 
Mar 7 00:54:17.287759 containerd[1477]: time="2026-03-07T00:54:17.287635698Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:17.290257 containerd[1477]: time="2026-03-07T00:54:17.289994571Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 7 00:54:17.291925 containerd[1477]: time="2026-03-07T00:54:17.291850509Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:17.295728 containerd[1477]: time="2026-03-07T00:54:17.295162318Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:17.297253 containerd[1477]: time="2026-03-07T00:54:17.296666561Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 6.22599469s" Mar 7 00:54:17.297253 containerd[1477]: time="2026-03-07T00:54:17.296752304Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 7 00:54:17.301229 containerd[1477]: time="2026-03-07T00:54:17.301199458Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 7 00:54:17.317915 containerd[1477]: time="2026-03-07T00:54:17.317875093Z" level=info msg="CreateContainer 
within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28\"" Mar 7 00:54:17.319266 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2309016902.mount: Deactivated successfully. Mar 7 00:54:17.322359 containerd[1477]: time="2026-03-07T00:54:17.320885741Z" level=info msg="StartContainer for \"d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28\"" Mar 7 00:54:17.360160 systemd[1]: Started cri-containerd-d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28.scope - libcontainer container d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28. Mar 7 00:54:17.386504 containerd[1477]: time="2026-03-07T00:54:17.386389802Z" level=info msg="StartContainer for \"d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28\" returns successfully" Mar 7 00:54:17.504949 systemd[1]: cri-containerd-d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28.scope: Deactivated successfully. Mar 7 00:54:17.794031 containerd[1477]: time="2026-03-07T00:54:17.793957105Z" level=info msg="shim disconnected" id=d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28 namespace=k8s.io Mar 7 00:54:17.794031 containerd[1477]: time="2026-03-07T00:54:17.794023243Z" level=warning msg="cleaning up after shim disconnected" id=d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28 namespace=k8s.io Mar 7 00:54:17.794031 containerd[1477]: time="2026-03-07T00:54:17.794035086Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:54:18.093202 containerd[1477]: time="2026-03-07T00:54:18.092906659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 7 00:54:18.258394 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d535044d62513b08a5e4c4370c608ee480152b14859b2bb463c142a3d09e6f28-rootfs.mount: Deactivated successfully. 
Mar 7 00:54:18.935708 kubelet[2596]: E0307 00:54:18.934393 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:20.934526 kubelet[2596]: E0307 00:54:20.934459 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:21.913421 containerd[1477]: time="2026-03-07T00:54:21.913363417Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:21.915353 containerd[1477]: time="2026-03-07T00:54:21.915305366Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 7 00:54:21.915890 containerd[1477]: time="2026-03-07T00:54:21.915802887Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:21.919337 containerd[1477]: time="2026-03-07T00:54:21.918978974Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:21.919869 containerd[1477]: time="2026-03-07T00:54:21.919834380Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 3.826884631s" Mar 7 00:54:21.919869 containerd[1477]: time="2026-03-07T00:54:21.919867068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 7 00:54:21.925123 containerd[1477]: time="2026-03-07T00:54:21.925089170Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 7 00:54:21.948712 containerd[1477]: time="2026-03-07T00:54:21.948632617Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29\"" Mar 7 00:54:21.951207 containerd[1477]: time="2026-03-07T00:54:21.949419727Z" level=info msg="StartContainer for \"b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29\"" Mar 7 00:54:21.985611 systemd[1]: run-containerd-runc-k8s.io-b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29-runc.4jM8fz.mount: Deactivated successfully. Mar 7 00:54:21.992979 systemd[1]: Started cri-containerd-b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29.scope - libcontainer container b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29. 
Mar 7 00:54:22.053379 containerd[1477]: time="2026-03-07T00:54:22.053260232Z" level=info msg="StartContainer for \"b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29\" returns successfully" Mar 7 00:54:22.655800 containerd[1477]: time="2026-03-07T00:54:22.655736819Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 7 00:54:22.658739 systemd[1]: cri-containerd-b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29.scope: Deactivated successfully. Mar 7 00:54:22.678845 kubelet[2596]: I0307 00:54:22.678221 2596 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Mar 7 00:54:22.772714 containerd[1477]: time="2026-03-07T00:54:22.769799523Z" level=info msg="shim disconnected" id=b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29 namespace=k8s.io Mar 7 00:54:22.772714 containerd[1477]: time="2026-03-07T00:54:22.769854016Z" level=warning msg="cleaning up after shim disconnected" id=b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29 namespace=k8s.io Mar 7 00:54:22.772714 containerd[1477]: time="2026-03-07T00:54:22.769863578Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:54:22.780077 systemd[1]: Created slice kubepods-burstable-pod5bc543fa_d95f_4c47_aab4_75443fc7ed6f.slice - libcontainer container kubepods-burstable-pod5bc543fa_d95f_4c47_aab4_75443fc7ed6f.slice. Mar 7 00:54:22.802364 systemd[1]: Created slice kubepods-burstable-poddcb976b7_3486_45dd_bbba_92e1e50edecd.slice - libcontainer container kubepods-burstable-poddcb976b7_3486_45dd_bbba_92e1e50edecd.slice. Mar 7 00:54:22.814200 systemd[1]: Created slice kubepods-besteffort-pod60512900_2075_4aec_8ac2_3b41bbf3cac4.slice - libcontainer container kubepods-besteffort-pod60512900_2075_4aec_8ac2_3b41bbf3cac4.slice. 
Mar 7 00:54:22.818687 kubelet[2596]: I0307 00:54:22.818628 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd57x\" (UniqueName: \"kubernetes.io/projected/5bc543fa-d95f-4c47-aab4-75443fc7ed6f-kube-api-access-sd57x\") pod \"coredns-66bc5c9577-kppgg\" (UID: \"5bc543fa-d95f-4c47-aab4-75443fc7ed6f\") " pod="kube-system/coredns-66bc5c9577-kppgg" Mar 7 00:54:22.819375 kubelet[2596]: I0307 00:54:22.819200 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/0106d835-24d5-479e-aa94-270e56a82826-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-z9vmk\" (UID: \"0106d835-24d5-479e-aa94-270e56a82826\") " pod="calico-system/goldmane-cccfbd5cf-z9vmk" Mar 7 00:54:22.819375 kubelet[2596]: I0307 00:54:22.819239 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-nginx-config\") pod \"whisker-57578867fc-57xjz\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " pod="calico-system/whisker-57578867fc-57xjz" Mar 7 00:54:22.820827 kubelet[2596]: I0307 00:54:22.819258 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-backend-key-pair\") pod \"whisker-57578867fc-57xjz\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " pod="calico-system/whisker-57578867fc-57xjz" Mar 7 00:54:22.820827 kubelet[2596]: I0307 00:54:22.820789 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/92eb6a25-972b-4cc1-a42c-1135d83a0c74-calico-apiserver-certs\") pod \"calico-apiserver-75b65f9ff7-snpd2\" (UID: 
\"92eb6a25-972b-4cc1-a42c-1135d83a0c74\") " pod="calico-system/calico-apiserver-75b65f9ff7-snpd2" Mar 7 00:54:22.821010 kubelet[2596]: I0307 00:54:22.820929 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bc543fa-d95f-4c47-aab4-75443fc7ed6f-config-volume\") pod \"coredns-66bc5c9577-kppgg\" (UID: \"5bc543fa-d95f-4c47-aab4-75443fc7ed6f\") " pod="kube-system/coredns-66bc5c9577-kppgg" Mar 7 00:54:22.821010 kubelet[2596]: I0307 00:54:22.820959 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0106d835-24d5-479e-aa94-270e56a82826-config\") pod \"goldmane-cccfbd5cf-z9vmk\" (UID: \"0106d835-24d5-479e-aa94-270e56a82826\") " pod="calico-system/goldmane-cccfbd5cf-z9vmk" Mar 7 00:54:22.821437 kubelet[2596]: I0307 00:54:22.820992 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcb976b7-3486-45dd-bbba-92e1e50edecd-config-volume\") pod \"coredns-66bc5c9577-r6nrd\" (UID: \"dcb976b7-3486-45dd-bbba-92e1e50edecd\") " pod="kube-system/coredns-66bc5c9577-r6nrd" Mar 7 00:54:22.821594 kubelet[2596]: I0307 00:54:22.821554 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgxc5\" (UniqueName: \"kubernetes.io/projected/dcb976b7-3486-45dd-bbba-92e1e50edecd-kube-api-access-xgxc5\") pod \"coredns-66bc5c9577-r6nrd\" (UID: \"dcb976b7-3486-45dd-bbba-92e1e50edecd\") " pod="kube-system/coredns-66bc5c9577-r6nrd" Mar 7 00:54:22.821669 kubelet[2596]: I0307 00:54:22.821579 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mx9n\" (UniqueName: \"kubernetes.io/projected/5a4b6a6b-223d-4d9c-9933-e41491314c6d-kube-api-access-9mx9n\") pod 
\"calico-kube-controllers-6674bf8c76-nmdd8\" (UID: \"5a4b6a6b-223d-4d9c-9933-e41491314c6d\") " pod="calico-system/calico-kube-controllers-6674bf8c76-nmdd8" Mar 7 00:54:22.823504 kubelet[2596]: I0307 00:54:22.822700 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvchb\" (UniqueName: \"kubernetes.io/projected/92eb6a25-972b-4cc1-a42c-1135d83a0c74-kube-api-access-hvchb\") pod \"calico-apiserver-75b65f9ff7-snpd2\" (UID: \"92eb6a25-972b-4cc1-a42c-1135d83a0c74\") " pod="calico-system/calico-apiserver-75b65f9ff7-snpd2" Mar 7 00:54:22.823504 kubelet[2596]: I0307 00:54:22.822743 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/16c7f73a-a0d9-4851-9016-87e6dc57075b-calico-apiserver-certs\") pod \"calico-apiserver-75b65f9ff7-9wghj\" (UID: \"16c7f73a-a0d9-4851-9016-87e6dc57075b\") " pod="calico-system/calico-apiserver-75b65f9ff7-9wghj" Mar 7 00:54:22.823504 kubelet[2596]: I0307 00:54:22.822767 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbq5c\" (UniqueName: \"kubernetes.io/projected/60512900-2075-4aec-8ac2-3b41bbf3cac4-kube-api-access-qbq5c\") pod \"whisker-57578867fc-57xjz\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " pod="calico-system/whisker-57578867fc-57xjz" Mar 7 00:54:22.823504 kubelet[2596]: I0307 00:54:22.822788 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7v6p\" (UniqueName: \"kubernetes.io/projected/16c7f73a-a0d9-4851-9016-87e6dc57075b-kube-api-access-n7v6p\") pod \"calico-apiserver-75b65f9ff7-9wghj\" (UID: \"16c7f73a-a0d9-4851-9016-87e6dc57075b\") " pod="calico-system/calico-apiserver-75b65f9ff7-9wghj" Mar 7 00:54:22.823504 kubelet[2596]: I0307 00:54:22.822803 2596 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a4b6a6b-223d-4d9c-9933-e41491314c6d-tigera-ca-bundle\") pod \"calico-kube-controllers-6674bf8c76-nmdd8\" (UID: \"5a4b6a6b-223d-4d9c-9933-e41491314c6d\") " pod="calico-system/calico-kube-controllers-6674bf8c76-nmdd8" Mar 7 00:54:22.823669 kubelet[2596]: I0307 00:54:22.822817 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0106d835-24d5-479e-aa94-270e56a82826-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-z9vmk\" (UID: \"0106d835-24d5-479e-aa94-270e56a82826\") " pod="calico-system/goldmane-cccfbd5cf-z9vmk" Mar 7 00:54:22.823669 kubelet[2596]: I0307 00:54:22.822831 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzm2\" (UniqueName: \"kubernetes.io/projected/0106d835-24d5-479e-aa94-270e56a82826-kube-api-access-bbzm2\") pod \"goldmane-cccfbd5cf-z9vmk\" (UID: \"0106d835-24d5-479e-aa94-270e56a82826\") " pod="calico-system/goldmane-cccfbd5cf-z9vmk" Mar 7 00:54:22.823669 kubelet[2596]: I0307 00:54:22.822846 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-ca-bundle\") pod \"whisker-57578867fc-57xjz\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " pod="calico-system/whisker-57578867fc-57xjz" Mar 7 00:54:22.827809 systemd[1]: Created slice kubepods-besteffort-pod5a4b6a6b_223d_4d9c_9933_e41491314c6d.slice - libcontainer container kubepods-besteffort-pod5a4b6a6b_223d_4d9c_9933_e41491314c6d.slice. Mar 7 00:54:22.837871 systemd[1]: Created slice kubepods-besteffort-pod0106d835_24d5_479e_aa94_270e56a82826.slice - libcontainer container kubepods-besteffort-pod0106d835_24d5_479e_aa94_270e56a82826.slice. 
Mar 7 00:54:22.848091 systemd[1]: Created slice kubepods-besteffort-pod16c7f73a_a0d9_4851_9016_87e6dc57075b.slice - libcontainer container kubepods-besteffort-pod16c7f73a_a0d9_4851_9016_87e6dc57075b.slice. Mar 7 00:54:22.854500 systemd[1]: Created slice kubepods-besteffort-pod92eb6a25_972b_4cc1_a42c_1135d83a0c74.slice - libcontainer container kubepods-besteffort-pod92eb6a25_972b_4cc1_a42c_1135d83a0c74.slice. Mar 7 00:54:22.947431 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b2fdb1608caf105daad1714801dc2b61a7964b65f416625f7113c100b77e7e29-rootfs.mount: Deactivated successfully. Mar 7 00:54:22.966271 systemd[1]: Created slice kubepods-besteffort-pod16bf6d64_1b62_4d2e_b193_67ace2ec0676.slice - libcontainer container kubepods-besteffort-pod16bf6d64_1b62_4d2e_b193_67ace2ec0676.slice. Mar 7 00:54:22.980693 containerd[1477]: time="2026-03-07T00:54:22.975900216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2zr4s,Uid:16bf6d64-1b62-4d2e-b193-67ace2ec0676,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:23.079531 containerd[1477]: time="2026-03-07T00:54:23.079466064Z" level=error msg="Failed to destroy network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.079874 containerd[1477]: time="2026-03-07T00:54:23.079831108Z" level=error msg="encountered an error cleaning up failed sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.079938 containerd[1477]: time="2026-03-07T00:54:23.079888642Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-2zr4s,Uid:16bf6d64-1b62-4d2e-b193-67ace2ec0676,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.080282 kubelet[2596]: E0307 00:54:23.080191 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.080335 kubelet[2596]: E0307 00:54:23.080297 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:23.080370 kubelet[2596]: E0307 00:54:23.080320 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2zr4s" Mar 7 00:54:23.080419 kubelet[2596]: E0307 00:54:23.080393 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" 
for \"csi-node-driver-2zr4s_calico-system(16bf6d64-1b62-4d2e-b193-67ace2ec0676)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2zr4s_calico-system(16bf6d64-1b62-4d2e-b193-67ace2ec0676)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:23.105056 containerd[1477]: time="2026-03-07T00:54:23.104994550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-kppgg,Uid:5bc543fa-d95f-4c47-aab4-75443fc7ed6f,Namespace:kube-system,Attempt:0,}" Mar 7 00:54:23.112609 containerd[1477]: time="2026-03-07T00:54:23.112571656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6nrd,Uid:dcb976b7-3486-45dd-bbba-92e1e50edecd,Namespace:kube-system,Attempt:0,}" Mar 7 00:54:23.129528 kubelet[2596]: I0307 00:54:23.128421 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:23.129939 containerd[1477]: time="2026-03-07T00:54:23.129909333Z" level=info msg="StopPodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\"" Mar 7 00:54:23.133834 containerd[1477]: time="2026-03-07T00:54:23.133674001Z" level=info msg="Ensure that sandbox 63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0 in task-service has been cleanup successfully" Mar 7 00:54:23.138105 containerd[1477]: time="2026-03-07T00:54:23.137878570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57578867fc-57xjz,Uid:60512900-2075-4aec-8ac2-3b41bbf3cac4,Namespace:calico-system,Attempt:0,}" Mar 7 
00:54:23.141495 containerd[1477]: time="2026-03-07T00:54:23.141448553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6674bf8c76-nmdd8,Uid:5a4b6a6b-223d-4d9c-9933-e41491314c6d,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:23.155636 containerd[1477]: time="2026-03-07T00:54:23.155323352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-z9vmk,Uid:0106d835-24d5-479e-aa94-270e56a82826,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:23.155636 containerd[1477]: time="2026-03-07T00:54:23.155360481Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 00:54:23.159724 containerd[1477]: time="2026-03-07T00:54:23.159662993Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-9wghj,Uid:16c7f73a-a0d9-4851-9016-87e6dc57075b,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:23.160819 containerd[1477]: time="2026-03-07T00:54:23.160786692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-snpd2,Uid:92eb6a25-972b-4cc1-a42c-1135d83a0c74,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:23.202120 containerd[1477]: time="2026-03-07T00:54:23.201990831Z" level=error msg="StopPodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" failed" error="failed to destroy network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.202670 kubelet[2596]: E0307 00:54:23.202465 2596 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": 
plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:23.202670 kubelet[2596]: E0307 00:54:23.202546 2596 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0"} Mar 7 00:54:23.202670 kubelet[2596]: E0307 00:54:23.202795 2596 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 00:54:23.203326 kubelet[2596]: E0307 00:54:23.203188 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"16bf6d64-1b62-4d2e-b193-67ace2ec0676\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2zr4s" podUID="16bf6d64-1b62-4d2e-b193-67ace2ec0676" Mar 7 00:54:23.239970 containerd[1477]: time="2026-03-07T00:54:23.239917814Z" level=info msg="CreateContainer within sandbox \"806e8d61b6e11145d00db8a14769787e55ed2b6a6a0245abc8c252d3122e30f4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id 
\"33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9\"" Mar 7 00:54:23.242841 containerd[1477]: time="2026-03-07T00:54:23.242808121Z" level=info msg="StartContainer for \"33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9\"" Mar 7 00:54:23.296891 containerd[1477]: time="2026-03-07T00:54:23.296844698Z" level=error msg="Failed to destroy network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.297497 containerd[1477]: time="2026-03-07T00:54:23.297452878Z" level=error msg="encountered an error cleaning up failed sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.297649 containerd[1477]: time="2026-03-07T00:54:23.297595591Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-kppgg,Uid:5bc543fa-d95f-4c47-aab4-75443fc7ed6f,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.298364 kubelet[2596]: E0307 00:54:23.298294 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.298647 kubelet[2596]: E0307 00:54:23.298461 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-kppgg" Mar 7 00:54:23.298647 kubelet[2596]: E0307 00:54:23.298487 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-kppgg" Mar 7 00:54:23.298647 kubelet[2596]: E0307 00:54:23.298549 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-kppgg_kube-system(5bc543fa-d95f-4c47-aab4-75443fc7ed6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-kppgg_kube-system(5bc543fa-d95f-4c47-aab4-75443fc7ed6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-kppgg" podUID="5bc543fa-d95f-4c47-aab4-75443fc7ed6f" Mar 7 00:54:23.352287 systemd[1]: Started cri-containerd-33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9.scope - libcontainer 
container 33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9. Mar 7 00:54:23.406361 containerd[1477]: time="2026-03-07T00:54:23.406193347Z" level=error msg="Failed to destroy network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.407029 containerd[1477]: time="2026-03-07T00:54:23.406881585Z" level=error msg="encountered an error cleaning up failed sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.407029 containerd[1477]: time="2026-03-07T00:54:23.406935758Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6nrd,Uid:dcb976b7-3486-45dd-bbba-92e1e50edecd,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.407029 containerd[1477]: time="2026-03-07T00:54:23.406997372Z" level=error msg="Failed to destroy network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.407939 containerd[1477]: time="2026-03-07T00:54:23.407256592Z" level=error msg="encountered an error cleaning up failed sandbox 
\"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.407939 containerd[1477]: time="2026-03-07T00:54:23.407345932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6674bf8c76-nmdd8,Uid:5a4b6a6b-223d-4d9c-9933-e41491314c6d,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.408090 kubelet[2596]: E0307 00:54:23.407315 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.408090 kubelet[2596]: E0307 00:54:23.407557 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.408090 kubelet[2596]: E0307 00:54:23.407600 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6674bf8c76-nmdd8" Mar 7 00:54:23.408090 kubelet[2596]: E0307 00:54:23.407628 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6674bf8c76-nmdd8" Mar 7 00:54:23.408198 kubelet[2596]: E0307 00:54:23.407691 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6674bf8c76-nmdd8_calico-system(5a4b6a6b-223d-4d9c-9933-e41491314c6d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6674bf8c76-nmdd8_calico-system(5a4b6a6b-223d-4d9c-9933-e41491314c6d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6674bf8c76-nmdd8" podUID="5a4b6a6b-223d-4d9c-9933-e41491314c6d" Mar 7 00:54:23.408449 kubelet[2596]: E0307 00:54:23.407372 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-r6nrd" Mar 7 00:54:23.408449 kubelet[2596]: E0307 00:54:23.408400 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-r6nrd" Mar 7 00:54:23.408812 kubelet[2596]: E0307 00:54:23.408756 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-r6nrd_kube-system(dcb976b7-3486-45dd-bbba-92e1e50edecd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-r6nrd_kube-system(dcb976b7-3486-45dd-bbba-92e1e50edecd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-r6nrd" podUID="dcb976b7-3486-45dd-bbba-92e1e50edecd" Mar 7 00:54:23.416844 containerd[1477]: time="2026-03-07T00:54:23.416801952Z" level=error msg="Failed to destroy network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.417323 containerd[1477]: time="2026-03-07T00:54:23.417201845Z" level=error msg="encountered an error cleaning up failed sandbox 
\"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.417323 containerd[1477]: time="2026-03-07T00:54:23.417252216Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-9wghj,Uid:16c7f73a-a0d9-4851-9016-87e6dc57075b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.417920 kubelet[2596]: E0307 00:54:23.417667 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.417920 kubelet[2596]: E0307 00:54:23.417770 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-75b65f9ff7-9wghj" Mar 7 00:54:23.417920 kubelet[2596]: E0307 00:54:23.417813 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-75b65f9ff7-9wghj" Mar 7 00:54:23.418044 kubelet[2596]: E0307 00:54:23.417865 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75b65f9ff7-9wghj_calico-system(16c7f73a-a0d9-4851-9016-87e6dc57075b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75b65f9ff7-9wghj_calico-system(16c7f73a-a0d9-4851-9016-87e6dc57075b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-75b65f9ff7-9wghj" podUID="16c7f73a-a0d9-4851-9016-87e6dc57075b" Mar 7 00:54:23.433238 containerd[1477]: time="2026-03-07T00:54:23.433120114Z" level=info msg="StartContainer for \"33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9\" returns successfully" Mar 7 00:54:23.446299 containerd[1477]: time="2026-03-07T00:54:23.446084983Z" level=error msg="Failed to destroy network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.450729 containerd[1477]: time="2026-03-07T00:54:23.448917796Z" level=error msg="encountered an error cleaning up failed sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.450729 containerd[1477]: time="2026-03-07T00:54:23.448992253Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-snpd2,Uid:92eb6a25-972b-4cc1-a42c-1135d83a0c74,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.451098 kubelet[2596]: E0307 00:54:23.449216 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.451098 kubelet[2596]: E0307 00:54:23.449276 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-75b65f9ff7-snpd2" Mar 7 00:54:23.451098 kubelet[2596]: E0307 00:54:23.449313 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-75b65f9ff7-snpd2" Mar 7 00:54:23.451192 kubelet[2596]: E0307 00:54:23.449362 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75b65f9ff7-snpd2_calico-system(92eb6a25-972b-4cc1-a42c-1135d83a0c74)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75b65f9ff7-snpd2_calico-system(92eb6a25-972b-4cc1-a42c-1135d83a0c74)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-75b65f9ff7-snpd2" podUID="92eb6a25-972b-4cc1-a42c-1135d83a0c74" Mar 7 00:54:23.465919 containerd[1477]: time="2026-03-07T00:54:23.465801408Z" level=error msg="Failed to destroy network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.467936 containerd[1477]: time="2026-03-07T00:54:23.466765071Z" level=error msg="encountered an error cleaning up failed sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.467936 containerd[1477]: time="2026-03-07T00:54:23.466843009Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-cccfbd5cf-z9vmk,Uid:0106d835-24d5-479e-aa94-270e56a82826,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.468095 kubelet[2596]: E0307 00:54:23.467173 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.468095 kubelet[2596]: E0307 00:54:23.467220 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-z9vmk" Mar 7 00:54:23.468095 kubelet[2596]: E0307 00:54:23.467242 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-z9vmk" Mar 7 00:54:23.468181 kubelet[2596]: E0307 00:54:23.467312 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"goldmane-cccfbd5cf-z9vmk_calico-system(0106d835-24d5-479e-aa94-270e56a82826)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-z9vmk_calico-system(0106d835-24d5-479e-aa94-270e56a82826)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-z9vmk" podUID="0106d835-24d5-479e-aa94-270e56a82826" Mar 7 00:54:23.471796 containerd[1477]: time="2026-03-07T00:54:23.471753101Z" level=error msg="Failed to destroy network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.472273 containerd[1477]: time="2026-03-07T00:54:23.472079936Z" level=error msg="encountered an error cleaning up failed sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.472273 containerd[1477]: time="2026-03-07T00:54:23.472124706Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57578867fc-57xjz,Uid:60512900-2075-4aec-8ac2-3b41bbf3cac4,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.472826 kubelet[2596]: E0307 00:54:23.472444 2596 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 00:54:23.472826 kubelet[2596]: E0307 00:54:23.472509 2596 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-57578867fc-57xjz" Mar 7 00:54:23.472826 kubelet[2596]: E0307 00:54:23.472531 2596 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-57578867fc-57xjz" Mar 7 00:54:23.473491 kubelet[2596]: E0307 00:54:23.472587 2596 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-57578867fc-57xjz_calico-system(60512900-2075-4aec-8ac2-3b41bbf3cac4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-57578867fc-57xjz_calico-system(60512900-2075-4aec-8ac2-3b41bbf3cac4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-57578867fc-57xjz" podUID="60512900-2075-4aec-8ac2-3b41bbf3cac4" Mar 7 00:54:24.137409 kubelet[2596]: I0307 00:54:24.136879 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:24.139373 containerd[1477]: time="2026-03-07T00:54:24.137805644Z" level=info msg="StopPodSandbox for \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\"" Mar 7 00:54:24.139373 containerd[1477]: time="2026-03-07T00:54:24.138018492Z" level=info msg="Ensure that sandbox 19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae in task-service has been cleanup successfully" Mar 7 00:54:24.139657 kubelet[2596]: I0307 00:54:24.139381 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:24.140151 containerd[1477]: time="2026-03-07T00:54:24.140022264Z" level=info msg="StopPodSandbox for \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\"" Mar 7 00:54:24.140290 containerd[1477]: time="2026-03-07T00:54:24.140263478Z" level=info msg="Ensure that sandbox 5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23 in task-service has been cleanup successfully" Mar 7 00:54:24.145104 kubelet[2596]: I0307 00:54:24.145027 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:24.147283 containerd[1477]: time="2026-03-07T00:54:24.147253014Z" level=info msg="StopPodSandbox for \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\"" Mar 7 
00:54:24.149458 containerd[1477]: time="2026-03-07T00:54:24.149201374Z" level=info msg="Ensure that sandbox a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b in task-service has been cleanup successfully" Mar 7 00:54:24.154175 kubelet[2596]: I0307 00:54:24.154137 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:24.156051 containerd[1477]: time="2026-03-07T00:54:24.156016151Z" level=info msg="StopPodSandbox for \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\"" Mar 7 00:54:24.156974 containerd[1477]: time="2026-03-07T00:54:24.156197472Z" level=info msg="Ensure that sandbox 196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed in task-service has been cleanup successfully" Mar 7 00:54:24.163394 kubelet[2596]: I0307 00:54:24.163274 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:24.164344 containerd[1477]: time="2026-03-07T00:54:24.164085251Z" level=info msg="StopPodSandbox for \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\"" Mar 7 00:54:24.169948 kubelet[2596]: I0307 00:54:24.169599 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:24.173574 kubelet[2596]: I0307 00:54:24.172845 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cptjl" podStartSLOduration=3.536494211 podStartE2EDuration="18.172829943s" podCreationTimestamp="2026-03-07 00:54:06 +0000 UTC" firstStartedPulling="2026-03-07 00:54:07.28491059 +0000 UTC m=+22.485684273" lastFinishedPulling="2026-03-07 00:54:21.921246322 +0000 UTC m=+37.122020005" observedRunningTime="2026-03-07 00:54:24.168657922 +0000 UTC m=+39.369431605" 
watchObservedRunningTime="2026-03-07 00:54:24.172829943 +0000 UTC m=+39.373603626" Mar 7 00:54:24.176004 containerd[1477]: time="2026-03-07T00:54:24.175751682Z" level=info msg="StopPodSandbox for \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\"" Mar 7 00:54:24.176746 containerd[1477]: time="2026-03-07T00:54:24.176713139Z" level=info msg="Ensure that sandbox f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f in task-service has been cleanup successfully" Mar 7 00:54:24.180770 containerd[1477]: time="2026-03-07T00:54:24.176374622Z" level=info msg="Ensure that sandbox 0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369 in task-service has been cleanup successfully" Mar 7 00:54:24.189810 kubelet[2596]: I0307 00:54:24.189734 2596 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:24.191613 containerd[1477]: time="2026-03-07T00:54:24.191534361Z" level=info msg="StopPodSandbox for \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\"" Mar 7 00:54:24.196478 containerd[1477]: time="2026-03-07T00:54:24.196056141Z" level=info msg="Ensure that sandbox b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e in task-service has been cleanup successfully" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.409 [INFO][3824] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.409 [INFO][3824] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" iface="eth0" netns="/var/run/netns/cni-41e241a5-750b-406a-38d7-5e6d78125b77" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.409 [INFO][3824] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" iface="eth0" netns="/var/run/netns/cni-41e241a5-750b-406a-38d7-5e6d78125b77" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.413 [INFO][3824] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" iface="eth0" netns="/var/run/netns/cni-41e241a5-750b-406a-38d7-5e6d78125b77" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.413 [INFO][3824] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.413 [INFO][3824] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.536 [INFO][3921] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.536 [INFO][3921] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.536 [INFO][3921] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.552 [WARNING][3921] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.552 [INFO][3921] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.555 [INFO][3921] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.576999 containerd[1477]: 2026-03-07 00:54:24.561 [INFO][3824] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:24.579969 containerd[1477]: time="2026-03-07T00:54:24.579791847Z" level=info msg="TearDown network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\" successfully" Mar 7 00:54:24.579969 containerd[1477]: time="2026-03-07T00:54:24.579833136Z" level=info msg="StopPodSandbox for \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\" returns successfully" Mar 7 00:54:24.581822 systemd[1]: run-netns-cni\x2d41e241a5\x2d750b\x2d406a\x2d38d7\x2d5e6d78125b77.mount: Deactivated successfully. 
Mar 7 00:54:24.585164 containerd[1477]: time="2026-03-07T00:54:24.583937862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-snpd2,Uid:92eb6a25-972b-4cc1-a42c-1135d83a0c74,Namespace:calico-system,Attempt:1,}" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.396 [INFO][3885] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.396 [INFO][3885] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" iface="eth0" netns="/var/run/netns/cni-dab20814-8e20-f80e-448e-fd0c6e8dfcd1" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.397 [INFO][3885] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" iface="eth0" netns="/var/run/netns/cni-dab20814-8e20-f80e-448e-fd0c6e8dfcd1" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.397 [INFO][3885] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" iface="eth0" netns="/var/run/netns/cni-dab20814-8e20-f80e-448e-fd0c6e8dfcd1" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.397 [INFO][3885] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.397 [INFO][3885] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.535 [INFO][3918] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.537 [INFO][3918] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.558 [INFO][3918] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.574 [WARNING][3918] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.574 [INFO][3918] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.576 [INFO][3918] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.591162 containerd[1477]: 2026-03-07 00:54:24.587 [INFO][3885] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:24.591539 containerd[1477]: time="2026-03-07T00:54:24.591368458Z" level=info msg="TearDown network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\" successfully" Mar 7 00:54:24.591539 containerd[1477]: time="2026-03-07T00:54:24.591393623Z" level=info msg="StopPodSandbox for \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\" returns successfully" Mar 7 00:54:24.597238 systemd[1]: run-netns-cni\x2ddab20814\x2d8e20\x2df80e\x2d448e\x2dfd0c6e8dfcd1.mount: Deactivated successfully. Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.443 [INFO][3837] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.445 [INFO][3837] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" iface="eth0" netns="/var/run/netns/cni-2ee64f89-e90a-d368-b13c-95e6056dfc08" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.446 [INFO][3837] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" iface="eth0" netns="/var/run/netns/cni-2ee64f89-e90a-d368-b13c-95e6056dfc08" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.446 [INFO][3837] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" iface="eth0" netns="/var/run/netns/cni-2ee64f89-e90a-d368-b13c-95e6056dfc08" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.446 [INFO][3837] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.446 [INFO][3837] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.562 [INFO][3938] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.563 [INFO][3938] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.580 [INFO][3938] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.606 [WARNING][3938] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.607 [INFO][3938] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.612 [INFO][3938] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.619256 containerd[1477]: 2026-03-07 00:54:24.616 [INFO][3837] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:24.621055 containerd[1477]: time="2026-03-07T00:54:24.620330109Z" level=info msg="TearDown network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\" successfully" Mar 7 00:54:24.621055 containerd[1477]: time="2026-03-07T00:54:24.620361516Z" level=info msg="StopPodSandbox for \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\" returns successfully" Mar 7 00:54:24.624861 containerd[1477]: time="2026-03-07T00:54:24.624818122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-9wghj,Uid:16c7f73a-a0d9-4851-9016-87e6dc57075b,Namespace:calico-system,Attempt:1,}" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.413 [INFO][3886] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.413 [INFO][3886] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" iface="eth0" netns="/var/run/netns/cni-1c749fff-788a-cbb6-eef9-20e28b69ab7f" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.413 [INFO][3886] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" iface="eth0" netns="/var/run/netns/cni-1c749fff-788a-cbb6-eef9-20e28b69ab7f" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.423 [INFO][3886] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" iface="eth0" netns="/var/run/netns/cni-1c749fff-788a-cbb6-eef9-20e28b69ab7f" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.423 [INFO][3886] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.423 [INFO][3886] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.569 [INFO][3935] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.571 [INFO][3935] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.612 [INFO][3935] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.631 [WARNING][3935] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.631 [INFO][3935] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.634 [INFO][3935] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.638385 containerd[1477]: 2026-03-07 00:54:24.637 [INFO][3886] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:24.642426 containerd[1477]: time="2026-03-07T00:54:24.642255574Z" level=info msg="TearDown network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\" successfully" Mar 7 00:54:24.642426 containerd[1477]: time="2026-03-07T00:54:24.642289982Z" level=info msg="StopPodSandbox for \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\" returns successfully" Mar 7 00:54:24.646318 containerd[1477]: time="2026-03-07T00:54:24.645971772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6nrd,Uid:dcb976b7-3486-45dd-bbba-92e1e50edecd,Namespace:kube-system,Attempt:1,}" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.404 [INFO][3884] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.404 [INFO][3884] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" iface="eth0" netns="/var/run/netns/cni-72941055-db16-8877-5e93-a050f9d018a1" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.418 [INFO][3884] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" iface="eth0" netns="/var/run/netns/cni-72941055-db16-8877-5e93-a050f9d018a1" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.423 [INFO][3884] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" iface="eth0" netns="/var/run/netns/cni-72941055-db16-8877-5e93-a050f9d018a1" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.423 [INFO][3884] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.423 [INFO][3884] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.581 [INFO][3933] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.585 [INFO][3933] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.634 [INFO][3933] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.651 [WARNING][3933] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.651 [INFO][3933] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.654 [INFO][3933] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.673906 containerd[1477]: 2026-03-07 00:54:24.666 [INFO][3884] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:24.676504 containerd[1477]: time="2026-03-07T00:54:24.676142297Z" level=info msg="TearDown network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\" successfully" Mar 7 00:54:24.676504 containerd[1477]: time="2026-03-07T00:54:24.676186427Z" level=info msg="StopPodSandbox for \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\" returns successfully" Mar 7 00:54:24.681324 containerd[1477]: time="2026-03-07T00:54:24.680116353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6674bf8c76-nmdd8,Uid:5a4b6a6b-223d-4d9c-9933-e41491314c6d,Namespace:calico-system,Attempt:1,}" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.393 [INFO][3847] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.394 [INFO][3847] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" iface="eth0" netns="/var/run/netns/cni-25f2764e-467f-bb7f-549d-acce568d63af" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.395 [INFO][3847] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" iface="eth0" netns="/var/run/netns/cni-25f2764e-467f-bb7f-549d-acce568d63af" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.396 [INFO][3847] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" iface="eth0" netns="/var/run/netns/cni-25f2764e-467f-bb7f-549d-acce568d63af" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.396 [INFO][3847] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.396 [INFO][3847] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.585 [INFO][3916] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.585 [INFO][3916] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.654 [INFO][3916] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.677 [WARNING][3916] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.677 [INFO][3916] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.683 [INFO][3916] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.698918 containerd[1477]: 2026-03-07 00:54:24.688 [INFO][3847] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:24.703993 containerd[1477]: time="2026-03-07T00:54:24.703953409Z" level=info msg="TearDown network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\" successfully" Mar 7 00:54:24.704213 containerd[1477]: time="2026-03-07T00:54:24.704080758Z" level=info msg="StopPodSandbox for \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\" returns successfully" Mar 7 00:54:24.709936 containerd[1477]: time="2026-03-07T00:54:24.709897510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-kppgg,Uid:5bc543fa-d95f-4c47-aab4-75443fc7ed6f,Namespace:kube-system,Attempt:1,}" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.431 [INFO][3848] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.431 [INFO][3848] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" iface="eth0" netns="/var/run/netns/cni-615c20e5-2e7a-32b2-0f50-a561e915b4f9" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.431 [INFO][3848] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" iface="eth0" netns="/var/run/netns/cni-615c20e5-2e7a-32b2-0f50-a561e915b4f9" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.434 [INFO][3848] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" iface="eth0" netns="/var/run/netns/cni-615c20e5-2e7a-32b2-0f50-a561e915b4f9" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.435 [INFO][3848] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.435 [INFO][3848] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.594 [INFO][3934] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.594 [INFO][3934] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.686 [INFO][3934] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.712 [WARNING][3934] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.712 [INFO][3934] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.715 [INFO][3934] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:24.724987 containerd[1477]: 2026-03-07 00:54:24.720 [INFO][3848] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:24.725891 containerd[1477]: time="2026-03-07T00:54:24.725529115Z" level=info msg="TearDown network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\" successfully" Mar 7 00:54:24.726164 containerd[1477]: time="2026-03-07T00:54:24.725780812Z" level=info msg="StopPodSandbox for \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\" returns successfully" Mar 7 00:54:24.732653 containerd[1477]: time="2026-03-07T00:54:24.731497181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-z9vmk,Uid:0106d835-24d5-479e-aa94-270e56a82826,Namespace:calico-system,Attempt:1,}" Mar 7 00:54:24.742673 kubelet[2596]: I0307 00:54:24.742607 2596 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "60512900-2075-4aec-8ac2-3b41bbf3cac4" (UID: "60512900-2075-4aec-8ac2-3b41bbf3cac4"). 
InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:54:24.743828 kubelet[2596]: I0307 00:54:24.743805 2596 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-ca-bundle\") pod \"60512900-2075-4aec-8ac2-3b41bbf3cac4\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " Mar 7 00:54:24.743977 kubelet[2596]: I0307 00:54:24.743963 2596 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-nginx-config\") pod \"60512900-2075-4aec-8ac2-3b41bbf3cac4\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " Mar 7 00:54:24.744055 kubelet[2596]: I0307 00:54:24.744043 2596 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbq5c\" (UniqueName: \"kubernetes.io/projected/60512900-2075-4aec-8ac2-3b41bbf3cac4-kube-api-access-qbq5c\") pod \"60512900-2075-4aec-8ac2-3b41bbf3cac4\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " Mar 7 00:54:24.744144 kubelet[2596]: I0307 00:54:24.744132 2596 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-backend-key-pair\") pod \"60512900-2075-4aec-8ac2-3b41bbf3cac4\" (UID: \"60512900-2075-4aec-8ac2-3b41bbf3cac4\") " Mar 7 00:54:24.744270 kubelet[2596]: I0307 00:54:24.744226 2596 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "60512900-2075-4aec-8ac2-3b41bbf3cac4" (UID: "60512900-2075-4aec-8ac2-3b41bbf3cac4"). InnerVolumeSpecName "nginx-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 00:54:24.744336 kubelet[2596]: I0307 00:54:24.744323 2596 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-ca-bundle\") on node \"ci-4081-3-6-n-f47b87f6f2\" DevicePath \"\"" Mar 7 00:54:24.752175 kubelet[2596]: I0307 00:54:24.752128 2596 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60512900-2075-4aec-8ac2-3b41bbf3cac4-kube-api-access-qbq5c" (OuterVolumeSpecName: "kube-api-access-qbq5c") pod "60512900-2075-4aec-8ac2-3b41bbf3cac4" (UID: "60512900-2075-4aec-8ac2-3b41bbf3cac4"). InnerVolumeSpecName "kube-api-access-qbq5c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 00:54:24.753888 kubelet[2596]: I0307 00:54:24.753849 2596 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "60512900-2075-4aec-8ac2-3b41bbf3cac4" (UID: "60512900-2075-4aec-8ac2-3b41bbf3cac4"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 00:54:24.844822 kubelet[2596]: I0307 00:54:24.844721 2596 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/60512900-2075-4aec-8ac2-3b41bbf3cac4-nginx-config\") on node \"ci-4081-3-6-n-f47b87f6f2\" DevicePath \"\"" Mar 7 00:54:24.845148 kubelet[2596]: I0307 00:54:24.845130 2596 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbq5c\" (UniqueName: \"kubernetes.io/projected/60512900-2075-4aec-8ac2-3b41bbf3cac4-kube-api-access-qbq5c\") on node \"ci-4081-3-6-n-f47b87f6f2\" DevicePath \"\"" Mar 7 00:54:24.845398 kubelet[2596]: I0307 00:54:24.845382 2596 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/60512900-2075-4aec-8ac2-3b41bbf3cac4-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-f47b87f6f2\" DevicePath \"\"" Mar 7 00:54:24.960100 systemd[1]: run-netns-cni\x2d2ee64f89\x2de90a\x2dd368\x2db13c\x2d95e6056dfc08.mount: Deactivated successfully. Mar 7 00:54:24.960466 systemd[1]: run-netns-cni\x2d615c20e5\x2d2e7a\x2d32b2\x2d0f50\x2da561e915b4f9.mount: Deactivated successfully. Mar 7 00:54:24.961526 systemd[1]: run-netns-cni\x2d72941055\x2ddb16\x2d8877\x2d5e93\x2da050f9d018a1.mount: Deactivated successfully. Mar 7 00:54:24.961582 systemd[1]: run-netns-cni\x2d1c749fff\x2d788a\x2dcbb6\x2deef9\x2d20e28b69ab7f.mount: Deactivated successfully. Mar 7 00:54:24.961642 systemd[1]: run-netns-cni\x2d25f2764e\x2d467f\x2dbb7f\x2d549d\x2dacce568d63af.mount: Deactivated successfully. Mar 7 00:54:24.961710 systemd[1]: var-lib-kubelet-pods-60512900\x2d2075\x2d4aec\x2d8ac2\x2d3b41bbf3cac4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqbq5c.mount: Deactivated successfully. 
Mar 7 00:54:24.961766 systemd[1]: var-lib-kubelet-pods-60512900\x2d2075\x2d4aec\x2d8ac2\x2d3b41bbf3cac4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 7 00:54:24.998302 systemd[1]: Removed slice kubepods-besteffort-pod60512900_2075_4aec_8ac2_3b41bbf3cac4.slice - libcontainer container kubepods-besteffort-pod60512900_2075_4aec_8ac2_3b41bbf3cac4.slice. Mar 7 00:54:25.006105 systemd-networkd[1382]: cali25a7fbc2063: Link UP Mar 7 00:54:25.008328 systemd-networkd[1382]: cali25a7fbc2063: Gained carrier Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.696 [ERROR][3977] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.718 [INFO][3977] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0 calico-apiserver-75b65f9ff7- calico-system 16c7f73a-a0d9-4851-9016-87e6dc57075b 885 0 2026-03-07 00:54:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75b65f9ff7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 calico-apiserver-75b65f9ff7-9wghj eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali25a7fbc2063 [] [] }} ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.718 [INFO][3977] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.790 [INFO][3997] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" HandleID="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.814 [INFO][3997] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" HandleID="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f9e90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"calico-apiserver-75b65f9ff7-9wghj", "timestamp":"2026-03-07 00:54:24.790341133 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000338160)} Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.815 [INFO][3997] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.815 [INFO][3997] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.815 [INFO][3997] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.821 [INFO][3997] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.842 [INFO][3997] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.851 [INFO][3997] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.858 [INFO][3997] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.862 [INFO][3997] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.862 [INFO][3997] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.867 [INFO][3997] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.875 [INFO][3997] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.897 [INFO][3997] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.88.193/26] block=192.168.88.192/26 handle="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.897 [INFO][3997] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.193/26] handle="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.045067 containerd[1477]: 2026-03-07 00:54:24.897 [INFO][3997] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:25.046614 containerd[1477]: 2026-03-07 00:54:24.897 [INFO][3997] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.193/26] IPv6=[] ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" HandleID="k8s-pod-network.f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.046614 containerd[1477]: 2026-03-07 00:54:24.923 [INFO][3977] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"16c7f73a-a0d9-4851-9016-87e6dc57075b", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"calico-apiserver-75b65f9ff7-9wghj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali25a7fbc2063", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.046614 containerd[1477]: 2026-03-07 00:54:24.923 [INFO][3977] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.193/32] ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.046614 containerd[1477]: 2026-03-07 00:54:24.923 [INFO][3977] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali25a7fbc2063 ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.046614 containerd[1477]: 2026-03-07 00:54:25.012 [INFO][3977] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" 
WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.046614 containerd[1477]: 2026-03-07 00:54:25.016 [INFO][3977] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"16c7f73a-a0d9-4851-9016-87e6dc57075b", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab", Pod:"calico-apiserver-75b65f9ff7-9wghj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali25a7fbc2063", MAC:"6a:c0:e1:96:ea:9b", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.047570 containerd[1477]: 2026-03-07 00:54:25.037 [INFO][3977] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-9wghj" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:25.101666 systemd-networkd[1382]: cali1d845415f42: Link UP Mar 7 00:54:25.104457 systemd-networkd[1382]: cali1d845415f42: Gained carrier Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.662 [ERROR][3963] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.692 [INFO][3963] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0 calico-apiserver-75b65f9ff7- calico-system 92eb6a25-972b-4cc1-a42c-1135d83a0c74 880 0 2026-03-07 00:54:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75b65f9ff7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 calico-apiserver-75b65f9ff7-snpd2 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1d845415f42 [] [] }} ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 
00:54:24.692 [INFO][3963] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.812 [INFO][3992] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" HandleID="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.844 [INFO][3992] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" HandleID="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000398530), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"calico-apiserver-75b65f9ff7-snpd2", "timestamp":"2026-03-07 00:54:24.812412951 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000267ce0)} Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.844 [INFO][3992] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.897 [INFO][3992] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.897 [INFO][3992] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.922 [INFO][3992] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:24.970 [INFO][3992] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.007 [INFO][3992] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.019 [INFO][3992] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.027 [INFO][3992] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.031 [INFO][3992] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.037 [INFO][3992] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28 Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.053 [INFO][3992] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.064 [INFO][3992] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.88.194/26] block=192.168.88.192/26 handle="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.064 [INFO][3992] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.194/26] handle="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.162708 containerd[1477]: 2026-03-07 00:54:25.064 [INFO][3992] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:25.166555 containerd[1477]: 2026-03-07 00:54:25.064 [INFO][3992] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.194/26] IPv6=[] ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" HandleID="k8s-pod-network.54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.166555 containerd[1477]: 2026-03-07 00:54:25.078 [INFO][3963] cni-plugin/k8s.go 418: Populated endpoint ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"92eb6a25-972b-4cc1-a42c-1135d83a0c74", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"calico-apiserver-75b65f9ff7-snpd2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1d845415f42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.166555 containerd[1477]: 2026-03-07 00:54:25.079 [INFO][3963] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.194/32] ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.166555 containerd[1477]: 2026-03-07 00:54:25.080 [INFO][3963] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1d845415f42 ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.166555 containerd[1477]: 2026-03-07 00:54:25.111 [INFO][3963] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" 
WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.166555 containerd[1477]: 2026-03-07 00:54:25.115 [INFO][3963] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"92eb6a25-972b-4cc1-a42c-1135d83a0c74", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28", Pod:"calico-apiserver-75b65f9ff7-snpd2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1d845415f42", MAC:"5e:42:b4:1d:88:ac", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.166790 containerd[1477]: 2026-03-07 00:54:25.148 [INFO][3963] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28" Namespace="calico-system" Pod="calico-apiserver-75b65f9ff7-snpd2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:25.274760 systemd[1]: run-containerd-runc-k8s.io-33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9-runc.lWGv9z.mount: Deactivated successfully. Mar 7 00:54:25.283629 containerd[1477]: time="2026-03-07T00:54:25.283409295Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:25.285021 containerd[1477]: time="2026-03-07T00:54:25.284868617Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:25.285021 containerd[1477]: time="2026-03-07T00:54:25.284912307Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.285431 containerd[1477]: time="2026-03-07T00:54:25.285390172Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.291767 containerd[1477]: time="2026-03-07T00:54:25.291064626Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:25.291767 containerd[1477]: time="2026-03-07T00:54:25.291126999Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:25.291767 containerd[1477]: time="2026-03-07T00:54:25.291139602Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.291767 containerd[1477]: time="2026-03-07T00:54:25.291224701Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.386792 systemd-networkd[1382]: cali70eecd89192: Link UP Mar 7 00:54:25.387061 systemd-networkd[1382]: cali70eecd89192: Gained carrier Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:24.791 [ERROR][4001] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:24.815 [INFO][4001] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0 coredns-66bc5c9577- kube-system dcb976b7-3486-45dd-bbba-92e1e50edecd 882 0 2026-03-07 00:53:50 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 coredns-66bc5c9577-r6nrd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali70eecd89192 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:24.815 [INFO][4001] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.059 [INFO][4056] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" HandleID="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.115 [INFO][4056] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" HandleID="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400041a110), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"coredns-66bc5c9577-r6nrd", "timestamp":"2026-03-07 00:54:25.059940703 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400051cb00)} Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.115 [INFO][4056] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.119 [INFO][4056] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.120 [INFO][4056] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.134 [INFO][4056] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.183 [INFO][4056] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.223 [INFO][4056] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.234 [INFO][4056] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.256 [INFO][4056] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.257 [INFO][4056] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.275 [INFO][4056] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.306 [INFO][4056] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.373 [INFO][4056] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.88.195/26] block=192.168.88.192/26 handle="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.373 [INFO][4056] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.195/26] handle="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.373 [INFO][4056] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:25.426311 containerd[1477]: 2026-03-07 00:54:25.373 [INFO][4056] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.195/26] IPv6=[] ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" HandleID="k8s-pod-network.e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.426914 containerd[1477]: 2026-03-07 00:54:25.380 [INFO][4001] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dcb976b7-3486-45dd-bbba-92e1e50edecd", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"coredns-66bc5c9577-r6nrd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali70eecd89192", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.426914 containerd[1477]: 2026-03-07 00:54:25.380 [INFO][4001] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.195/32] ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.426914 containerd[1477]: 2026-03-07 00:54:25.380 [INFO][4001] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali70eecd89192 
ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.426914 containerd[1477]: 2026-03-07 00:54:25.382 [INFO][4001] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.427042 containerd[1477]: 2026-03-07 00:54:25.382 [INFO][4001] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dcb976b7-3486-45dd-bbba-92e1e50edecd", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a", 
Pod:"coredns-66bc5c9577-r6nrd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali70eecd89192", MAC:"f6:1c:92:ab:a9:67", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.427042 containerd[1477]: 2026-03-07 00:54:25.414 [INFO][4001] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a" Namespace="kube-system" Pod="coredns-66bc5c9577-r6nrd" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:25.471625 containerd[1477]: time="2026-03-07T00:54:25.471441421Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:25.472270 containerd[1477]: time="2026-03-07T00:54:25.472161820Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:25.472700 containerd[1477]: time="2026-03-07T00:54:25.472644286Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.473899 containerd[1477]: time="2026-03-07T00:54:25.473855754Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.530320 systemd-networkd[1382]: calic29b16583e5: Link UP Mar 7 00:54:25.532993 systemd-networkd[1382]: calic29b16583e5: Gained carrier Mar 7 00:54:25.554310 kubelet[2596]: I0307 00:54:25.554023 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7c4p\" (UniqueName: \"kubernetes.io/projected/9efb815f-e9d0-4544-b894-41d00d6b2a92-kube-api-access-d7c4p\") pod \"whisker-5ddf4c559d-tprh2\" (UID: \"9efb815f-e9d0-4544-b894-41d00d6b2a92\") " pod="calico-system/whisker-5ddf4c559d-tprh2" Mar 7 00:54:25.559589 kubelet[2596]: I0307 00:54:25.558201 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/9efb815f-e9d0-4544-b894-41d00d6b2a92-nginx-config\") pod \"whisker-5ddf4c559d-tprh2\" (UID: \"9efb815f-e9d0-4544-b894-41d00d6b2a92\") " pod="calico-system/whisker-5ddf4c559d-tprh2" Mar 7 00:54:25.559589 kubelet[2596]: I0307 00:54:25.559053 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9efb815f-e9d0-4544-b894-41d00d6b2a92-whisker-backend-key-pair\") pod \"whisker-5ddf4c559d-tprh2\" (UID: \"9efb815f-e9d0-4544-b894-41d00d6b2a92\") " pod="calico-system/whisker-5ddf4c559d-tprh2" Mar 7 00:54:25.559589 kubelet[2596]: I0307 00:54:25.559095 2596 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9efb815f-e9d0-4544-b894-41d00d6b2a92-whisker-ca-bundle\") pod \"whisker-5ddf4c559d-tprh2\" (UID: \"9efb815f-e9d0-4544-b894-41d00d6b2a92\") " pod="calico-system/whisker-5ddf4c559d-tprh2" Mar 7 00:54:25.555990 systemd[1]: Started cri-containerd-54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28.scope - libcontainer container 54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28. Mar 7 00:54:25.567667 systemd[1]: Started cri-containerd-e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a.scope - libcontainer container e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a. Mar 7 00:54:25.570998 systemd[1]: Started cri-containerd-f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab.scope - libcontainer container f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab. Mar 7 00:54:25.579116 systemd[1]: Created slice kubepods-besteffort-pod9efb815f_e9d0_4544_b894_41d00d6b2a92.slice - libcontainer container kubepods-besteffort-pod9efb815f_e9d0_4544_b894_41d00d6b2a92.slice. 
Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:24.898 [ERROR][4008] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:24.926 [INFO][4008] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0 calico-kube-controllers-6674bf8c76- calico-system 5a4b6a6b-223d-4d9c-9933-e41491314c6d 881 0 2026-03-07 00:54:07 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6674bf8c76 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 calico-kube-controllers-6674bf8c76-nmdd8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic29b16583e5 [] [] }} ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:24.926 [INFO][4008] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.325 [INFO][4101] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" 
HandleID="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.392 [INFO][4101] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" HandleID="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004daf0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"calico-kube-controllers-6674bf8c76-nmdd8", "timestamp":"2026-03-07 00:54:25.325866071 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000511340)} Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.405 [INFO][4101] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.405 [INFO][4101] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.405 [INFO][4101] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.429 [INFO][4101] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.473 [INFO][4101] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.490 [INFO][4101] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.495 [INFO][4101] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.499 [INFO][4101] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.499 [INFO][4101] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.505 [INFO][4101] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462 Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.512 [INFO][4101] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.520 [INFO][4101] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.88.196/26] block=192.168.88.192/26 handle="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.520 [INFO][4101] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.196/26] handle="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.585025 containerd[1477]: 2026-03-07 00:54:25.520 [INFO][4101] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:25.585903 containerd[1477]: 2026-03-07 00:54:25.520 [INFO][4101] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.196/26] IPv6=[] ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" HandleID="k8s-pod-network.ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.585903 containerd[1477]: 2026-03-07 00:54:25.524 [INFO][4008] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0", GenerateName:"calico-kube-controllers-6674bf8c76-", Namespace:"calico-system", SelfLink:"", UID:"5a4b6a6b-223d-4d9c-9933-e41491314c6d", ResourceVersion:"881", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6674bf8c76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"calico-kube-controllers-6674bf8c76-nmdd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic29b16583e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.585903 containerd[1477]: 2026-03-07 00:54:25.524 [INFO][4008] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.196/32] ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.585903 containerd[1477]: 2026-03-07 00:54:25.524 [INFO][4008] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic29b16583e5 ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.585903 containerd[1477]: 2026-03-07 00:54:25.533 [INFO][4008] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.585903 containerd[1477]: 2026-03-07 00:54:25.541 [INFO][4008] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0", GenerateName:"calico-kube-controllers-6674bf8c76-", Namespace:"calico-system", SelfLink:"", UID:"5a4b6a6b-223d-4d9c-9933-e41491314c6d", ResourceVersion:"881", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6674bf8c76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462", Pod:"calico-kube-controllers-6674bf8c76-nmdd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.196/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic29b16583e5", MAC:"32:c7:32:cd:38:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.586084 containerd[1477]: 2026-03-07 00:54:25.580 [INFO][4008] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462" Namespace="calico-system" Pod="calico-kube-controllers-6674bf8c76-nmdd8" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:25.626982 containerd[1477]: time="2026-03-07T00:54:25.626716712Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:25.634097 containerd[1477]: time="2026-03-07T00:54:25.632254455Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:25.634097 containerd[1477]: time="2026-03-07T00:54:25.632291544Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.634097 containerd[1477]: time="2026-03-07T00:54:25.632409490Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.649856 systemd-networkd[1382]: cali958d660f60e: Link UP Mar 7 00:54:25.651435 systemd-networkd[1382]: cali958d660f60e: Gained carrier Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:24.901 [ERROR][4022] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.022 [INFO][4022] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0 coredns-66bc5c9577- kube-system 5bc543fa-d95f-4c47-aab4-75443fc7ed6f 878 0 2026-03-07 00:53:50 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 coredns-66bc5c9577-kppgg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali958d660f60e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.026 [INFO][4022] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.409 [INFO][4115] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" HandleID="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.463 [INFO][4115] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" HandleID="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005252f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"coredns-66bc5c9577-kppgg", "timestamp":"2026-03-07 00:54:25.409904991 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001862c0)} Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.463 [INFO][4115] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.523 [INFO][4115] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.523 [INFO][4115] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.537 [INFO][4115] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.569 [INFO][4115] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.591 [INFO][4115] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.594 [INFO][4115] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.599 [INFO][4115] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.599 [INFO][4115] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.602 [INFO][4115] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.613 [INFO][4115] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.628 [INFO][4115] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.88.197/26] block=192.168.88.192/26 handle="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.628 [INFO][4115] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.197/26] handle="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.628 [INFO][4115] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:25.688672 containerd[1477]: 2026-03-07 00:54:25.628 [INFO][4115] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.197/26] IPv6=[] ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" HandleID="k8s-pod-network.37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.692500 containerd[1477]: 2026-03-07 00:54:25.640 [INFO][4022] cni-plugin/k8s.go 418: Populated endpoint ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5bc543fa-d95f-4c47-aab4-75443fc7ed6f", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"coredns-66bc5c9577-kppgg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali958d660f60e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.692500 containerd[1477]: 2026-03-07 00:54:25.641 [INFO][4022] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.197/32] ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.692500 containerd[1477]: 2026-03-07 00:54:25.641 [INFO][4022] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali958d660f60e 
ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.692500 containerd[1477]: 2026-03-07 00:54:25.653 [INFO][4022] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.692631 containerd[1477]: 2026-03-07 00:54:25.656 [INFO][4022] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5bc543fa-d95f-4c47-aab4-75443fc7ed6f", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c", 
Pod:"coredns-66bc5c9577-kppgg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali958d660f60e", MAC:"42:d9:91:86:b6:72", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.692631 containerd[1477]: 2026-03-07 00:54:25.684 [INFO][4022] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c" Namespace="kube-system" Pod="coredns-66bc5c9577-kppgg" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:25.721009 systemd[1]: Started cri-containerd-ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462.scope - libcontainer container ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462. 
Mar 7 00:54:25.759892 containerd[1477]: time="2026-03-07T00:54:25.759394414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6nrd,Uid:dcb976b7-3486-45dd-bbba-92e1e50edecd,Namespace:kube-system,Attempt:1,} returns sandbox id \"e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a\"" Mar 7 00:54:25.774182 containerd[1477]: time="2026-03-07T00:54:25.773422632Z" level=info msg="CreateContainer within sandbox \"e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:54:25.793788 containerd[1477]: time="2026-03-07T00:54:25.793650859Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-9wghj,Uid:16c7f73a-a0d9-4851-9016-87e6dc57075b,Namespace:calico-system,Attempt:1,} returns sandbox id \"f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab\"" Mar 7 00:54:25.795920 systemd-networkd[1382]: cali36729ad4bf6: Link UP Mar 7 00:54:25.799076 systemd-networkd[1382]: cali36729ad4bf6: Gained carrier Mar 7 00:54:25.806056 containerd[1477]: time="2026-03-07T00:54:25.805574852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:54:25.819335 containerd[1477]: time="2026-03-07T00:54:25.818494225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75b65f9ff7-snpd2,Uid:92eb6a25-972b-4cc1-a42c-1135d83a0c74,Namespace:calico-system,Attempt:1,} returns sandbox id \"54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28\"" Mar 7 00:54:25.824023 containerd[1477]: time="2026-03-07T00:54:25.823889377Z" level=info msg="CreateContainer within sandbox \"e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3263d8493de9420fe1916b93948ffc727b1636a18ff8b5059c0fe96ba5ac25cd\"" Mar 7 00:54:25.832833 containerd[1477]: time="2026-03-07T00:54:25.832769498Z" level=info msg="StartContainer for 
\"3263d8493de9420fe1916b93948ffc727b1636a18ff8b5059c0fe96ba5ac25cd\"" Mar 7 00:54:25.858300 containerd[1477]: time="2026-03-07T00:54:25.848237274Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:25.858300 containerd[1477]: time="2026-03-07T00:54:25.848305609Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:25.858300 containerd[1477]: time="2026-03-07T00:54:25.848321293Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.858300 containerd[1477]: time="2026-03-07T00:54:25.848465885Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:24.960 [ERROR][4028] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.033 [INFO][4028] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0 goldmane-cccfbd5cf- calico-system 0106d835-24d5-479e-aa94-270e56a82826 883 0 2026-03-07 00:54:05 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 goldmane-cccfbd5cf-z9vmk eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali36729ad4bf6 [] [] }} ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" 
Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.034 [INFO][4028] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.426 [INFO][4116] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" HandleID="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.472 [INFO][4116] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" HandleID="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f3560), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"goldmane-cccfbd5cf-z9vmk", "timestamp":"2026-03-07 00:54:25.426028792 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400021c000)} Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.472 [INFO][4116] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.629 [INFO][4116] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.629 [INFO][4116] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.638 [INFO][4116] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.676 [INFO][4116] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.703 [INFO][4116] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.710 [INFO][4116] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.717 [INFO][4116] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.718 [INFO][4116] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.724 [INFO][4116] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.738 [INFO][4116] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" 
host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.754 [INFO][4116] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.198/26] block=192.168.88.192/26 handle="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.755 [INFO][4116] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.198/26] handle="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.755 [INFO][4116] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:25.878706 containerd[1477]: 2026-03-07 00:54:25.755 [INFO][4116] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.198/26] IPv6=[] ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" HandleID="k8s-pod-network.5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.879325 containerd[1477]: 2026-03-07 00:54:25.779 [INFO][4028] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0106d835-24d5-479e-aa94-270e56a82826", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"goldmane-cccfbd5cf-z9vmk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali36729ad4bf6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.879325 containerd[1477]: 2026-03-07 00:54:25.779 [INFO][4028] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.198/32] ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.879325 containerd[1477]: 2026-03-07 00:54:25.779 [INFO][4028] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali36729ad4bf6 ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.879325 containerd[1477]: 2026-03-07 00:54:25.808 [INFO][4028] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" 
WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.879325 containerd[1477]: 2026-03-07 00:54:25.811 [INFO][4028] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0106d835-24d5-479e-aa94-270e56a82826", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b", Pod:"goldmane-cccfbd5cf-z9vmk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali36729ad4bf6", MAC:"4e:b5:89:cb:96:7e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:25.879325 
containerd[1477]: 2026-03-07 00:54:25.844 [INFO][4028] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b" Namespace="calico-system" Pod="goldmane-cccfbd5cf-z9vmk" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:25.893671 containerd[1477]: time="2026-03-07T00:54:25.893610494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ddf4c559d-tprh2,Uid:9efb815f-e9d0-4544-b894-41d00d6b2a92,Namespace:calico-system,Attempt:0,}" Mar 7 00:54:25.949464 containerd[1477]: time="2026-03-07T00:54:25.948158661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6674bf8c76-nmdd8,Uid:5a4b6a6b-223d-4d9c-9933-e41491314c6d,Namespace:calico-system,Attempt:1,} returns sandbox id \"ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462\"" Mar 7 00:54:25.979923 systemd[1]: Started cri-containerd-37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c.scope - libcontainer container 37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c. Mar 7 00:54:26.007828 systemd[1]: run-containerd-runc-k8s.io-3263d8493de9420fe1916b93948ffc727b1636a18ff8b5059c0fe96ba5ac25cd-runc.m3PSDo.mount: Deactivated successfully. Mar 7 00:54:26.022918 systemd[1]: Started cri-containerd-3263d8493de9420fe1916b93948ffc727b1636a18ff8b5059c0fe96ba5ac25cd.scope - libcontainer container 3263d8493de9420fe1916b93948ffc727b1636a18ff8b5059c0fe96ba5ac25cd. Mar 7 00:54:26.045973 containerd[1477]: time="2026-03-07T00:54:26.045782983Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:26.048147 containerd[1477]: time="2026-03-07T00:54:26.047886958Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:26.048147 containerd[1477]: time="2026-03-07T00:54:26.048037311Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:26.048625 containerd[1477]: time="2026-03-07T00:54:26.048571986Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:26.104738 containerd[1477]: time="2026-03-07T00:54:26.104417394Z" level=info msg="StartContainer for \"3263d8493de9420fe1916b93948ffc727b1636a18ff8b5059c0fe96ba5ac25cd\" returns successfully" Mar 7 00:54:26.108799 containerd[1477]: time="2026-03-07T00:54:26.106918095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-kppgg,Uid:5bc543fa-d95f-4c47-aab4-75443fc7ed6f,Namespace:kube-system,Attempt:1,} returns sandbox id \"37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c\"" Mar 7 00:54:26.116576 containerd[1477]: time="2026-03-07T00:54:26.116412990Z" level=info msg="CreateContainer within sandbox \"37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 00:54:26.127930 systemd[1]: Started cri-containerd-5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b.scope - libcontainer container 5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b. 
Mar 7 00:54:26.176727 containerd[1477]: time="2026-03-07T00:54:26.176592296Z" level=info msg="CreateContainer within sandbox \"37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ade04ccf5d72bc8990a35eafcaa93aa24bfd43ac3765c4cdd397c8dda44086cd\"" Mar 7 00:54:26.178493 containerd[1477]: time="2026-03-07T00:54:26.177938347Z" level=info msg="StartContainer for \"ade04ccf5d72bc8990a35eafcaa93aa24bfd43ac3765c4cdd397c8dda44086cd\"" Mar 7 00:54:26.255906 systemd[1]: Started cri-containerd-ade04ccf5d72bc8990a35eafcaa93aa24bfd43ac3765c4cdd397c8dda44086cd.scope - libcontainer container ade04ccf5d72bc8990a35eafcaa93aa24bfd43ac3765c4cdd397c8dda44086cd. Mar 7 00:54:26.302869 kubelet[2596]: I0307 00:54:26.302803 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-r6nrd" podStartSLOduration=36.302776288 podStartE2EDuration="36.302776288s" podCreationTimestamp="2026-03-07 00:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:54:26.300914885 +0000 UTC m=+41.501688568" watchObservedRunningTime="2026-03-07 00:54:26.302776288 +0000 UTC m=+41.503549971" Mar 7 00:54:26.308090 containerd[1477]: time="2026-03-07T00:54:26.307973933Z" level=info msg="StartContainer for \"ade04ccf5d72bc8990a35eafcaa93aa24bfd43ac3765c4cdd397c8dda44086cd\" returns successfully" Mar 7 00:54:26.367883 systemd-networkd[1382]: calib2c74074558: Link UP Mar 7 00:54:26.369066 systemd-networkd[1382]: calib2c74074558: Gained carrier Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.113 [ERROR][4447] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.162 [INFO][4447] 
cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0 whisker-5ddf4c559d- calico-system 9efb815f-e9d0-4544-b894-41d00d6b2a92 909 0 2026-03-07 00:54:25 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5ddf4c559d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 whisker-5ddf4c559d-tprh2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib2c74074558 [] [] }} ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.162 [INFO][4447] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.257 [INFO][4535] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" HandleID="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.298 [INFO][4535] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" HandleID="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003fe160), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"whisker-5ddf4c559d-tprh2", "timestamp":"2026-03-07 00:54:26.25764492 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000186000)} Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.298 [INFO][4535] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.298 [INFO][4535] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.298 [INFO][4535] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.309 [INFO][4535] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.316 [INFO][4535] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.323 [INFO][4535] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.327 [INFO][4535] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.330 [INFO][4535] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.330 
[INFO][4535] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.333 [INFO][4535] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7 Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.343 [INFO][4535] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.355 [INFO][4535] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.199/26] block=192.168.88.192/26 handle="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.356 [INFO][4535] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.199/26] handle="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.356 [INFO][4535] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 00:54:26.414292 containerd[1477]: 2026-03-07 00:54:26.356 [INFO][4535] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.199/26] IPv6=[] ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" HandleID="k8s-pod-network.758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.416234 containerd[1477]: 2026-03-07 00:54:26.359 [INFO][4447] cni-plugin/k8s.go 418: Populated endpoint ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0", GenerateName:"whisker-5ddf4c559d-", Namespace:"calico-system", SelfLink:"", UID:"9efb815f-e9d0-4544-b894-41d00d6b2a92", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5ddf4c559d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"whisker-5ddf4c559d-tprh2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calib2c74074558", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:26.416234 containerd[1477]: 2026-03-07 00:54:26.360 [INFO][4447] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.199/32] ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.416234 containerd[1477]: 2026-03-07 00:54:26.360 [INFO][4447] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2c74074558 ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.416234 containerd[1477]: 2026-03-07 00:54:26.371 [INFO][4447] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.416234 containerd[1477]: 2026-03-07 00:54:26.371 [INFO][4447] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0", GenerateName:"whisker-5ddf4c559d-", Namespace:"calico-system", SelfLink:"", 
UID:"9efb815f-e9d0-4544-b894-41d00d6b2a92", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5ddf4c559d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7", Pod:"whisker-5ddf4c559d-tprh2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib2c74074558", MAC:"f2:34:e0:db:14:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:26.416234 containerd[1477]: 2026-03-07 00:54:26.403 [INFO][4447] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7" Namespace="calico-system" Pod="whisker-5ddf4c559d-tprh2" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--5ddf4c559d--tprh2-eth0" Mar 7 00:54:26.423072 containerd[1477]: time="2026-03-07T00:54:26.423004751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-z9vmk,Uid:0106d835-24d5-479e-aa94-270e56a82826,Namespace:calico-system,Attempt:1,} returns sandbox id \"5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b\"" Mar 7 00:54:26.448990 containerd[1477]: time="2026-03-07T00:54:26.448789012Z" level=info 
msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:26.448990 containerd[1477]: time="2026-03-07T00:54:26.448850546Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:26.448990 containerd[1477]: time="2026-03-07T00:54:26.448869350Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:26.449546 containerd[1477]: time="2026-03-07T00:54:26.449061071Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:26.472880 systemd[1]: Started cri-containerd-758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7.scope - libcontainer container 758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7. Mar 7 00:54:26.495889 systemd-networkd[1382]: cali1d845415f42: Gained IPv6LL Mar 7 00:54:26.548324 containerd[1477]: time="2026-03-07T00:54:26.548212052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ddf4c559d-tprh2,Uid:9efb815f-e9d0-4544-b894-41d00d6b2a92,Namespace:calico-system,Attempt:0,} returns sandbox id \"758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7\"" Mar 7 00:54:26.663828 kernel: calico-node[4145]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 7 00:54:26.880154 systemd-networkd[1382]: cali36729ad4bf6: Gained IPv6LL Mar 7 00:54:26.938753 kubelet[2596]: I0307 00:54:26.938321 2596 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60512900-2075-4aec-8ac2-3b41bbf3cac4" path="/var/lib/kubelet/pods/60512900-2075-4aec-8ac2-3b41bbf3cac4/volumes" Mar 7 00:54:27.008779 systemd-networkd[1382]: cali25a7fbc2063: Gained IPv6LL Mar 7 00:54:27.124496 systemd-networkd[1382]: vxlan.calico: Link UP Mar 7 00:54:27.124509 systemd-networkd[1382]: 
vxlan.calico: Gained carrier Mar 7 00:54:27.136774 systemd-networkd[1382]: cali70eecd89192: Gained IPv6LL Mar 7 00:54:27.261301 kubelet[2596]: I0307 00:54:27.261229 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-kppgg" podStartSLOduration=37.261203504 podStartE2EDuration="37.261203504s" podCreationTimestamp="2026-03-07 00:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 00:54:27.261049991 +0000 UTC m=+42.461823674" watchObservedRunningTime="2026-03-07 00:54:27.261203504 +0000 UTC m=+42.461977187" Mar 7 00:54:27.265255 systemd-networkd[1382]: calic29b16583e5: Gained IPv6LL Mar 7 00:54:27.519926 systemd-networkd[1382]: cali958d660f60e: Gained IPv6LL Mar 7 00:54:28.096858 systemd-networkd[1382]: calib2c74074558: Gained IPv6LL Mar 7 00:54:28.287990 systemd-networkd[1382]: vxlan.calico: Gained IPv6LL Mar 7 00:54:28.682707 containerd[1477]: time="2026-03-07T00:54:28.682629034Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:28.684703 containerd[1477]: time="2026-03-07T00:54:28.684326028Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 7 00:54:28.684703 containerd[1477]: time="2026-03-07T00:54:28.684593124Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:28.696883 containerd[1477]: time="2026-03-07T00:54:28.696800189Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:28.698841 containerd[1477]: 
time="2026-03-07T00:54:28.698707426Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 2.892944732s" Mar 7 00:54:28.698841 containerd[1477]: time="2026-03-07T00:54:28.698746954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:54:28.700119 containerd[1477]: time="2026-03-07T00:54:28.700074591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 00:54:28.703873 containerd[1477]: time="2026-03-07T00:54:28.703827654Z" level=info msg="CreateContainer within sandbox \"f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:54:28.720939 containerd[1477]: time="2026-03-07T00:54:28.720887530Z" level=info msg="CreateContainer within sandbox \"f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"820a82fa0067d4a8755261f7021253077b3f5ccda1141e4fe5d26db6d92b9a01\"" Mar 7 00:54:28.723801 containerd[1477]: time="2026-03-07T00:54:28.722974485Z" level=info msg="StartContainer for \"820a82fa0067d4a8755261f7021253077b3f5ccda1141e4fe5d26db6d92b9a01\"" Mar 7 00:54:28.760948 systemd[1]: Started cri-containerd-820a82fa0067d4a8755261f7021253077b3f5ccda1141e4fe5d26db6d92b9a01.scope - libcontainer container 820a82fa0067d4a8755261f7021253077b3f5ccda1141e4fe5d26db6d92b9a01. 
Mar 7 00:54:28.805441 containerd[1477]: time="2026-03-07T00:54:28.805292845Z" level=info msg="StartContainer for \"820a82fa0067d4a8755261f7021253077b3f5ccda1141e4fe5d26db6d92b9a01\" returns successfully" Mar 7 00:54:29.097402 containerd[1477]: time="2026-03-07T00:54:29.097326417Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:29.100756 containerd[1477]: time="2026-03-07T00:54:29.099917707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 7 00:54:29.103081 containerd[1477]: time="2026-03-07T00:54:29.103043748Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 402.919147ms" Mar 7 00:54:29.103258 containerd[1477]: time="2026-03-07T00:54:29.103086556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 7 00:54:29.104850 containerd[1477]: time="2026-03-07T00:54:29.104598986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 00:54:29.107620 containerd[1477]: time="2026-03-07T00:54:29.107400480Z" level=info msg="CreateContainer within sandbox \"54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 00:54:29.124338 containerd[1477]: time="2026-03-07T00:54:29.124198801Z" level=info msg="CreateContainer within sandbox \"54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns 
container id \"9eccdcdfb978f4a90475f418dfd9733548412be526990f3c56fa44f718bc7030\"" Mar 7 00:54:29.126387 containerd[1477]: time="2026-03-07T00:54:29.125083542Z" level=info msg="StartContainer for \"9eccdcdfb978f4a90475f418dfd9733548412be526990f3c56fa44f718bc7030\"" Mar 7 00:54:29.172880 systemd[1]: Started cri-containerd-9eccdcdfb978f4a90475f418dfd9733548412be526990f3c56fa44f718bc7030.scope - libcontainer container 9eccdcdfb978f4a90475f418dfd9733548412be526990f3c56fa44f718bc7030. Mar 7 00:54:29.210042 containerd[1477]: time="2026-03-07T00:54:29.210002457Z" level=info msg="StartContainer for \"9eccdcdfb978f4a90475f418dfd9733548412be526990f3c56fa44f718bc7030\" returns successfully" Mar 7 00:54:29.281784 kubelet[2596]: I0307 00:54:29.281712 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-75b65f9ff7-snpd2" podStartSLOduration=21.035999458 podStartE2EDuration="24.281690622s" podCreationTimestamp="2026-03-07 00:54:05 +0000 UTC" firstStartedPulling="2026-03-07 00:54:25.858399398 +0000 UTC m=+41.059173081" lastFinishedPulling="2026-03-07 00:54:29.104090562 +0000 UTC m=+44.304864245" observedRunningTime="2026-03-07 00:54:29.281597442 +0000 UTC m=+44.482371125" watchObservedRunningTime="2026-03-07 00:54:29.281690622 +0000 UTC m=+44.482464305" Mar 7 00:54:30.271200 kubelet[2596]: I0307 00:54:30.271165 2596 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:54:30.271200 kubelet[2596]: I0307 00:54:30.271204 2596 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:54:32.344903 containerd[1477]: time="2026-03-07T00:54:32.344799802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:32.347219 containerd[1477]: time="2026-03-07T00:54:32.346823237Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, 
bytes read=49189955" Mar 7 00:54:32.349626 containerd[1477]: time="2026-03-07T00:54:32.348365058Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:32.359041 containerd[1477]: time="2026-03-07T00:54:32.358445186Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:32.359859 containerd[1477]: time="2026-03-07T00:54:32.359814374Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.255168098s" Mar 7 00:54:32.359934 containerd[1477]: time="2026-03-07T00:54:32.359864743Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 7 00:54:32.362096 containerd[1477]: time="2026-03-07T00:54:32.362059372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 7 00:54:32.382410 containerd[1477]: time="2026-03-07T00:54:32.382358456Z" level=info msg="CreateContainer within sandbox \"ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 00:54:32.398343 containerd[1477]: time="2026-03-07T00:54:32.398289207Z" level=info msg="CreateContainer within sandbox \"ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a\"" Mar 7 00:54:32.403399 containerd[1477]: time="2026-03-07T00:54:32.400238267Z" level=info msg="StartContainer for \"1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a\"" Mar 7 00:54:32.441934 systemd[1]: Started cri-containerd-1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a.scope - libcontainer container 1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a. Mar 7 00:54:32.486740 containerd[1477]: time="2026-03-07T00:54:32.486351003Z" level=info msg="StartContainer for \"1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a\" returns successfully" Mar 7 00:54:33.304729 kubelet[2596]: I0307 00:54:33.304484 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6674bf8c76-nmdd8" podStartSLOduration=19.905952312 podStartE2EDuration="26.30446663s" podCreationTimestamp="2026-03-07 00:54:07 +0000 UTC" firstStartedPulling="2026-03-07 00:54:25.96337002 +0000 UTC m=+41.164143703" lastFinishedPulling="2026-03-07 00:54:32.361884258 +0000 UTC m=+47.562658021" observedRunningTime="2026-03-07 00:54:33.299657344 +0000 UTC m=+48.500431027" watchObservedRunningTime="2026-03-07 00:54:33.30446663 +0000 UTC m=+48.505240393" Mar 7 00:54:33.306516 kubelet[2596]: I0307 00:54:33.304783 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-75b65f9ff7-9wghj" podStartSLOduration=25.408178241 podStartE2EDuration="28.304776489s" podCreationTimestamp="2026-03-07 00:54:05 +0000 UTC" firstStartedPulling="2026-03-07 00:54:25.803065738 +0000 UTC m=+41.003839421" lastFinishedPulling="2026-03-07 00:54:28.699663986 +0000 UTC m=+43.900437669" observedRunningTime="2026-03-07 00:54:29.301328884 +0000 UTC m=+44.502102567" watchObservedRunningTime="2026-03-07 00:54:33.304776489 +0000 UTC m=+48.505550172" Mar 7 00:54:34.140979 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3548174214.mount: Deactivated successfully. Mar 7 00:54:34.489824 containerd[1477]: time="2026-03-07T00:54:34.489242591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:34.490985 containerd[1477]: time="2026-03-07T00:54:34.490930311Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 7 00:54:34.492360 containerd[1477]: time="2026-03-07T00:54:34.492308893Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:34.495451 containerd[1477]: time="2026-03-07T00:54:34.495289419Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:34.496833 containerd[1477]: time="2026-03-07T00:54:34.496604348Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.134508369s" Mar 7 00:54:34.496833 containerd[1477]: time="2026-03-07T00:54:34.496642996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 7 00:54:34.499324 containerd[1477]: time="2026-03-07T00:54:34.499240249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 7 00:54:34.503265 containerd[1477]: time="2026-03-07T00:54:34.503223925Z" level=info 
msg="CreateContainer within sandbox \"5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 7 00:54:34.520296 containerd[1477]: time="2026-03-07T00:54:34.520220952Z" level=info msg="CreateContainer within sandbox \"5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"3b79151afec11a8a2fc4465005c2ee358f482067e5aa2e786e375fdd0c5d3d85\"" Mar 7 00:54:34.524330 containerd[1477]: time="2026-03-07T00:54:34.522730988Z" level=info msg="StartContainer for \"3b79151afec11a8a2fc4465005c2ee358f482067e5aa2e786e375fdd0c5d3d85\"" Mar 7 00:54:34.588044 systemd[1]: Started cri-containerd-3b79151afec11a8a2fc4465005c2ee358f482067e5aa2e786e375fdd0c5d3d85.scope - libcontainer container 3b79151afec11a8a2fc4465005c2ee358f482067e5aa2e786e375fdd0c5d3d85. Mar 7 00:54:34.655266 containerd[1477]: time="2026-03-07T00:54:34.654837869Z" level=info msg="StartContainer for \"3b79151afec11a8a2fc4465005c2ee358f482067e5aa2e786e375fdd0c5d3d85\" returns successfully" Mar 7 00:54:36.164432 containerd[1477]: time="2026-03-07T00:54:36.164338134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:36.166117 containerd[1477]: time="2026-03-07T00:54:36.166058053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 7 00:54:36.167309 containerd[1477]: time="2026-03-07T00:54:36.167268197Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:36.171015 containerd[1477]: time="2026-03-07T00:54:36.170968402Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:36.173427 containerd[1477]: time="2026-03-07T00:54:36.172927764Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.673646627s" Mar 7 00:54:36.173427 containerd[1477]: time="2026-03-07T00:54:36.172964371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 7 00:54:36.179222 containerd[1477]: time="2026-03-07T00:54:36.179174280Z" level=info msg="CreateContainer within sandbox \"758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 7 00:54:36.193085 containerd[1477]: time="2026-03-07T00:54:36.193006121Z" level=info msg="CreateContainer within sandbox \"758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"16badb923410c2bad4df0aeeacbcc042db1dad0323cf4c84241e1fdfbd0d7d32\"" Mar 7 00:54:36.196692 containerd[1477]: time="2026-03-07T00:54:36.195592799Z" level=info msg="StartContainer for \"16badb923410c2bad4df0aeeacbcc042db1dad0323cf4c84241e1fdfbd0d7d32\"" Mar 7 00:54:36.233872 systemd[1]: Started cri-containerd-16badb923410c2bad4df0aeeacbcc042db1dad0323cf4c84241e1fdfbd0d7d32.scope - libcontainer container 16badb923410c2bad4df0aeeacbcc042db1dad0323cf4c84241e1fdfbd0d7d32. 
Mar 7 00:54:36.276539 containerd[1477]: time="2026-03-07T00:54:36.276394515Z" level=info msg="StartContainer for \"16badb923410c2bad4df0aeeacbcc042db1dad0323cf4c84241e1fdfbd0d7d32\" returns successfully" Mar 7 00:54:36.279697 containerd[1477]: time="2026-03-07T00:54:36.279631874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 7 00:54:36.938487 containerd[1477]: time="2026-03-07T00:54:36.937835058Z" level=info msg="StopPodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\"" Mar 7 00:54:37.009217 kubelet[2596]: I0307 00:54:37.008740 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-z9vmk" podStartSLOduration=23.934319852 podStartE2EDuration="32.008722201s" podCreationTimestamp="2026-03-07 00:54:05 +0000 UTC" firstStartedPulling="2026-03-07 00:54:26.424494114 +0000 UTC m=+41.625267797" lastFinishedPulling="2026-03-07 00:54:34.498896463 +0000 UTC m=+49.699670146" observedRunningTime="2026-03-07 00:54:35.311286537 +0000 UTC m=+50.512060220" watchObservedRunningTime="2026-03-07 00:54:37.008722201 +0000 UTC m=+52.209495884" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.010 [INFO][5105] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.011 [INFO][5105] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" iface="eth0" netns="/var/run/netns/cni-3ee5b315-0034-c12c-731d-f77edc4d0837" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.011 [INFO][5105] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" iface="eth0" netns="/var/run/netns/cni-3ee5b315-0034-c12c-731d-f77edc4d0837" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.011 [INFO][5105] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" iface="eth0" netns="/var/run/netns/cni-3ee5b315-0034-c12c-731d-f77edc4d0837" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.011 [INFO][5105] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.011 [INFO][5105] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.036 [INFO][5113] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.036 [INFO][5113] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.037 [INFO][5113] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.052 [WARNING][5113] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.052 [INFO][5113] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.055 [INFO][5113] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:37.059361 containerd[1477]: 2026-03-07 00:54:37.057 [INFO][5105] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:37.062466 containerd[1477]: time="2026-03-07T00:54:37.059565221Z" level=info msg="TearDown network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" successfully" Mar 7 00:54:37.062466 containerd[1477]: time="2026-03-07T00:54:37.059593586Z" level=info msg="StopPodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" returns successfully" Mar 7 00:54:37.063669 systemd[1]: run-netns-cni\x2d3ee5b315\x2d0034\x2dc12c\x2d731d\x2df77edc4d0837.mount: Deactivated successfully. 
Mar 7 00:54:37.067998 containerd[1477]: time="2026-03-07T00:54:37.067803928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2zr4s,Uid:16bf6d64-1b62-4d2e-b193-67ace2ec0676,Namespace:calico-system,Attempt:1,}" Mar 7 00:54:37.228771 systemd-networkd[1382]: calic5d6793ff64: Link UP Mar 7 00:54:37.228935 systemd-networkd[1382]: calic5d6793ff64: Gained carrier Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.132 [INFO][5122] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0 csi-node-driver- calico-system 16bf6d64-1b62-4d2e-b193-67ace2ec0676 1011 0 2026-03-07 00:54:06 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-f47b87f6f2 csi-node-driver-2zr4s eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic5d6793ff64 [] [] }} ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.132 [INFO][5122] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.162 [INFO][5134] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" 
HandleID="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.176 [INFO][5134] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" HandleID="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbe80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-f47b87f6f2", "pod":"csi-node-driver-2zr4s", "timestamp":"2026-03-07 00:54:37.162124862 +0000 UTC"}, Hostname:"ci-4081-3-6-n-f47b87f6f2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001862c0)} Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.176 [INFO][5134] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.177 [INFO][5134] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.177 [INFO][5134] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-f47b87f6f2' Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.181 [INFO][5134] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.187 [INFO][5134] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.196 [INFO][5134] ipam/ipam.go 526: Trying affinity for 192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.199 [INFO][5134] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.202 [INFO][5134] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.202 [INFO][5134] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.204 [INFO][5134] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367 Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.210 [INFO][5134] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.220 [INFO][5134] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.88.200/26] block=192.168.88.192/26 handle="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.220 [INFO][5134] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.200/26] handle="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" host="ci-4081-3-6-n-f47b87f6f2" Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.220 [INFO][5134] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:37.259334 containerd[1477]: 2026-03-07 00:54:37.221 [INFO][5134] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.200/26] IPv6=[] ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" HandleID="k8s-pod-network.42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.261009 containerd[1477]: 2026-03-07 00:54:37.224 [INFO][5122] cni-plugin/k8s.go 418: Populated endpoint ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"16bf6d64-1b62-4d2e-b193-67ace2ec0676", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"", Pod:"csi-node-driver-2zr4s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic5d6793ff64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:37.261009 containerd[1477]: 2026-03-07 00:54:37.225 [INFO][5122] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.200/32] ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.261009 containerd[1477]: 2026-03-07 00:54:37.225 [INFO][5122] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic5d6793ff64 ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.261009 containerd[1477]: 2026-03-07 00:54:37.228 [INFO][5122] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.261009 containerd[1477]: 2026-03-07 
00:54:37.231 [INFO][5122] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"16bf6d64-1b62-4d2e-b193-67ace2ec0676", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367", Pod:"csi-node-driver-2zr4s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic5d6793ff64", MAC:"e2:ce:a4:1d:1e:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:37.261009 containerd[1477]: 2026-03-07 00:54:37.249 
[INFO][5122] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367" Namespace="calico-system" Pod="csi-node-driver-2zr4s" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:37.281788 containerd[1477]: time="2026-03-07T00:54:37.281660568Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 00:54:37.281788 containerd[1477]: time="2026-03-07T00:54:37.281746344Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 00:54:37.282855 containerd[1477]: time="2026-03-07T00:54:37.281806275Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:37.282855 containerd[1477]: time="2026-03-07T00:54:37.281960223Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 00:54:37.313880 systemd[1]: Started cri-containerd-42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367.scope - libcontainer container 42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367. 
Mar 7 00:54:37.352899 containerd[1477]: time="2026-03-07T00:54:37.352826146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2zr4s,Uid:16bf6d64-1b62-4d2e-b193-67ace2ec0676,Namespace:calico-system,Attempt:1,} returns sandbox id \"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367\"" Mar 7 00:54:38.912164 systemd-networkd[1382]: calic5d6793ff64: Gained IPv6LL Mar 7 00:54:43.456632 kubelet[2596]: I0307 00:54:43.455601 2596 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:54:44.329143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2642966696.mount: Deactivated successfully. Mar 7 00:54:44.345845 containerd[1477]: time="2026-03-07T00:54:44.345101914Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:44.346410 containerd[1477]: time="2026-03-07T00:54:44.346380173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 7 00:54:44.347052 containerd[1477]: time="2026-03-07T00:54:44.347024043Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:44.349779 containerd[1477]: time="2026-03-07T00:54:44.349747069Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:44.350643 containerd[1477]: time="2026-03-07T00:54:44.350608617Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 8.070939616s" Mar 7 00:54:44.350836 containerd[1477]: time="2026-03-07T00:54:44.350767164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 7 00:54:44.352019 containerd[1477]: time="2026-03-07T00:54:44.351978411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 7 00:54:44.357596 containerd[1477]: time="2026-03-07T00:54:44.357561727Z" level=info msg="CreateContainer within sandbox \"758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 7 00:54:44.390031 containerd[1477]: time="2026-03-07T00:54:44.389953510Z" level=info msg="CreateContainer within sandbox \"758e2ef80f8eb59d07a2c9509aa361cb289f6af21b4c12174b9dbb5291d545e7\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"86a1c817ac64840fb40061b6c6b75d2c1e703f36360d59755fdf9369f06b0f14\"" Mar 7 00:54:44.392303 containerd[1477]: time="2026-03-07T00:54:44.392021344Z" level=info msg="StartContainer for \"86a1c817ac64840fb40061b6c6b75d2c1e703f36360d59755fdf9369f06b0f14\"" Mar 7 00:54:44.424883 systemd[1]: Started cri-containerd-86a1c817ac64840fb40061b6c6b75d2c1e703f36360d59755fdf9369f06b0f14.scope - libcontainer container 86a1c817ac64840fb40061b6c6b75d2c1e703f36360d59755fdf9369f06b0f14. 
Mar 7 00:54:44.462654 containerd[1477]: time="2026-03-07T00:54:44.462611104Z" level=info msg="StartContainer for \"86a1c817ac64840fb40061b6c6b75d2c1e703f36360d59755fdf9369f06b0f14\" returns successfully" Mar 7 00:54:44.955233 containerd[1477]: time="2026-03-07T00:54:44.955191819Z" level=info msg="StopPodSandbox for \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\"" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:44.999 [WARNING][5274] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0", GenerateName:"calico-kube-controllers-6674bf8c76-", Namespace:"calico-system", SelfLink:"", UID:"5a4b6a6b-223d-4d9c-9933-e41491314c6d", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6674bf8c76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462", Pod:"calico-kube-controllers-6674bf8c76-nmdd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.196/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic29b16583e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.000 [INFO][5274] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.000 [INFO][5274] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" iface="eth0" netns="" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.000 [INFO][5274] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.000 [INFO][5274] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.037 [INFO][5281] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.038 [INFO][5281] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.038 [INFO][5281] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.047 [WARNING][5281] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.047 [INFO][5281] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.050 [INFO][5281] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.057850 containerd[1477]: 2026-03-07 00:54:45.054 [INFO][5274] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.057850 containerd[1477]: time="2026-03-07T00:54:45.057754542Z" level=info msg="TearDown network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\" successfully" Mar 7 00:54:45.057850 containerd[1477]: time="2026-03-07T00:54:45.057779617Z" level=info msg="StopPodSandbox for \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\" returns successfully" Mar 7 00:54:45.059456 containerd[1477]: time="2026-03-07T00:54:45.059299538Z" level=info msg="RemovePodSandbox for \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\"" Mar 7 00:54:45.062708 containerd[1477]: time="2026-03-07T00:54:45.062638436Z" level=info msg="Forcibly stopping sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\"" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.104 [WARNING][5296] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0", GenerateName:"calico-kube-controllers-6674bf8c76-", Namespace:"calico-system", SelfLink:"", UID:"5a4b6a6b-223d-4d9c-9933-e41491314c6d", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6674bf8c76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"ae68bd9ff2e074a0c133ee8fc6ea80cb0beb1aefbb401c24bac108124c04f462", Pod:"calico-kube-controllers-6674bf8c76-nmdd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic29b16583e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.104 [INFO][5296] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.104 [INFO][5296] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" iface="eth0" netns="" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.104 [INFO][5296] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.104 [INFO][5296] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.128 [INFO][5303] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.130 [INFO][5303] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.130 [INFO][5303] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.141 [WARNING][5303] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.141 [INFO][5303] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" HandleID="k8s-pod-network.0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--kube--controllers--6674bf8c76--nmdd8-eth0" Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.143 [INFO][5303] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.148755 containerd[1477]: 2026-03-07 00:54:45.145 [INFO][5296] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369" Mar 7 00:54:45.148755 containerd[1477]: time="2026-03-07T00:54:45.148558271Z" level=info msg="TearDown network for sandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\" successfully" Mar 7 00:54:45.153820 containerd[1477]: time="2026-03-07T00:54:45.153746981Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:45.153929 containerd[1477]: time="2026-03-07T00:54:45.153859078Z" level=info msg="RemovePodSandbox \"0e02293ddf8ecff2d2a1c7e774e10d67af233f8da3432288970d0b009203d369\" returns successfully" Mar 7 00:54:45.155053 containerd[1477]: time="2026-03-07T00:54:45.154575847Z" level=info msg="StopPodSandbox for \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\"" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.193 [WARNING][5317] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"92eb6a25-972b-4cc1-a42c-1135d83a0c74", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28", Pod:"calico-apiserver-75b65f9ff7-snpd2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1d845415f42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.193 [INFO][5317] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.193 [INFO][5317] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" iface="eth0" netns="" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.193 [INFO][5317] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.193 [INFO][5317] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.217 [INFO][5324] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.217 [INFO][5324] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.217 [INFO][5324] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.231 [WARNING][5324] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.231 [INFO][5324] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.233 [INFO][5324] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.237671 containerd[1477]: 2026-03-07 00:54:45.235 [INFO][5317] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.239304 containerd[1477]: time="2026-03-07T00:54:45.239153204Z" level=info msg="TearDown network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\" successfully" Mar 7 00:54:45.239304 containerd[1477]: time="2026-03-07T00:54:45.239192276Z" level=info msg="StopPodSandbox for \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\" returns successfully" Mar 7 00:54:45.239961 containerd[1477]: time="2026-03-07T00:54:45.239818145Z" level=info msg="RemovePodSandbox for \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\"" Mar 7 00:54:45.240305 containerd[1477]: time="2026-03-07T00:54:45.239998467Z" level=info msg="Forcibly stopping sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\"" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.291 [WARNING][5338] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"92eb6a25-972b-4cc1-a42c-1135d83a0c74", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"54d71ed03861a135fe52da135c66582f4c38b546f30fdbb9309b67b04573fa28", Pod:"calico-apiserver-75b65f9ff7-snpd2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1d845415f42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.291 [INFO][5338] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.291 [INFO][5338] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" iface="eth0" netns="" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.291 [INFO][5338] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.291 [INFO][5338] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.319 [INFO][5347] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.319 [INFO][5347] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.319 [INFO][5347] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.329 [WARNING][5347] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.329 [INFO][5347] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" HandleID="k8s-pod-network.19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--snpd2-eth0" Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.332 [INFO][5347] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.339697 containerd[1477]: 2026-03-07 00:54:45.334 [INFO][5338] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae" Mar 7 00:54:45.341872 containerd[1477]: time="2026-03-07T00:54:45.341828440Z" level=info msg="TearDown network for sandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\" successfully" Mar 7 00:54:45.345851 containerd[1477]: time="2026-03-07T00:54:45.345806764Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:45.345932 containerd[1477]: time="2026-03-07T00:54:45.345872591Z" level=info msg="RemovePodSandbox \"19aba2c5b8d0f97d6e99cd7976b793c95645ee19492f5e7ea19e157c0b92e5ae\" returns successfully" Mar 7 00:54:45.346700 containerd[1477]: time="2026-03-07T00:54:45.346560566Z" level=info msg="StopPodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\"" Mar 7 00:54:45.350746 kubelet[2596]: I0307 00:54:45.350675 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5ddf4c559d-tprh2" podStartSLOduration=2.551701675 podStartE2EDuration="20.350658785s" podCreationTimestamp="2026-03-07 00:54:25 +0000 UTC" firstStartedPulling="2026-03-07 00:54:26.553207053 +0000 UTC m=+41.753980736" lastFinishedPulling="2026-03-07 00:54:44.352164163 +0000 UTC m=+59.552937846" observedRunningTime="2026-03-07 00:54:45.349958652 +0000 UTC m=+60.550732335" watchObservedRunningTime="2026-03-07 00:54:45.350658785 +0000 UTC m=+60.551432468" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.408 [WARNING][5361] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"16bf6d64-1b62-4d2e-b193-67ace2ec0676", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367", Pod:"csi-node-driver-2zr4s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic5d6793ff64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.409 [INFO][5361] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.409 [INFO][5361] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" iface="eth0" netns="" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.409 [INFO][5361] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.409 [INFO][5361] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.432 [INFO][5371] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.432 [INFO][5371] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.432 [INFO][5371] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.444 [WARNING][5371] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.444 [INFO][5371] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.446 [INFO][5371] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.451936 containerd[1477]: 2026-03-07 00:54:45.449 [INFO][5361] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.452993 containerd[1477]: time="2026-03-07T00:54:45.452461445Z" level=info msg="TearDown network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" successfully" Mar 7 00:54:45.452993 containerd[1477]: time="2026-03-07T00:54:45.452854762Z" level=info msg="StopPodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" returns successfully" Mar 7 00:54:45.453371 containerd[1477]: time="2026-03-07T00:54:45.453340500Z" level=info msg="RemovePodSandbox for \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\"" Mar 7 00:54:45.453459 containerd[1477]: time="2026-03-07T00:54:45.453373613Z" level=info msg="Forcibly stopping sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\"" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.492 [WARNING][5385] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"16bf6d64-1b62-4d2e-b193-67ace2ec0676", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367", Pod:"csi-node-driver-2zr4s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic5d6793ff64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.492 [INFO][5385] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.492 [INFO][5385] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" iface="eth0" netns="" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.492 [INFO][5385] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.492 [INFO][5385] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.516 [INFO][5392] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.516 [INFO][5392] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.516 [INFO][5392] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.527 [WARNING][5392] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.527 [INFO][5392] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" HandleID="k8s-pod-network.63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-csi--node--driver--2zr4s-eth0" Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.529 [INFO][5392] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.534119 containerd[1477]: 2026-03-07 00:54:45.531 [INFO][5385] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0" Mar 7 00:54:45.534119 containerd[1477]: time="2026-03-07T00:54:45.533982123Z" level=info msg="TearDown network for sandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" successfully" Mar 7 00:54:45.542324 containerd[1477]: time="2026-03-07T00:54:45.541737255Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:45.542324 containerd[1477]: time="2026-03-07T00:54:45.541948290Z" level=info msg="RemovePodSandbox \"63fd82461c98b841494a232f48722b9c5670dd5d74a0d2d0910f567dc785c2a0\" returns successfully" Mar 7 00:54:45.542869 containerd[1477]: time="2026-03-07T00:54:45.542819187Z" level=info msg="StopPodSandbox for \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\"" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.602 [WARNING][5406] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"16c7f73a-a0d9-4851-9016-87e6dc57075b", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab", Pod:"calico-apiserver-75b65f9ff7-9wghj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali25a7fbc2063", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.603 [INFO][5406] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.603 [INFO][5406] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" iface="eth0" netns="" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.603 [INFO][5406] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.603 [INFO][5406] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.628 [INFO][5413] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.629 [INFO][5413] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.629 [INFO][5413] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.646 [WARNING][5413] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.646 [INFO][5413] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.648 [INFO][5413] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.653512 containerd[1477]: 2026-03-07 00:54:45.651 [INFO][5406] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.654486 containerd[1477]: time="2026-03-07T00:54:45.653671106Z" level=info msg="TearDown network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\" successfully" Mar 7 00:54:45.654486 containerd[1477]: time="2026-03-07T00:54:45.653713497Z" level=info msg="StopPodSandbox for \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\" returns successfully" Mar 7 00:54:45.654820 containerd[1477]: time="2026-03-07T00:54:45.654650820Z" level=info msg="RemovePodSandbox for \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\"" Mar 7 00:54:45.654820 containerd[1477]: time="2026-03-07T00:54:45.654705049Z" level=info msg="Forcibly stopping sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\"" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.702 [WARNING][5433] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0", GenerateName:"calico-apiserver-75b65f9ff7-", Namespace:"calico-system", SelfLink:"", UID:"16c7f73a-a0d9-4851-9016-87e6dc57075b", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75b65f9ff7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"f195a62a316965d72a99cd699e846d11289e0bd4c06969d0d21b1591212729ab", Pod:"calico-apiserver-75b65f9ff7-9wghj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali25a7fbc2063", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.702 [INFO][5433] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.702 [INFO][5433] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" iface="eth0" netns="" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.702 [INFO][5433] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.702 [INFO][5433] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.734 [INFO][5441] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.734 [INFO][5441] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.734 [INFO][5441] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.752 [WARNING][5441] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.752 [INFO][5441] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" HandleID="k8s-pod-network.5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-calico--apiserver--75b65f9ff7--9wghj-eth0" Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.757 [INFO][5441] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.760601 containerd[1477]: 2026-03-07 00:54:45.758 [INFO][5433] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23" Mar 7 00:54:45.761109 containerd[1477]: time="2026-03-07T00:54:45.760631962Z" level=info msg="TearDown network for sandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\" successfully" Mar 7 00:54:45.769805 containerd[1477]: time="2026-03-07T00:54:45.767703996Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 7 00:54:45.769805 containerd[1477]: time="2026-03-07T00:54:45.769117140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:45.772421 containerd[1477]: time="2026-03-07T00:54:45.772381534Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:45.772780 containerd[1477]: time="2026-03-07T00:54:45.772753416Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.420739082s" Mar 7 00:54:45.772830 containerd[1477]: time="2026-03-07T00:54:45.772783090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 7 00:54:45.773189 containerd[1477]: time="2026-03-07T00:54:45.773153772Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:45.780810 containerd[1477]: time="2026-03-07T00:54:45.780763854Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:45.781260 containerd[1477]: time="2026-03-07T00:54:45.781235554Z" level=info msg="RemovePodSandbox \"5e35dbac9fdd5cbaa11f7de8f2d5c4705172c85017814cb7e319cdd89a8d0a23\" returns successfully" Mar 7 00:54:45.781414 containerd[1477]: time="2026-03-07T00:54:45.780791888Z" level=info msg="CreateContainer within sandbox \"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 7 00:54:45.781775 containerd[1477]: time="2026-03-07T00:54:45.781740288Z" level=info msg="StopPodSandbox for \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\"" Mar 7 00:54:45.809585 containerd[1477]: time="2026-03-07T00:54:45.809049193Z" level=info msg="CreateContainer within sandbox \"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"15b24379d02a5002ccbe63f87c1eb729784837df0401c4669cbe0237505bb470\"" Mar 7 00:54:45.809864 containerd[1477]: time="2026-03-07T00:54:45.809836388Z" level=info msg="StartContainer for \"15b24379d02a5002ccbe63f87c1eb729784837df0401c4669cbe0237505bb470\"" Mar 7 00:54:45.858560 systemd[1]: Started cri-containerd-15b24379d02a5002ccbe63f87c1eb729784837df0401c4669cbe0237505bb470.scope - libcontainer container 15b24379d02a5002ccbe63f87c1eb729784837df0401c4669cbe0237505bb470. Mar 7 00:54:45.896349 containerd[1477]: time="2026-03-07T00:54:45.896150300Z" level=info msg="StartContainer for \"15b24379d02a5002ccbe63f87c1eb729784837df0401c4669cbe0237505bb470\" returns successfully" Mar 7 00:54:45.901244 containerd[1477]: time="2026-03-07T00:54:45.900601885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.856 [WARNING][5455] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dcb976b7-3486-45dd-bbba-92e1e50edecd", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a", Pod:"coredns-66bc5c9577-r6nrd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali70eecd89192", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.857 [INFO][5455] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.857 [INFO][5455] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" iface="eth0" netns="" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.857 [INFO][5455] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.857 [INFO][5455] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.887 [INFO][5480] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.887 [INFO][5480] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.887 [INFO][5480] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.900 [WARNING][5480] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.900 [INFO][5480] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.904 [INFO][5480] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:45.908761 containerd[1477]: 2026-03-07 00:54:45.906 [INFO][5455] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:45.908761 containerd[1477]: time="2026-03-07T00:54:45.908504225Z" level=info msg="TearDown network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\" successfully" Mar 7 00:54:45.908761 containerd[1477]: time="2026-03-07T00:54:45.908527420Z" level=info msg="StopPodSandbox for \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\" returns successfully" Mar 7 00:54:45.909999 containerd[1477]: time="2026-03-07T00:54:45.909967798Z" level=info msg="RemovePodSandbox for \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\"" Mar 7 00:54:45.910079 containerd[1477]: time="2026-03-07T00:54:45.910005430Z" level=info msg="Forcibly stopping sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\"" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.961 [WARNING][5511] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dcb976b7-3486-45dd-bbba-92e1e50edecd", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"e7402bdb6a740257f02baee2352a7df5a92e982679065a2b6d1a3f77aaf2871a", Pod:"coredns-66bc5c9577-r6nrd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali70eecd89192", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.962 [INFO][5511] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.962 [INFO][5511] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" iface="eth0" netns="" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.962 [INFO][5511] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.962 [INFO][5511] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.987 [INFO][5518] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.988 [INFO][5518] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.988 [INFO][5518] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.997 [WARNING][5518] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.998 [INFO][5518] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" HandleID="k8s-pod-network.f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--r6nrd-eth0" Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:45.999 [INFO][5518] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.003520 containerd[1477]: 2026-03-07 00:54:46.002 [INFO][5511] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f" Mar 7 00:54:46.004160 containerd[1477]: time="2026-03-07T00:54:46.003569928Z" level=info msg="TearDown network for sandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\" successfully" Mar 7 00:54:46.008149 containerd[1477]: time="2026-03-07T00:54:46.008100662Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:46.008475 containerd[1477]: time="2026-03-07T00:54:46.008191004Z" level=info msg="RemovePodSandbox \"f85180283130d37c63bc6f3c5d62899367f530304045989a20c3b0e2c111795f\" returns successfully" Mar 7 00:54:46.009003 containerd[1477]: time="2026-03-07T00:54:46.008669468Z" level=info msg="StopPodSandbox for \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\"" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.053 [WARNING][5532] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.053 [INFO][5532] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.053 [INFO][5532] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" iface="eth0" netns="" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.053 [INFO][5532] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.053 [INFO][5532] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.076 [INFO][5539] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.077 [INFO][5539] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.077 [INFO][5539] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.088 [WARNING][5539] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.088 [INFO][5539] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.090 [INFO][5539] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.094277 containerd[1477]: 2026-03-07 00:54:46.092 [INFO][5532] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.094277 containerd[1477]: time="2026-03-07T00:54:46.094107980Z" level=info msg="TearDown network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\" successfully" Mar 7 00:54:46.094277 containerd[1477]: time="2026-03-07T00:54:46.094149692Z" level=info msg="StopPodSandbox for \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\" returns successfully" Mar 7 00:54:46.096025 containerd[1477]: time="2026-03-07T00:54:46.094834035Z" level=info msg="RemovePodSandbox for \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\"" Mar 7 00:54:46.096025 containerd[1477]: time="2026-03-07T00:54:46.094873627Z" level=info msg="Forcibly stopping sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\"" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.138 [WARNING][5554] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" WorkloadEndpoint="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.138 [INFO][5554] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.138 [INFO][5554] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" iface="eth0" netns="" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.138 [INFO][5554] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.138 [INFO][5554] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.158 [INFO][5561] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.159 [INFO][5561] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.159 [INFO][5561] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.174 [WARNING][5561] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.174 [INFO][5561] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" HandleID="k8s-pod-network.b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-whisker--57578867fc--57xjz-eth0" Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.176 [INFO][5561] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.180357 containerd[1477]: 2026-03-07 00:54:46.178 [INFO][5554] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e" Mar 7 00:54:46.180876 containerd[1477]: time="2026-03-07T00:54:46.180391483Z" level=info msg="TearDown network for sandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\" successfully" Mar 7 00:54:46.184556 containerd[1477]: time="2026-03-07T00:54:46.184506420Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:46.184659 containerd[1477]: time="2026-03-07T00:54:46.184581965Z" level=info msg="RemovePodSandbox \"b9fc8b3d180814532ef6ba6a322cc274fdf5c519a6a5067bfd85fbc08506246e\" returns successfully" Mar 7 00:54:46.185401 containerd[1477]: time="2026-03-07T00:54:46.185274026Z" level=info msg="StopPodSandbox for \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\"" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.235 [WARNING][5576] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0106d835-24d5-479e-aa94-270e56a82826", ResourceVersion:"996", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b", Pod:"goldmane-cccfbd5cf-z9vmk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali36729ad4bf6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.236 [INFO][5576] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.236 [INFO][5576] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" iface="eth0" netns="" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.236 [INFO][5576] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.236 [INFO][5576] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.261 [INFO][5583] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.262 [INFO][5583] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.262 [INFO][5583] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.274 [WARNING][5583] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.274 [INFO][5583] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.277 [INFO][5583] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.280464 containerd[1477]: 2026-03-07 00:54:46.278 [INFO][5576] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.281568 containerd[1477]: time="2026-03-07T00:54:46.281104820Z" level=info msg="TearDown network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\" successfully" Mar 7 00:54:46.281568 containerd[1477]: time="2026-03-07T00:54:46.281145052Z" level=info msg="StopPodSandbox for \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\" returns successfully" Mar 7 00:54:46.282357 containerd[1477]: time="2026-03-07T00:54:46.281974086Z" level=info msg="RemovePodSandbox for \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\"" Mar 7 00:54:46.282357 containerd[1477]: time="2026-03-07T00:54:46.282007119Z" level=info msg="Forcibly stopping sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\"" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.330 [WARNING][5597] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0106d835-24d5-479e-aa94-270e56a82826", ResourceVersion:"996", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 54, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"5ef134dc4da1c30fe958f08b960d337dc165738dd7d2d7a1c41769a33a6ff87b", Pod:"goldmane-cccfbd5cf-z9vmk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali36729ad4bf6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.331 [INFO][5597] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.331 [INFO][5597] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" iface="eth0" netns="" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.331 [INFO][5597] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.331 [INFO][5597] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.366 [INFO][5604] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.366 [INFO][5604] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.366 [INFO][5604] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.378 [WARNING][5604] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.378 [INFO][5604] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" HandleID="k8s-pod-network.196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-goldmane--cccfbd5cf--z9vmk-eth0" Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.380 [INFO][5604] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.383820 containerd[1477]: 2026-03-07 00:54:46.382 [INFO][5597] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed" Mar 7 00:54:46.386298 containerd[1477]: time="2026-03-07T00:54:46.383885343Z" level=info msg="TearDown network for sandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\" successfully" Mar 7 00:54:46.388762 containerd[1477]: time="2026-03-07T00:54:46.388722736Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:46.388846 containerd[1477]: time="2026-03-07T00:54:46.388802640Z" level=info msg="RemovePodSandbox \"196e4aedbdbea894d9a0bb94609ca7b1155f11af9a3d74155e4d56ca970569ed\" returns successfully" Mar 7 00:54:46.389361 containerd[1477]: time="2026-03-07T00:54:46.389332014Z" level=info msg="StopPodSandbox for \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\"" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.433 [WARNING][5619] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5bc543fa-d95f-4c47-aab4-75443fc7ed6f", ResourceVersion:"947", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c", Pod:"coredns-66bc5c9577-kppgg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali958d660f60e", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.434 [INFO][5619] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.434 [INFO][5619] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" iface="eth0" netns="" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.434 [INFO][5619] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.434 [INFO][5619] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.454 [INFO][5626] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.454 [INFO][5626] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.454 [INFO][5626] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.471 [WARNING][5626] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.471 [INFO][5626] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.474 [INFO][5626] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.478524 containerd[1477]: 2026-03-07 00:54:46.476 [INFO][5619] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.479622 containerd[1477]: time="2026-03-07T00:54:46.478546370Z" level=info msg="TearDown network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\" successfully" Mar 7 00:54:46.479622 containerd[1477]: time="2026-03-07T00:54:46.478575685Z" level=info msg="StopPodSandbox for \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\" returns successfully" Mar 7 00:54:46.479908 containerd[1477]: time="2026-03-07T00:54:46.479766486Z" level=info msg="RemovePodSandbox for \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\"" Mar 7 00:54:46.479908 containerd[1477]: time="2026-03-07T00:54:46.479855069Z" level=info msg="Forcibly stopping sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\"" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.520 [WARNING][5641] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5bc543fa-d95f-4c47-aab4-75443fc7ed6f", ResourceVersion:"947", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 0, 53, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-f47b87f6f2", ContainerID:"37acf52764154b786ab97a10b78242f37eb9ca428328a2574624ef20874f630c", Pod:"coredns-66bc5c9577-kppgg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali958d660f60e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.520 [INFO][5641] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.520 [INFO][5641] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" iface="eth0" netns="" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.520 [INFO][5641] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.520 [INFO][5641] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.542 [INFO][5648] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.542 [INFO][5648] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.542 [INFO][5648] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.555 [WARNING][5648] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.555 [INFO][5648] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" HandleID="k8s-pod-network.a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Workload="ci--4081--3--6--n--f47b87f6f2-k8s-coredns--66bc5c9577--kppgg-eth0" Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.557 [INFO][5648] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 00:54:46.561457 containerd[1477]: 2026-03-07 00:54:46.559 [INFO][5641] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b" Mar 7 00:54:46.562710 containerd[1477]: time="2026-03-07T00:54:46.562003639Z" level=info msg="TearDown network for sandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\" successfully" Mar 7 00:54:46.565852 containerd[1477]: time="2026-03-07T00:54:46.565821755Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 00:54:46.566051 containerd[1477]: time="2026-03-07T00:54:46.566027754Z" level=info msg="RemovePodSandbox \"a1d89a41e0f4ab4693bac302caf460d5ec6afd5351c96d85c766b498349f2b4b\" returns successfully" Mar 7 00:54:47.299719 containerd[1477]: time="2026-03-07T00:54:47.299307752Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:47.300188 containerd[1477]: time="2026-03-07T00:54:47.300156790Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 7 00:54:47.301726 containerd[1477]: time="2026-03-07T00:54:47.301097811Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:47.304140 containerd[1477]: time="2026-03-07T00:54:47.304092441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 00:54:47.306253 containerd[1477]: time="2026-03-07T00:54:47.306201960Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.405499135s" Mar 7 00:54:47.306454 containerd[1477]: time="2026-03-07T00:54:47.306418838Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 7 00:54:47.311160 containerd[1477]: 
time="2026-03-07T00:54:47.311053476Z" level=info msg="CreateContainer within sandbox \"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 00:54:47.333572 containerd[1477]: time="2026-03-07T00:54:47.333518041Z" level=info msg="CreateContainer within sandbox \"42bef7e29719d031eb8342df7be6220c1ff65e016e9ae71609a796eef0663367\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"cba3ba0dca02e168b80487b5ac45b18b847241ead7495b590edf0f070c3a0926\"" Mar 7 00:54:47.335692 containerd[1477]: time="2026-03-07T00:54:47.334585998Z" level=info msg="StartContainer for \"cba3ba0dca02e168b80487b5ac45b18b847241ead7495b590edf0f070c3a0926\"" Mar 7 00:54:47.399963 systemd[1]: Started cri-containerd-cba3ba0dca02e168b80487b5ac45b18b847241ead7495b590edf0f070c3a0926.scope - libcontainer container cba3ba0dca02e168b80487b5ac45b18b847241ead7495b590edf0f070c3a0926. Mar 7 00:54:47.431660 containerd[1477]: time="2026-03-07T00:54:47.431510553Z" level=info msg="StartContainer for \"cba3ba0dca02e168b80487b5ac45b18b847241ead7495b590edf0f070c3a0926\" returns successfully" Mar 7 00:54:48.072488 kubelet[2596]: I0307 00:54:48.072415 2596 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 00:54:48.077806 kubelet[2596]: I0307 00:54:48.077766 2596 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 00:54:48.389953 kubelet[2596]: I0307 00:54:48.389849 2596 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-2zr4s" podStartSLOduration=32.437214163 podStartE2EDuration="42.389830396s" podCreationTimestamp="2026-03-07 00:54:06 +0000 UTC" firstStartedPulling="2026-03-07 00:54:37.354878881 +0000 UTC 
m=+52.555652564" lastFinishedPulling="2026-03-07 00:54:47.307495114 +0000 UTC m=+62.508268797" observedRunningTime="2026-03-07 00:54:48.389492897 +0000 UTC m=+63.590266580" watchObservedRunningTime="2026-03-07 00:54:48.389830396 +0000 UTC m=+63.590604119" Mar 7 00:55:00.257019 kubelet[2596]: I0307 00:55:00.256728 2596 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 00:55:11.531167 systemd[1]: Started sshd@7-116.202.17.139:22-185.156.73.233:26056.service - OpenSSH per-connection server daemon (185.156.73.233:26056). Mar 7 00:55:12.668079 sshd[5821]: Invalid user admin from 185.156.73.233 port 26056 Mar 7 00:55:12.840786 sshd[5821]: Connection closed by invalid user admin 185.156.73.233 port 26056 [preauth] Mar 7 00:55:12.844431 systemd[1]: sshd@7-116.202.17.139:22-185.156.73.233:26056.service: Deactivated successfully. Mar 7 00:55:56.644560 systemd[1]: run-containerd-runc-k8s.io-33c5ee29b7065be2b70ae876cc6b0e99189b2509a4aac7a3cc3f859b5a36dca9-runc.T4aFtV.mount: Deactivated successfully. Mar 7 00:56:15.985805 update_engine[1460]: I20260307 00:56:15.984802 1460 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 7 00:56:15.985805 update_engine[1460]: I20260307 00:56:15.984874 1460 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 7 00:56:15.985805 update_engine[1460]: I20260307 00:56:15.985250 1460 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 7 00:56:15.987103 update_engine[1460]: I20260307 00:56:15.987071 1460 omaha_request_params.cc:62] Current group set to lts Mar 7 00:56:15.988005 update_engine[1460]: I20260307 00:56:15.987557 1460 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 7 00:56:15.988005 update_engine[1460]: I20260307 00:56:15.987586 1460 update_attempter.cc:643] Scheduling an action processor start. 
Mar 7 00:56:15.988005 update_engine[1460]: I20260307 00:56:15.987607 1460 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 7 00:56:15.989243 update_engine[1460]: I20260307 00:56:15.989032 1460 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 7 00:56:15.989243 update_engine[1460]: I20260307 00:56:15.989115 1460 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 7 00:56:15.989243 update_engine[1460]: I20260307 00:56:15.989125 1460 omaha_request_action.cc:272] Request: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: Mar 7 00:56:15.989243 update_engine[1460]: I20260307 00:56:15.989130 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:56:15.996414 update_engine[1460]: I20260307 00:56:15.995616 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:56:15.996414 update_engine[1460]: I20260307 00:56:15.996042 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 7 00:56:15.996798 locksmithd[1496]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 7 00:56:15.997986 update_engine[1460]: E20260307 00:56:15.997897 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:56:15.997986 update_engine[1460]: I20260307 00:56:15.997962 1460 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 7 00:56:20.654922 systemd[1]: Started sshd@8-116.202.17.139:22-20.161.92.111:43728.service - OpenSSH per-connection server daemon (20.161.92.111:43728). 
Mar 7 00:56:21.260430 sshd[6058]: Accepted publickey for core from 20.161.92.111 port 43728 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:21.263934 sshd[6058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:21.269631 systemd-logind[1459]: New session 8 of user core. Mar 7 00:56:21.271854 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 7 00:56:21.769807 sshd[6058]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:21.774949 systemd[1]: sshd@8-116.202.17.139:22-20.161.92.111:43728.service: Deactivated successfully. Mar 7 00:56:21.777534 systemd[1]: session-8.scope: Deactivated successfully. Mar 7 00:56:21.780327 systemd-logind[1459]: Session 8 logged out. Waiting for processes to exit. Mar 7 00:56:21.781300 systemd-logind[1459]: Removed session 8. Mar 7 00:56:25.898971 update_engine[1460]: I20260307 00:56:25.898070 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:56:25.898971 update_engine[1460]: I20260307 00:56:25.898428 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:56:25.898971 update_engine[1460]: I20260307 00:56:25.898761 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 7 00:56:25.900254 update_engine[1460]: E20260307 00:56:25.900197 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:56:25.900505 update_engine[1460]: I20260307 00:56:25.900285 1460 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 7 00:56:26.877209 systemd[1]: Started sshd@9-116.202.17.139:22-20.161.92.111:43738.service - OpenSSH per-connection server daemon (20.161.92.111:43738). 
Mar 7 00:56:27.477901 sshd[6096]: Accepted publickey for core from 20.161.92.111 port 43738 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:27.479340 sshd[6096]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:27.486156 systemd-logind[1459]: New session 9 of user core. Mar 7 00:56:27.491054 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 7 00:56:27.970530 sshd[6096]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:27.976559 systemd[1]: sshd@9-116.202.17.139:22-20.161.92.111:43738.service: Deactivated successfully. Mar 7 00:56:27.976795 systemd-logind[1459]: Session 9 logged out. Waiting for processes to exit. Mar 7 00:56:27.979290 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 00:56:27.980298 systemd-logind[1459]: Removed session 9. Mar 7 00:56:33.075284 systemd[1]: Started sshd@10-116.202.17.139:22-20.161.92.111:58982.service - OpenSSH per-connection server daemon (20.161.92.111:58982). Mar 7 00:56:33.304500 systemd[1]: run-containerd-runc-k8s.io-1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a-runc.06lMIk.mount: Deactivated successfully. Mar 7 00:56:33.678423 sshd[6109]: Accepted publickey for core from 20.161.92.111 port 58982 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:33.680938 sshd[6109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:33.685842 systemd-logind[1459]: New session 10 of user core. Mar 7 00:56:33.692999 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 00:56:34.177185 sshd[6109]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:34.182281 systemd[1]: sshd@10-116.202.17.139:22-20.161.92.111:58982.service: Deactivated successfully. Mar 7 00:56:34.185103 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 00:56:34.186407 systemd-logind[1459]: Session 10 logged out. Waiting for processes to exit. 
Mar 7 00:56:34.187537 systemd-logind[1459]: Removed session 10. Mar 7 00:56:34.290150 systemd[1]: Started sshd@11-116.202.17.139:22-20.161.92.111:58998.service - OpenSSH per-connection server daemon (20.161.92.111:58998). Mar 7 00:56:34.897219 sshd[6142]: Accepted publickey for core from 20.161.92.111 port 58998 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:34.899457 sshd[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:34.904318 systemd-logind[1459]: New session 11 of user core. Mar 7 00:56:34.908949 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 00:56:35.463711 sshd[6142]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:35.473828 systemd[1]: sshd@11-116.202.17.139:22-20.161.92.111:58998.service: Deactivated successfully. Mar 7 00:56:35.478828 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 00:56:35.481249 systemd-logind[1459]: Session 11 logged out. Waiting for processes to exit. Mar 7 00:56:35.484050 systemd-logind[1459]: Removed session 11. Mar 7 00:56:35.573992 systemd[1]: Started sshd@12-116.202.17.139:22-20.161.92.111:59012.service - OpenSSH per-connection server daemon (20.161.92.111:59012). Mar 7 00:56:35.897798 update_engine[1460]: I20260307 00:56:35.897718 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:56:35.898271 update_engine[1460]: I20260307 00:56:35.897982 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:56:35.898271 update_engine[1460]: I20260307 00:56:35.898236 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 7 00:56:35.899013 update_engine[1460]: E20260307 00:56:35.898983 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:56:35.899076 update_engine[1460]: I20260307 00:56:35.899039 1460 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 7 00:56:36.165731 sshd[6189]: Accepted publickey for core from 20.161.92.111 port 59012 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:36.167268 sshd[6189]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:36.172414 systemd-logind[1459]: New session 12 of user core. Mar 7 00:56:36.177913 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 7 00:56:36.660501 sshd[6189]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:36.667395 systemd[1]: sshd@12-116.202.17.139:22-20.161.92.111:59012.service: Deactivated successfully. Mar 7 00:56:36.667571 systemd-logind[1459]: Session 12 logged out. Waiting for processes to exit. Mar 7 00:56:36.669909 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 00:56:36.671128 systemd-logind[1459]: Removed session 12. Mar 7 00:56:41.772099 systemd[1]: Started sshd@13-116.202.17.139:22-20.161.92.111:42210.service - OpenSSH per-connection server daemon (20.161.92.111:42210). Mar 7 00:56:42.357744 sshd[6223]: Accepted publickey for core from 20.161.92.111 port 42210 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:42.359343 sshd[6223]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:42.365477 systemd-logind[1459]: New session 13 of user core. Mar 7 00:56:42.369882 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 7 00:56:42.855025 sshd[6223]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:42.859991 systemd[1]: sshd@13-116.202.17.139:22-20.161.92.111:42210.service: Deactivated successfully. 
Mar 7 00:56:42.862902 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 00:56:42.865039 systemd-logind[1459]: Session 13 logged out. Waiting for processes to exit. Mar 7 00:56:42.866457 systemd-logind[1459]: Removed session 13. Mar 7 00:56:42.968951 systemd[1]: Started sshd@14-116.202.17.139:22-20.161.92.111:42212.service - OpenSSH per-connection server daemon (20.161.92.111:42212). Mar 7 00:56:43.554311 sshd[6237]: Accepted publickey for core from 20.161.92.111 port 42212 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:43.555914 sshd[6237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:43.561089 systemd-logind[1459]: New session 14 of user core. Mar 7 00:56:43.565938 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 00:56:44.196450 sshd[6237]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:44.204304 systemd[1]: sshd@14-116.202.17.139:22-20.161.92.111:42212.service: Deactivated successfully. Mar 7 00:56:44.208151 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 00:56:44.210240 systemd-logind[1459]: Session 14 logged out. Waiting for processes to exit. Mar 7 00:56:44.211491 systemd-logind[1459]: Removed session 14. Mar 7 00:56:44.314113 systemd[1]: Started sshd@15-116.202.17.139:22-20.161.92.111:42220.service - OpenSSH per-connection server daemon (20.161.92.111:42220). Mar 7 00:56:44.910708 sshd[6248]: Accepted publickey for core from 20.161.92.111 port 42220 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:44.913217 sshd[6248]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:44.918312 systemd-logind[1459]: New session 15 of user core. Mar 7 00:56:44.926970 systemd[1]: Started session-15.scope - Session 15 of User core. 
Mar 7 00:56:45.897912 update_engine[1460]: I20260307 00:56:45.897832 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:56:45.898329 update_engine[1460]: I20260307 00:56:45.898095 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:56:45.898329 update_engine[1460]: I20260307 00:56:45.898300 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 7 00:56:45.900047 update_engine[1460]: E20260307 00:56:45.899241 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899333 1460 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899351 1460 omaha_request_action.cc:617] Omaha request response: Mar 7 00:56:45.900047 update_engine[1460]: E20260307 00:56:45.899479 1460 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899509 1460 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899521 1460 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899533 1460 update_attempter.cc:306] Processing Done. Mar 7 00:56:45.900047 update_engine[1460]: E20260307 00:56:45.899554 1460 update_attempter.cc:619] Update failed. 
Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899566 1460 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899577 1460 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899588 1460 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899720 1460 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899832 1460 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 7 00:56:45.900047 update_engine[1460]: I20260307 00:56:45.899845 1460 omaha_request_action.cc:272] Request: Mar 7 00:56:45.900047 update_engine[1460]: Mar 7 00:56:45.900047 update_engine[1460]: Mar 7 00:56:45.900437 locksmithd[1496]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 7 00:56:45.900666 update_engine[1460]: Mar 7 00:56:45.900666 update_engine[1460]: Mar 7 00:56:45.900666 update_engine[1460]: Mar 7 00:56:45.900666 update_engine[1460]: Mar 7 00:56:45.900666 update_engine[1460]: I20260307 00:56:45.899857 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 00:56:45.900666 update_engine[1460]: I20260307 00:56:45.900147 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 00:56:45.900666 update_engine[1460]: I20260307 00:56:45.900337 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 7 00:56:45.901860 update_engine[1460]: E20260307 00:56:45.901751 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901813 1460 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901823 1460 omaha_request_action.cc:617] Omaha request response: Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901830 1460 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901836 1460 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901844 1460 update_attempter.cc:306] Processing Done. Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901849 1460 update_attempter.cc:310] Error event sent. Mar 7 00:56:45.901860 update_engine[1460]: I20260307 00:56:45.901858 1460 update_check_scheduler.cc:74] Next update check in 48m59s Mar 7 00:56:45.902822 locksmithd[1496]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 7 00:56:46.026702 sshd[6248]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:46.031995 systemd[1]: sshd@15-116.202.17.139:22-20.161.92.111:42220.service: Deactivated successfully. Mar 7 00:56:46.035903 systemd[1]: session-15.scope: Deactivated successfully. Mar 7 00:56:46.037201 systemd-logind[1459]: Session 15 logged out. Waiting for processes to exit. Mar 7 00:56:46.038090 systemd-logind[1459]: Removed session 15. Mar 7 00:56:46.142230 systemd[1]: Started sshd@16-116.202.17.139:22-20.161.92.111:42228.service - OpenSSH per-connection server daemon (20.161.92.111:42228). 
Mar 7 00:56:46.725566 sshd[6274]: Accepted publickey for core from 20.161.92.111 port 42228 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:46.727573 sshd[6274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:46.733669 systemd-logind[1459]: New session 16 of user core. Mar 7 00:56:46.743046 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 7 00:56:47.338238 sshd[6274]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:47.345333 systemd[1]: sshd@16-116.202.17.139:22-20.161.92.111:42228.service: Deactivated successfully. Mar 7 00:56:47.347975 systemd[1]: session-16.scope: Deactivated successfully. Mar 7 00:56:47.348757 systemd-logind[1459]: Session 16 logged out. Waiting for processes to exit. Mar 7 00:56:47.350166 systemd-logind[1459]: Removed session 16. Mar 7 00:56:47.450138 systemd[1]: Started sshd@17-116.202.17.139:22-20.161.92.111:42236.service - OpenSSH per-connection server daemon (20.161.92.111:42236). Mar 7 00:56:48.036700 sshd[6287]: Accepted publickey for core from 20.161.92.111 port 42236 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:48.038742 sshd[6287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:48.045577 systemd-logind[1459]: New session 17 of user core. Mar 7 00:56:48.054062 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 7 00:56:48.517882 sshd[6287]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:48.522182 systemd[1]: sshd@17-116.202.17.139:22-20.161.92.111:42236.service: Deactivated successfully. Mar 7 00:56:48.524490 systemd[1]: session-17.scope: Deactivated successfully. Mar 7 00:56:48.528017 systemd-logind[1459]: Session 17 logged out. Waiting for processes to exit. Mar 7 00:56:48.529917 systemd-logind[1459]: Removed session 17. 
Mar 7 00:56:53.637176 systemd[1]: Started sshd@18-116.202.17.139:22-20.161.92.111:40458.service - OpenSSH per-connection server daemon (20.161.92.111:40458). Mar 7 00:56:54.233807 sshd[6304]: Accepted publickey for core from 20.161.92.111 port 40458 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:56:54.235387 sshd[6304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:56:54.241344 systemd-logind[1459]: New session 18 of user core. Mar 7 00:56:54.245924 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 7 00:56:54.721499 sshd[6304]: pam_unix(sshd:session): session closed for user core Mar 7 00:56:54.726073 systemd[1]: sshd@18-116.202.17.139:22-20.161.92.111:40458.service: Deactivated successfully. Mar 7 00:56:54.728980 systemd[1]: session-18.scope: Deactivated successfully. Mar 7 00:56:54.731841 systemd-logind[1459]: Session 18 logged out. Waiting for processes to exit. Mar 7 00:56:54.733080 systemd-logind[1459]: Removed session 18. Mar 7 00:56:59.838008 systemd[1]: Started sshd@19-116.202.17.139:22-20.161.92.111:40466.service - OpenSSH per-connection server daemon (20.161.92.111:40466). Mar 7 00:57:00.427637 sshd[6340]: Accepted publickey for core from 20.161.92.111 port 40466 ssh2: RSA SHA256:fFFMlaCBm9OkQatq7Cg+moKRVH6SG+EKtX7SFDagfEI Mar 7 00:57:00.429735 sshd[6340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 00:57:00.437837 systemd-logind[1459]: New session 19 of user core. Mar 7 00:57:00.440470 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 7 00:57:00.926182 sshd[6340]: pam_unix(sshd:session): session closed for user core Mar 7 00:57:00.932218 systemd[1]: sshd@19-116.202.17.139:22-20.161.92.111:40466.service: Deactivated successfully. Mar 7 00:57:00.938072 systemd[1]: session-19.scope: Deactivated successfully. Mar 7 00:57:00.940788 systemd-logind[1459]: Session 19 logged out. Waiting for processes to exit. 
Mar 7 00:57:00.944334 systemd-logind[1459]: Removed session 19. Mar 7 00:57:03.309876 systemd[1]: run-containerd-runc-k8s.io-1bb512d359651bb3f570cfb8addf34fae133f761ebc872bc1cbde8975054e31a-runc.1NigFm.mount: Deactivated successfully. Mar 7 00:57:06.323547 systemd[1]: run-containerd-runc-k8s.io-3b79151afec11a8a2fc4465005c2ee358f482067e5aa2e786e375fdd0c5d3d85-runc.nn9djv.mount: Deactivated successfully. Mar 7 00:57:48.289820 kubelet[2596]: E0307 00:57:48.289706 2596 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:54308->10.0.0.2:2379: read: connection timed out" Mar 7 00:57:48.952066 systemd[1]: cri-containerd-f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158.scope: Deactivated successfully. Mar 7 00:57:48.952328 systemd[1]: cri-containerd-f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158.scope: Consumed 16.201s CPU time. Mar 7 00:57:48.978428 containerd[1477]: time="2026-03-07T00:57:48.978352796Z" level=info msg="shim disconnected" id=f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158 namespace=k8s.io Mar 7 00:57:48.978428 containerd[1477]: time="2026-03-07T00:57:48.978422276Z" level=warning msg="cleaning up after shim disconnected" id=f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158 namespace=k8s.io Mar 7 00:57:48.978428 containerd[1477]: time="2026-03-07T00:57:48.978432836Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 00:57:48.979184 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158-rootfs.mount: Deactivated successfully. Mar 7 00:57:49.354265 systemd[1]: cri-containerd-62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925.scope: Deactivated successfully. 
Mar 7 00:57:49.354531 systemd[1]: cri-containerd-62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925.scope: Consumed 4.555s CPU time, 16.8M memory peak, 0B memory swap peak.
Mar 7 00:57:49.384937 containerd[1477]: time="2026-03-07T00:57:49.384620320Z" level=info msg="shim disconnected" id=62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925 namespace=k8s.io
Mar 7 00:57:49.384937 containerd[1477]: time="2026-03-07T00:57:49.384688759Z" level=warning msg="cleaning up after shim disconnected" id=62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925 namespace=k8s.io
Mar 7 00:57:49.384937 containerd[1477]: time="2026-03-07T00:57:49.384699159Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 00:57:49.386251 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925-rootfs.mount: Deactivated successfully.
Mar 7 00:57:49.961196 kubelet[2596]: I0307 00:57:49.961159    2596 scope.go:117] "RemoveContainer" containerID="62c6280fcc28a91c336cccf76439e8633b73c2ff8330c3b68cca5fd7cb2b7925"
Mar 7 00:57:49.962107 kubelet[2596]: I0307 00:57:49.962081    2596 scope.go:117] "RemoveContainer" containerID="f148a2b6be27d7f7f0a56cffe9c8188dabe19f74468c2758bf5bc283ed630158"
Mar 7 00:57:49.964575 containerd[1477]: time="2026-03-07T00:57:49.964533785Z" level=info msg="CreateContainer within sandbox \"94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 7 00:57:49.966627 containerd[1477]: time="2026-03-07T00:57:49.966595577Z" level=info msg="CreateContainer within sandbox \"502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 7 00:57:49.995092 containerd[1477]: time="2026-03-07T00:57:49.994937791Z" level=info msg="CreateContainer within sandbox \"94082b217751a433541c2a28b67498c9ce48938b69ff8c4429c99ef0edd1a7e1\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"3282275ef3eacd3a93d130c21bbb8e0c6d4001e1e6313b6fdf31b7d8fb058214\""
Mar 7 00:57:49.995748 containerd[1477]: time="2026-03-07T00:57:49.995715708Z" level=info msg="CreateContainer within sandbox \"502c7ce3caa6b4d94c3d501ad53b8a5017f7e860c3193ba442e4a7e12a8509cc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"8b2139a018f3167b9026d1b8cde5b44c023e2a222424a7fdb73daa7f69b6cfad\""
Mar 7 00:57:49.996558 containerd[1477]: time="2026-03-07T00:57:49.996525385Z" level=info msg="StartContainer for \"8b2139a018f3167b9026d1b8cde5b44c023e2a222424a7fdb73daa7f69b6cfad\""
Mar 7 00:57:49.996689 containerd[1477]: time="2026-03-07T00:57:49.996648464Z" level=info msg="StartContainer for \"3282275ef3eacd3a93d130c21bbb8e0c6d4001e1e6313b6fdf31b7d8fb058214\""
Mar 7 00:57:50.039409 systemd[1]: Started cri-containerd-8b2139a018f3167b9026d1b8cde5b44c023e2a222424a7fdb73daa7f69b6cfad.scope - libcontainer container 8b2139a018f3167b9026d1b8cde5b44c023e2a222424a7fdb73daa7f69b6cfad.
Mar 7 00:57:50.053591 systemd[1]: Started cri-containerd-3282275ef3eacd3a93d130c21bbb8e0c6d4001e1e6313b6fdf31b7d8fb058214.scope - libcontainer container 3282275ef3eacd3a93d130c21bbb8e0c6d4001e1e6313b6fdf31b7d8fb058214.
Mar 7 00:57:50.090235 containerd[1477]: time="2026-03-07T00:57:50.090189155Z" level=info msg="StartContainer for \"8b2139a018f3167b9026d1b8cde5b44c023e2a222424a7fdb73daa7f69b6cfad\" returns successfully"
Mar 7 00:57:50.094215 containerd[1477]: time="2026-03-07T00:57:50.094163066Z" level=info msg="StartContainer for \"3282275ef3eacd3a93d130c21bbb8e0c6d4001e1e6313b6fdf31b7d8fb058214\" returns successfully"
Mar 7 00:57:53.802299 systemd[1]: cri-containerd-163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d.scope: Deactivated successfully.
Mar 7 00:57:53.802566 systemd[1]: cri-containerd-163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d.scope: Consumed 3.498s CPU time, 16.1M memory peak, 0B memory swap peak.
Mar 7 00:57:53.828123 containerd[1477]: time="2026-03-07T00:57:53.827907306Z" level=info msg="shim disconnected" id=163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d namespace=k8s.io
Mar 7 00:57:53.828123 containerd[1477]: time="2026-03-07T00:57:53.828115906Z" level=warning msg="cleaning up after shim disconnected" id=163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d namespace=k8s.io
Mar 7 00:57:53.828123 containerd[1477]: time="2026-03-07T00:57:53.828126746Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 00:57:53.831297 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d-rootfs.mount: Deactivated successfully.
Mar 7 00:57:53.981715 kubelet[2596]: I0307 00:57:53.981649    2596 scope.go:117] "RemoveContainer" containerID="163ab4687a08c77e4eb7cd2d2e77963a0a44eb2f45b20ebff6a4a2f6631bf33d"
Mar 7 00:57:53.984410 containerd[1477]: time="2026-03-07T00:57:53.984214920Z" level=info msg="CreateContainer within sandbox \"e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 7 00:57:54.001020 containerd[1477]: time="2026-03-07T00:57:54.000947227Z" level=info msg="CreateContainer within sandbox \"e95d8ed2a3cc30e54be777143732bb20b3e6568c4f0c467cccc8a018f37c5bb7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"457a8c4b56df6832ad945c957484d04f71f24b97b2d369c9e246e006fdfaf40d\""
Mar 7 00:57:54.001705 containerd[1477]: time="2026-03-07T00:57:54.001630508Z" level=info msg="StartContainer for \"457a8c4b56df6832ad945c957484d04f71f24b97b2d369c9e246e006fdfaf40d\""
Mar 7 00:57:54.037858 systemd[1]: Started cri-containerd-457a8c4b56df6832ad945c957484d04f71f24b97b2d369c9e246e006fdfaf40d.scope - libcontainer container 457a8c4b56df6832ad945c957484d04f71f24b97b2d369c9e246e006fdfaf40d.
Mar 7 00:57:54.075770 containerd[1477]: time="2026-03-07T00:57:54.075250802Z" level=info msg="StartContainer for \"457a8c4b56df6832ad945c957484d04f71f24b97b2d369c9e246e006fdfaf40d\" returns successfully"