Mar 3 12:42:25.790164 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 3 12:42:25.790187 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Mar 3 11:03:33 -00 2026
Mar 3 12:42:25.790198 kernel: KASLR enabled
Mar 3 12:42:25.790203 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 3 12:42:25.790209 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Mar 3 12:42:25.790215 kernel: random: crng init done
Mar 3 12:42:25.790221 kernel: secureboot: Secure boot disabled
Mar 3 12:42:25.790227 kernel: ACPI: Early table checksum verification disabled
Mar 3 12:42:25.790233 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Mar 3 12:42:25.790239 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Mar 3 12:42:25.790246 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790252 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790257 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790263 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790270 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790278 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790284 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790290 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790296 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 12:42:25.790302 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 3 12:42:25.790308 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 3 12:42:25.790314 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 3 12:42:25.790320 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Mar 3 12:42:25.790326 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Mar 3 12:42:25.790332 kernel: Zone ranges:
Mar 3 12:42:25.790338 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Mar 3 12:42:25.790345 kernel: DMA32 empty
Mar 3 12:42:25.790351 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Mar 3 12:42:25.790357 kernel: Device empty
Mar 3 12:42:25.790363 kernel: Movable zone start for each node
Mar 3 12:42:25.790369 kernel: Early memory node ranges
Mar 3 12:42:25.790375 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Mar 3 12:42:25.790381 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Mar 3 12:42:25.790387 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Mar 3 12:42:25.790393 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Mar 3 12:42:25.790399 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Mar 3 12:42:25.790405 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Mar 3 12:42:25.790411 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Mar 3 12:42:25.790418 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Mar 3 12:42:25.790424 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Mar 3 12:42:25.790433 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Mar 3 12:42:25.790439 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 3 12:42:25.790446 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Mar 3 12:42:25.790454 kernel: psci: probing for conduit method from ACPI.
Mar 3 12:42:25.790460 kernel: psci: PSCIv1.1 detected in firmware.
Mar 3 12:42:25.790466 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 3 12:42:25.790473 kernel: psci: Trusted OS migration not required
Mar 3 12:42:25.790479 kernel: psci: SMC Calling Convention v1.1
Mar 3 12:42:25.790486 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 3 12:42:25.790492 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 3 12:42:25.790498 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 3 12:42:25.790505 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 3 12:42:25.790511 kernel: Detected PIPT I-cache on CPU0
Mar 3 12:42:25.790518 kernel: CPU features: detected: GIC system register CPU interface
Mar 3 12:42:25.790525 kernel: CPU features: detected: Spectre-v4
Mar 3 12:42:25.790532 kernel: CPU features: detected: Spectre-BHB
Mar 3 12:42:25.790538 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 3 12:42:25.790544 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 3 12:42:25.790551 kernel: CPU features: detected: ARM erratum 1418040
Mar 3 12:42:25.790587 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 3 12:42:25.790596 kernel: alternatives: applying boot alternatives
Mar 3 12:42:25.790603 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9550c2083f3062ad7c57f28a015a3afab95dfddb073076612b771af8d5df9e06
Mar 3 12:42:25.790610 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 3 12:42:25.790616 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 3 12:42:25.790623 kernel: Fallback order for Node 0: 0
Mar 3 12:42:25.790632 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Mar 3 12:42:25.790638 kernel: Policy zone: Normal
Mar 3 12:42:25.790645 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 3 12:42:25.790651 kernel: software IO TLB: area num 2.
Mar 3 12:42:25.790658 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Mar 3 12:42:25.790664 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 3 12:42:25.790670 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 3 12:42:25.790677 kernel: rcu: RCU event tracing is enabled.
Mar 3 12:42:25.790684 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 3 12:42:25.790691 kernel: Trampoline variant of Tasks RCU enabled.
Mar 3 12:42:25.790697 kernel: Tracing variant of Tasks RCU enabled.
Mar 3 12:42:25.790704 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 3 12:42:25.790712 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 3 12:42:25.790718 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 12:42:25.790725 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 12:42:25.790731 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 3 12:42:25.790737 kernel: GICv3: 256 SPIs implemented
Mar 3 12:42:25.790744 kernel: GICv3: 0 Extended SPIs implemented
Mar 3 12:42:25.790750 kernel: Root IRQ handler: gic_handle_irq
Mar 3 12:42:25.790756 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 3 12:42:25.790763 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Mar 3 12:42:25.790769 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 3 12:42:25.790775 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 3 12:42:25.790783 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Mar 3 12:42:25.790790 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Mar 3 12:42:25.790796 kernel: GICv3: using LPI property table @0x0000000100120000
Mar 3 12:42:25.790803 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Mar 3 12:42:25.790809 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 3 12:42:25.790816 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 3 12:42:25.790822 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 3 12:42:25.790829 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 3 12:42:25.790835 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 3 12:42:25.790842 kernel: Console: colour dummy device 80x25
Mar 3 12:42:25.790848 kernel: ACPI: Core revision 20240827
Mar 3 12:42:25.790856 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 3 12:42:25.790863 kernel: pid_max: default: 32768 minimum: 301
Mar 3 12:42:25.790870 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 3 12:42:25.790876 kernel: landlock: Up and running.
Mar 3 12:42:25.790883 kernel: SELinux: Initializing.
Mar 3 12:42:25.790889 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 3 12:42:25.790897 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 3 12:42:25.790903 kernel: rcu: Hierarchical SRCU implementation.
Mar 3 12:42:25.790910 kernel: rcu: Max phase no-delay instances is 400.
Mar 3 12:42:25.790918 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 3 12:42:25.790925 kernel: Remapping and enabling EFI services.
Mar 3 12:42:25.790931 kernel: smp: Bringing up secondary CPUs ...
Mar 3 12:42:25.790938 kernel: Detected PIPT I-cache on CPU1
Mar 3 12:42:25.790944 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 3 12:42:25.790951 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Mar 3 12:42:25.790958 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 3 12:42:25.790965 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 3 12:42:25.790971 kernel: smp: Brought up 1 node, 2 CPUs
Mar 3 12:42:25.790978 kernel: SMP: Total of 2 processors activated.
Mar 3 12:42:25.790991 kernel: CPU: All CPU(s) started at EL1
Mar 3 12:42:25.790997 kernel: CPU features: detected: 32-bit EL0 Support
Mar 3 12:42:25.791006 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 3 12:42:25.791013 kernel: CPU features: detected: Common not Private translations
Mar 3 12:42:25.791020 kernel: CPU features: detected: CRC32 instructions
Mar 3 12:42:25.791027 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 3 12:42:25.791034 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 3 12:42:25.791042 kernel: CPU features: detected: LSE atomic instructions
Mar 3 12:42:25.791050 kernel: CPU features: detected: Privileged Access Never
Mar 3 12:42:25.791057 kernel: CPU features: detected: RAS Extension Support
Mar 3 12:42:25.791064 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 3 12:42:25.791071 kernel: alternatives: applying system-wide alternatives
Mar 3 12:42:25.791078 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 3 12:42:25.791085 kernel: Memory: 3858852K/4096000K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 215668K reserved, 16384K cma-reserved)
Mar 3 12:42:25.791092 kernel: devtmpfs: initialized
Mar 3 12:42:25.791100 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 3 12:42:25.791237 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 3 12:42:25.791246 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 3 12:42:25.791253 kernel: 0 pages in range for non-PLT usage
Mar 3 12:42:25.791260 kernel: 508400 pages in range for PLT usage
Mar 3 12:42:25.791267 kernel: pinctrl core: initialized pinctrl subsystem
Mar 3 12:42:25.791274 kernel: SMBIOS 3.0.0 present.
Mar 3 12:42:25.791281 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Mar 3 12:42:25.791288 kernel: DMI: Memory slots populated: 1/1
Mar 3 12:42:25.791295 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 3 12:42:25.791304 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 3 12:42:25.791311 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 3 12:42:25.791319 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 3 12:42:25.791326 kernel: audit: initializing netlink subsys (disabled)
Mar 3 12:42:25.791333 kernel: audit: type=2000 audit(0.014:1): state=initialized audit_enabled=0 res=1
Mar 3 12:42:25.791340 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 3 12:42:25.791347 kernel: cpuidle: using governor menu
Mar 3 12:42:25.791354 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 3 12:42:25.791361 kernel: ASID allocator initialised with 32768 entries
Mar 3 12:42:25.791369 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 3 12:42:25.791376 kernel: Serial: AMBA PL011 UART driver
Mar 3 12:42:25.791383 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 3 12:42:25.791390 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 3 12:42:25.791397 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 3 12:42:25.791404 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 3 12:42:25.791411 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 3 12:42:25.791417 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 3 12:42:25.791424 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 3 12:42:25.791433 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 3 12:42:25.791440 kernel: ACPI: Added _OSI(Module Device)
Mar 3 12:42:25.791447 kernel: ACPI: Added _OSI(Processor Device)
Mar 3 12:42:25.791454 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 3 12:42:25.791461 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 3 12:42:25.791468 kernel: ACPI: Interpreter enabled
Mar 3 12:42:25.791474 kernel: ACPI: Using GIC for interrupt routing
Mar 3 12:42:25.791481 kernel: ACPI: MCFG table detected, 1 entries
Mar 3 12:42:25.791489 kernel: ACPI: CPU0 has been hot-added
Mar 3 12:42:25.791496 kernel: ACPI: CPU1 has been hot-added
Mar 3 12:42:25.791504 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 3 12:42:25.791511 kernel: printk: legacy console [ttyAMA0] enabled
Mar 3 12:42:25.791518 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 3 12:42:25.791697 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 3 12:42:25.791765 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 3 12:42:25.791824 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 3 12:42:25.791881 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 3 12:42:25.791944 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 3 12:42:25.791953 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 3 12:42:25.791960 kernel: PCI host bridge to bus 0000:00
Mar 3 12:42:25.792024 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 3 12:42:25.792078 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 3 12:42:25.792386 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 3 12:42:25.792452 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 3 12:42:25.793016 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Mar 3 12:42:25.793129 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Mar 3 12:42:25.793226 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Mar 3 12:42:25.793291 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Mar 3 12:42:25.793360 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.793420 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Mar 3 12:42:25.793485 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 3 12:42:25.793545 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Mar 3 12:42:25.793649 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Mar 3 12:42:25.793721 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.793782 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Mar 3 12:42:25.793841 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 3 12:42:25.793904 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Mar 3 12:42:25.793976 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.794044 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Mar 3 12:42:25.794150 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 3 12:42:25.794216 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Mar 3 12:42:25.794275 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Mar 3 12:42:25.794344 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.794404 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Mar 3 12:42:25.794466 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 3 12:42:25.794524 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Mar 3 12:42:25.794595 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Mar 3 12:42:25.794662 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.794724 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Mar 3 12:42:25.794782 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 3 12:42:25.794840 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Mar 3 12:42:25.794900 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Mar 3 12:42:25.794966 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.795025 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Mar 3 12:42:25.795083 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 3 12:42:25.795158 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Mar 3 12:42:25.795225 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Mar 3 12:42:25.795303 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.795368 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Mar 3 12:42:25.795427 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 3 12:42:25.795493 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Mar 3 12:42:25.795553 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Mar 3 12:42:25.795636 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.795697 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Mar 3 12:42:25.795759 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 3 12:42:25.795818 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Mar 3 12:42:25.795884 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 12:42:25.795943 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Mar 3 12:42:25.796001 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 3 12:42:25.796059 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Mar 3 12:42:25.796155 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Mar 3 12:42:25.796221 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Mar 3 12:42:25.796292 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 3 12:42:25.796356 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Mar 3 12:42:25.796417 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 3 12:42:25.796478 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 3 12:42:25.796546 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 3 12:42:25.796644 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Mar 3 12:42:25.796728 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 3 12:42:25.796793 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Mar 3 12:42:25.796856 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 3 12:42:25.796926 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 3 12:42:25.796990 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 3 12:42:25.797061 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 3 12:42:25.797172 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]
Mar 3 12:42:25.797240 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 3 12:42:25.797310 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 3 12:42:25.799198 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Mar 3 12:42:25.799297 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 3 12:42:25.799378 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 3 12:42:25.799444 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Mar 3 12:42:25.799515 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Mar 3 12:42:25.799596 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 3 12:42:25.799671 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Mar 3 12:42:25.799739 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Mar 3 12:42:25.799799 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Mar 3 12:42:25.799865 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Mar 3 12:42:25.799924 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Mar 3 12:42:25.799986 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Mar 3 12:42:25.800049 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 3 12:42:25.800143 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Mar 3 12:42:25.800207 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Mar 3 12:42:25.800271 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 3 12:42:25.800333 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Mar 3 12:42:25.800392 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Mar 3 12:42:25.800458 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 3 12:42:25.800533 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Mar 3 12:42:25.801202 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Mar 3 12:42:25.801301 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 3 12:42:25.801364 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Mar 3 12:42:25.801424 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Mar 3 12:42:25.801487 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 3 12:42:25.801554 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Mar 3 12:42:25.801655 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Mar 3 12:42:25.801720 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 3 12:42:25.801781 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Mar 3 12:42:25.801840 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Mar 3 12:42:25.801903 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 3 12:42:25.801966 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Mar 3 12:42:25.802025 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Mar 3 12:42:25.802086 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Mar 3 12:42:25.803022 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Mar 3 12:42:25.803102 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Mar 3 12:42:25.803187 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Mar 3 12:42:25.803250 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Mar 3 12:42:25.803310 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Mar 3 12:42:25.803377 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Mar 3 12:42:25.803436 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Mar 3 12:42:25.803496 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Mar 3 12:42:25.803555 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Mar 3 12:42:25.803632 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Mar 3 12:42:25.803692 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Mar 3 12:42:25.803763 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Mar 3 12:42:25.803826 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Mar 3 12:42:25.803887 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Mar 3 12:42:25.803947 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Mar 3 12:42:25.804008 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Mar 3 12:42:25.804066 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Mar 3 12:42:25.804150 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Mar 3 12:42:25.804213 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Mar 3 12:42:25.804273 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Mar 3 12:42:25.804335 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Mar 3 12:42:25.804393 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Mar 3 12:42:25.804452 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Mar 3 12:42:25.804514 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Mar 3 12:42:25.804607 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Mar 3 12:42:25.804676 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Mar 3 12:42:25.804735 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Mar 3 12:42:25.804794 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Mar 3 12:42:25.804852 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Mar 3 12:42:25.804912 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Mar 3 12:42:25.804971 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Mar 3 12:42:25.805030 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Mar 3 12:42:25.805088 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Mar 3 12:42:25.805461 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Mar 3 12:42:25.805537 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Mar 3 12:42:25.805618 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Mar 3 12:42:25.805685 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Mar 3 12:42:25.805750 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Mar 3 12:42:25.805819 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Mar 3 12:42:25.805881 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Mar 3 12:42:25.805949 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Mar 3 12:42:25.807152 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 3 12:42:25.807242 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 3 12:42:25.807303 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Mar 3 12:42:25.807363 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 3 12:42:25.807430 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Mar 3 12:42:25.807491 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 3 12:42:25.807551 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 3 12:42:25.807634 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Mar 3 12:42:25.807694 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 3 12:42:25.807761 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Mar 3 12:42:25.807823 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Mar 3 12:42:25.807883 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 3 12:42:25.807943 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 3 12:42:25.808005 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Mar 3 12:42:25.808063 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 3 12:42:25.808152 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Mar 3 12:42:25.809237 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 3 12:42:25.809302 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 3 12:42:25.809363 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Mar 3 12:42:25.809422 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 3 12:42:25.809496 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Mar 3 12:42:25.809571 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Mar 3 12:42:25.809641 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 3 12:42:25.809783 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 3 12:42:25.809851 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Mar 3 12:42:25.809912 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 3 12:42:25.809978 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Mar 3 12:42:25.810044 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Mar 3 12:42:25.810120 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 3 12:42:25.812161 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 3 12:42:25.812267 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Mar 3 12:42:25.812337 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 3 12:42:25.812407 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Mar 3 12:42:25.812470 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Mar 3 12:42:25.812536 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Mar 3 12:42:25.812614 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 3 12:42:25.812682 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 3 12:42:25.812741 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Mar 3 12:42:25.812802 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 3 12:42:25.812863 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 3 12:42:25.812922 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 3 12:42:25.812981 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Mar 3 12:42:25.813047 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 3 12:42:25.813145 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 3 12:42:25.813217 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Mar 3 12:42:25.813279 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Mar 3 12:42:25.813346 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 3 12:42:25.813422 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Mar 3 12:42:25.813483 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 3 12:42:25.813540 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Mar 3 12:42:25.813644 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 3 12:42:25.813707 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Mar 3 12:42:25.813769 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 3 12:42:25.813833 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Mar 3 12:42:25.813892 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Mar 3 12:42:25.813949 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 3 12:42:25.814013 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Mar 3 12:42:25.814077 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Mar 3 12:42:25.814154 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 3 12:42:25.814226 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Mar 3 12:42:25.814284 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Mar 3 12:42:25.814341 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 3 12:42:25.814406 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Mar 3 12:42:25.814463 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Mar 3 12:42:25.814521 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 3 12:42:25.814600 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Mar 3 12:42:25.814665 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Mar 3 12:42:25.814723 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 3 12:42:25.814786 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Mar 3 12:42:25.814845 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Mar 3 12:42:25.814903 kernel: pci_bus 
0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 3 12:42:25.814971 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Mar 3 12:42:25.815032 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Mar 3 12:42:25.815090 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 3 12:42:25.815498 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Mar 3 12:42:25.815578 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Mar 3 12:42:25.815640 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Mar 3 12:42:25.815650 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 3 12:42:25.815658 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 3 12:42:25.815669 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 3 12:42:25.815677 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 3 12:42:25.815684 kernel: iommu: Default domain type: Translated Mar 3 12:42:25.815692 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 3 12:42:25.815699 kernel: efivars: Registered efivars operations Mar 3 12:42:25.815707 kernel: vgaarb: loaded Mar 3 12:42:25.815728 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 3 12:42:25.815737 kernel: VFS: Disk quotas dquot_6.6.0 Mar 3 12:42:25.815744 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 3 12:42:25.815754 kernel: pnp: PnP ACPI init Mar 3 12:42:25.815830 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 3 12:42:25.815843 kernel: pnp: PnP ACPI: found 1 devices Mar 3 12:42:25.815851 kernel: NET: Registered PF_INET protocol family Mar 3 12:42:25.816180 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 3 12:42:25.816192 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 3 12:42:25.816200 kernel: Table-perturb hash table entries: 65536 
(order: 6, 262144 bytes, linear) Mar 3 12:42:25.816207 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 3 12:42:25.816219 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 3 12:42:25.816227 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 3 12:42:25.816234 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 3 12:42:25.816242 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 3 12:42:25.816249 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 3 12:42:25.816346 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Mar 3 12:42:25.816358 kernel: PCI: CLS 0 bytes, default 64 Mar 3 12:42:25.816366 kernel: kvm [1]: HYP mode not available Mar 3 12:42:25.816373 kernel: Initialise system trusted keyrings Mar 3 12:42:25.816383 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 3 12:42:25.816391 kernel: Key type asymmetric registered Mar 3 12:42:25.816398 kernel: Asymmetric key parser 'x509' registered Mar 3 12:42:25.816405 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 3 12:42:25.816413 kernel: io scheduler mq-deadline registered Mar 3 12:42:25.816420 kernel: io scheduler kyber registered Mar 3 12:42:25.816428 kernel: io scheduler bfq registered Mar 3 12:42:25.816436 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 3 12:42:25.816501 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Mar 3 12:42:25.816616 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Mar 3 12:42:25.817271 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.817350 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Mar 3 12:42:25.817411 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Mar 3 12:42:25.817471 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ 
PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.817535 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Mar 3 12:42:25.817614 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Mar 3 12:42:25.817677 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.817746 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Mar 3 12:42:25.817808 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Mar 3 12:42:25.817867 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.819235 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Mar 3 12:42:25.819315 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Mar 3 12:42:25.819379 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.819444 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Mar 3 12:42:25.819515 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Mar 3 12:42:25.819626 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.819695 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Mar 3 12:42:25.819755 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Mar 3 12:42:25.819815 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.819876 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Mar 3 12:42:25.819936 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Mar 3 12:42:25.819995 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ 
HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.820009 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Mar 3 12:42:25.820069 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Mar 3 12:42:25.820147 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Mar 3 12:42:25.820213 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 3 12:42:25.820223 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 3 12:42:25.820231 kernel: ACPI: button: Power Button [PWRB] Mar 3 12:42:25.820238 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 3 12:42:25.820304 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Mar 3 12:42:25.820370 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Mar 3 12:42:25.820384 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 3 12:42:25.820392 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 3 12:42:25.820458 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Mar 3 12:42:25.820469 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Mar 3 12:42:25.820476 kernel: thunder_xcv, ver 1.0 Mar 3 12:42:25.820484 kernel: thunder_bgx, ver 1.0 Mar 3 12:42:25.820491 kernel: nicpf, ver 1.0 Mar 3 12:42:25.820499 kernel: nicvf, ver 1.0 Mar 3 12:42:25.820587 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 3 12:42:25.820653 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-03T12:42:25 UTC (1772541745) Mar 3 12:42:25.820663 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 3 12:42:25.820672 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Mar 3 12:42:25.820680 kernel: NET: Registered PF_INET6 protocol family Mar 3 12:42:25.820687 kernel: watchdog: NMI not fully supported Mar 3 12:42:25.820695 kernel: watchdog: Hard watchdog permanently disabled Mar 3 
12:42:25.820702 kernel: Segment Routing with IPv6 Mar 3 12:42:25.820710 kernel: In-situ OAM (IOAM) with IPv6 Mar 3 12:42:25.820718 kernel: NET: Registered PF_PACKET protocol family Mar 3 12:42:25.820726 kernel: Key type dns_resolver registered Mar 3 12:42:25.820733 kernel: registered taskstats version 1 Mar 3 12:42:25.820741 kernel: Loading compiled-in X.509 certificates Mar 3 12:42:25.820748 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 14a741e1e2b172e51b42fe87d143cf4cae2ad92c' Mar 3 12:42:25.820756 kernel: Demotion targets for Node 0: null Mar 3 12:42:25.820763 kernel: Key type .fscrypt registered Mar 3 12:42:25.820770 kernel: Key type fscrypt-provisioning registered Mar 3 12:42:25.820777 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 3 12:42:25.820786 kernel: ima: Allocated hash algorithm: sha1 Mar 3 12:42:25.820794 kernel: ima: No architecture policies found Mar 3 12:42:25.820801 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 3 12:42:25.820808 kernel: clk: Disabling unused clocks Mar 3 12:42:25.820816 kernel: PM: genpd: Disabling unused power domains Mar 3 12:42:25.820823 kernel: Warning: unable to open an initial console. Mar 3 12:42:25.820831 kernel: Freeing unused kernel memory: 39552K Mar 3 12:42:25.820838 kernel: Run /init as init process Mar 3 12:42:25.820845 kernel: with arguments: Mar 3 12:42:25.820854 kernel: /init Mar 3 12:42:25.820861 kernel: with environment: Mar 3 12:42:25.820868 kernel: HOME=/ Mar 3 12:42:25.820876 kernel: TERM=linux Mar 3 12:42:25.820884 systemd[1]: Successfully made /usr/ read-only. 
Mar 3 12:42:25.820895 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 3 12:42:25.820904 systemd[1]: Detected virtualization kvm. Mar 3 12:42:25.820911 systemd[1]: Detected architecture arm64. Mar 3 12:42:25.820920 systemd[1]: Running in initrd. Mar 3 12:42:25.820928 systemd[1]: No hostname configured, using default hostname. Mar 3 12:42:25.820936 systemd[1]: Hostname set to . Mar 3 12:42:25.820944 systemd[1]: Initializing machine ID from VM UUID. Mar 3 12:42:25.820951 systemd[1]: Queued start job for default target initrd.target. Mar 3 12:42:25.820959 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 3 12:42:25.820967 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 3 12:42:25.820976 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 3 12:42:25.820985 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 3 12:42:25.820993 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 3 12:42:25.821002 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 3 12:42:25.821011 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 3 12:42:25.821019 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 3 12:42:25.821027 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Mar 3 12:42:25.821036 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 3 12:42:25.821044 systemd[1]: Reached target paths.target - Path Units. Mar 3 12:42:25.821052 systemd[1]: Reached target slices.target - Slice Units. Mar 3 12:42:25.821059 systemd[1]: Reached target swap.target - Swaps. Mar 3 12:42:25.821067 systemd[1]: Reached target timers.target - Timer Units. Mar 3 12:42:25.821075 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 3 12:42:25.821083 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 3 12:42:25.821091 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 3 12:42:25.821099 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 3 12:42:25.825189 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 3 12:42:25.825204 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 3 12:42:25.825212 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 3 12:42:25.825220 systemd[1]: Reached target sockets.target - Socket Units. Mar 3 12:42:25.825228 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 3 12:42:25.825236 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 3 12:42:25.825244 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 3 12:42:25.825253 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 3 12:42:25.825281 systemd[1]: Starting systemd-fsck-usr.service... Mar 3 12:42:25.825291 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 3 12:42:25.825300 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 3 12:42:25.825307 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 12:42:25.825315 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 3 12:42:25.825324 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 3 12:42:25.825368 systemd-journald[245]: Collecting audit messages is disabled. Mar 3 12:42:25.825389 systemd[1]: Finished systemd-fsck-usr.service. Mar 3 12:42:25.825398 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 3 12:42:25.825408 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 3 12:42:25.825416 kernel: Bridge firewalling registered Mar 3 12:42:25.825424 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 3 12:42:25.825432 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 3 12:42:25.825440 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 3 12:42:25.825448 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 12:42:25.825457 systemd-journald[245]: Journal started Mar 3 12:42:25.825476 systemd-journald[245]: Runtime Journal (/run/log/journal/45a9ff0c9e984dc188c47eba8bd45427) is 8M, max 76.5M, 68.5M free. Mar 3 12:42:25.780043 systemd-modules-load[247]: Inserted module 'overlay' Mar 3 12:42:25.802155 systemd-modules-load[247]: Inserted module 'br_netfilter' Mar 3 12:42:25.828479 systemd[1]: Started systemd-journald.service - Journal Service. Mar 3 12:42:25.830137 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 3 12:42:25.834076 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 3 12:42:25.835295 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 3 12:42:25.840683 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 3 12:42:25.858146 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 3 12:42:25.860087 systemd-tmpfiles[270]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 3 12:42:25.865468 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 3 12:42:25.868279 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 3 12:42:25.880318 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 3 12:42:25.884430 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 3 12:42:25.914900 systemd-resolved[281]: Positive Trust Anchors: Mar 3 12:42:25.915358 systemd-resolved[281]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 3 12:42:25.915392 systemd-resolved[281]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 3 12:42:25.923260 dracut-cmdline[286]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9550c2083f3062ad7c57f28a015a3afab95dfddb073076612b771af8d5df9e06 Mar 3 12:42:25.922594 systemd-resolved[281]: Defaulting to hostname 'linux'. Mar 3 12:42:25.924308 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 3 12:42:25.925017 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 3 12:42:26.026127 kernel: SCSI subsystem initialized Mar 3 12:42:26.030162 kernel: Loading iSCSI transport class v2.0-870. Mar 3 12:42:26.038132 kernel: iscsi: registered transport (tcp) Mar 3 12:42:26.051216 kernel: iscsi: registered transport (qla4xxx) Mar 3 12:42:26.051286 kernel: QLogic iSCSI HBA Driver Mar 3 12:42:26.075140 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 3 12:42:26.113297 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. 
Mar 3 12:42:26.117670 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 3 12:42:26.174494 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 3 12:42:26.178509 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 3 12:42:26.244175 kernel: raid6: neonx8 gen() 11674 MB/s Mar 3 12:42:26.261162 kernel: raid6: neonx4 gen() 15645 MB/s Mar 3 12:42:26.278164 kernel: raid6: neonx2 gen() 12868 MB/s Mar 3 12:42:26.295183 kernel: raid6: neonx1 gen() 10340 MB/s Mar 3 12:42:26.312171 kernel: raid6: int64x8 gen() 6842 MB/s Mar 3 12:42:26.329183 kernel: raid6: int64x4 gen() 7291 MB/s Mar 3 12:42:26.346174 kernel: raid6: int64x2 gen() 6052 MB/s Mar 3 12:42:26.363182 kernel: raid6: int64x1 gen() 4965 MB/s Mar 3 12:42:26.363272 kernel: raid6: using algorithm neonx4 gen() 15645 MB/s Mar 3 12:42:26.380175 kernel: raid6: .... xor() 12257 MB/s, rmw enabled Mar 3 12:42:26.380245 kernel: raid6: using neon recovery algorithm Mar 3 12:42:26.385389 kernel: xor: measuring software checksum speed Mar 3 12:42:26.385474 kernel: 8regs : 20619 MB/sec Mar 3 12:42:26.385499 kernel: 32regs : 21710 MB/sec Mar 3 12:42:26.385521 kernel: arm64_neon : 28167 MB/sec Mar 3 12:42:26.386168 kernel: xor: using function: arm64_neon (28167 MB/sec) Mar 3 12:42:26.440163 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 3 12:42:26.450287 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 3 12:42:26.453290 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 3 12:42:26.482341 systemd-udevd[493]: Using default interface naming scheme 'v255'. Mar 3 12:42:26.486686 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 3 12:42:26.491468 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 3 12:42:26.518287 dracut-pre-trigger[503]: rd.md=0: removing MD RAID activation Mar 3 12:42:26.550297 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 3 12:42:26.553404 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 3 12:42:26.623337 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 3 12:42:26.627319 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 3 12:42:26.714128 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Mar 3 12:42:26.718182 kernel: scsi host0: Virtio SCSI HBA Mar 3 12:42:26.727141 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 3 12:42:26.729165 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 3 12:42:26.740127 kernel: ACPI: bus type USB registered Mar 3 12:42:26.740180 kernel: usbcore: registered new interface driver usbfs Mar 3 12:42:26.743129 kernel: usbcore: registered new interface driver hub Mar 3 12:42:26.743177 kernel: usbcore: registered new device driver usb Mar 3 12:42:26.753019 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 12:42:26.753598 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 12:42:26.756942 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 12:42:26.758981 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 12:42:26.773292 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 3 12:42:26.793369 kernel: sd 0:0:0:1: Power-on or device reset occurred Mar 3 12:42:26.793577 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Mar 3 12:42:26.795810 kernel: sd 0:0:0:1: [sda] Write Protect is off Mar 3 12:42:26.795992 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Mar 3 12:42:26.796078 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 3 12:42:26.796220 kernel: sr 0:0:0:0: Power-on or device reset occurred Mar 3 12:42:26.797618 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 12:42:26.800649 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Mar 3 12:42:26.800782 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 3 12:42:26.800793 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Mar 3 12:42:26.803351 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 3 12:42:26.803378 kernel: GPT:17805311 != 80003071 Mar 3 12:42:26.803398 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 3 12:42:26.803409 kernel: GPT:17805311 != 80003071 Mar 3 12:42:26.804288 kernel: GPT: Use GNU Parted to correct GPT errors. 
Mar 3 12:42:26.804336 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 3 12:42:26.805134 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Mar 3 12:42:26.810128 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 3 12:42:26.810302 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 3 12:42:26.812676 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 3 12:42:26.812857 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 3 12:42:26.812940 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 3 12:42:26.813576 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 3 12:42:26.814139 kernel: hub 1-0:1.0: USB hub found Mar 3 12:42:26.815352 kernel: hub 1-0:1.0: 4 ports detected Mar 3 12:42:26.815472 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 3 12:42:26.816119 kernel: hub 2-0:1.0: USB hub found Mar 3 12:42:26.817127 kernel: hub 2-0:1.0: 4 ports detected Mar 3 12:42:26.876957 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Mar 3 12:42:26.888771 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 3 12:42:26.895855 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 3 12:42:26.896625 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Mar 3 12:42:26.905181 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 3 12:42:26.907531 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 3 12:42:26.915161 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 3 12:42:26.916460 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Mar 3 12:42:26.917354 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 3 12:42:26.918927 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 3 12:42:26.926532 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 3 12:42:26.933987 disk-uuid[599]: Primary Header is updated. Mar 3 12:42:26.933987 disk-uuid[599]: Secondary Entries is updated. Mar 3 12:42:26.933987 disk-uuid[599]: Secondary Header is updated. Mar 3 12:42:26.945836 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 3 12:42:26.955421 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 3 12:42:27.053142 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 3 12:42:27.185892 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Mar 3 12:42:27.185951 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 3 12:42:27.187143 kernel: usbcore: registered new interface driver usbhid Mar 3 12:42:27.187178 kernel: usbhid: USB HID core driver Mar 3 12:42:27.293152 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Mar 3 12:42:27.419139 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Mar 3 12:42:27.472189 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Mar 3 12:42:27.967887 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 3 12:42:27.969102 disk-uuid[600]: The operation has completed successfully. Mar 3 12:42:28.019634 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 3 12:42:28.019732 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. 
Mar 3 12:42:28.050136 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 3 12:42:28.077062 sh[626]: Success Mar 3 12:42:28.092231 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 3 12:42:28.092291 kernel: device-mapper: uevent: version 1.0.3 Mar 3 12:42:28.093124 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 3 12:42:28.101154 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 3 12:42:28.152574 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 3 12:42:28.156494 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 3 12:42:28.172654 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 3 12:42:28.183141 kernel: BTRFS: device fsid 639fb782-fb4f-4fdd-a572-72667a093996 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (638) Mar 3 12:42:28.186131 kernel: BTRFS info (device dm-0): first mount of filesystem 639fb782-fb4f-4fdd-a572-72667a093996 Mar 3 12:42:28.186176 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 3 12:42:28.193596 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations Mar 3 12:42:28.193652 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 3 12:42:28.193678 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 3 12:42:28.196486 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 3 12:42:28.198443 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 3 12:42:28.200588 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 3 12:42:28.201832 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Mar 3 12:42:28.206377 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 3 12:42:28.237189 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (671)
Mar 3 12:42:28.239150 kernel: BTRFS info (device sda6): first mount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 3 12:42:28.239203 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 3 12:42:28.244162 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 12:42:28.244218 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 12:42:28.244228 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 12:42:28.250173 kernel: BTRFS info (device sda6): last unmount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 3 12:42:28.252376 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 3 12:42:28.254283 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 3 12:42:28.348966 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 3 12:42:28.351429 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 3 12:42:28.385836 systemd-networkd[812]: lo: Link UP
Mar 3 12:42:28.385848 systemd-networkd[812]: lo: Gained carrier
Mar 3 12:42:28.387836 systemd-networkd[812]: Enumeration completed
Mar 3 12:42:28.387930 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 3 12:42:28.388725 systemd[1]: Reached target network.target - Network.
Mar 3 12:42:28.389709 systemd-networkd[812]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:28.389713 systemd-networkd[812]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 12:42:28.390961 systemd-networkd[812]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:28.390964 systemd-networkd[812]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 12:42:28.391824 systemd-networkd[812]: eth0: Link UP
Mar 3 12:42:28.392043 systemd-networkd[812]: eth1: Link UP
Mar 3 12:42:28.392632 systemd-networkd[812]: eth0: Gained carrier
Mar 3 12:42:28.392642 systemd-networkd[812]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:28.400333 systemd-networkd[812]: eth1: Gained carrier
Mar 3 12:42:28.400354 systemd-networkd[812]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:28.403393 ignition[721]: Ignition 2.22.0
Mar 3 12:42:28.403411 ignition[721]: Stage: fetch-offline
Mar 3 12:42:28.403441 ignition[721]: no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:28.403450 ignition[721]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:28.403538 ignition[721]: parsed url from cmdline: ""
Mar 3 12:42:28.403541 ignition[721]: no config URL provided
Mar 3 12:42:28.403546 ignition[721]: reading system config file "/usr/lib/ignition/user.ign"
Mar 3 12:42:28.403554 ignition[721]: no config at "/usr/lib/ignition/user.ign"
Mar 3 12:42:28.406767 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 3 12:42:28.403559 ignition[721]: failed to fetch config: resource requires networking
Mar 3 12:42:28.408808 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 3 12:42:28.403895 ignition[721]: Ignition finished successfully
Mar 3 12:42:28.438433 ignition[816]: Ignition 2.22.0
Mar 3 12:42:28.438449 ignition[816]: Stage: fetch
Mar 3 12:42:28.438614 ignition[816]: no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:28.438624 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:28.438702 ignition[816]: parsed url from cmdline: ""
Mar 3 12:42:28.440196 systemd-networkd[812]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 3 12:42:28.438705 ignition[816]: no config URL provided
Mar 3 12:42:28.438710 ignition[816]: reading system config file "/usr/lib/ignition/user.ign"
Mar 3 12:42:28.438716 ignition[816]: no config at "/usr/lib/ignition/user.ign"
Mar 3 12:42:28.438744 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 3 12:42:28.439418 ignition[816]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 3 12:42:28.444215 systemd-networkd[812]: eth0: DHCPv4 address 78.47.249.221/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 3 12:42:28.639956 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 3 12:42:28.648586 ignition[816]: GET result: OK
Mar 3 12:42:28.649397 ignition[816]: parsing config with SHA512: 54fbfe99164d57eca917e315613bdf8afdace243ca432ae01dfd36d07f5c2a6a0969d8733e48a1e0194970929e27062e82a7d2f43d933822b3d5cc6a7d821e28
Mar 3 12:42:28.657377 unknown[816]: fetched base config from "system"
Mar 3 12:42:28.657393 unknown[816]: fetched base config from "system"
Mar 3 12:42:28.657828 ignition[816]: fetch: fetch complete
Mar 3 12:42:28.657397 unknown[816]: fetched user config from "hetzner"
Mar 3 12:42:28.657835 ignition[816]: fetch: fetch passed
Mar 3 12:42:28.659940 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 3 12:42:28.657888 ignition[816]: Ignition finished successfully
Mar 3 12:42:28.662808 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 3 12:42:28.694042 ignition[823]: Ignition 2.22.0
Mar 3 12:42:28.694061 ignition[823]: Stage: kargs
Mar 3 12:42:28.694223 ignition[823]: no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:28.694232 ignition[823]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:28.695228 ignition[823]: kargs: kargs passed
Mar 3 12:42:28.695281 ignition[823]: Ignition finished successfully
Mar 3 12:42:28.700863 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 3 12:42:28.703804 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 3 12:42:28.739782 ignition[829]: Ignition 2.22.0
Mar 3 12:42:28.739797 ignition[829]: Stage: disks
Mar 3 12:42:28.740012 ignition[829]: no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:28.740022 ignition[829]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:28.740898 ignition[829]: disks: disks passed
Mar 3 12:42:28.743085 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 3 12:42:28.740949 ignition[829]: Ignition finished successfully
Mar 3 12:42:28.745254 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 3 12:42:28.746359 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 3 12:42:28.747647 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 3 12:42:28.748809 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 3 12:42:28.750065 systemd[1]: Reached target basic.target - Basic System.
Mar 3 12:42:28.752035 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 3 12:42:28.781874 systemd-fsck[838]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 3 12:42:28.785273 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 3 12:42:28.788175 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 3 12:42:28.860145 kernel: EXT4-fs (sda9): mounted filesystem f44cfd4f-a1a9-472a-86a7-c3154f299e07 r/w with ordered data mode. Quota mode: none.
Mar 3 12:42:28.861891 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 3 12:42:28.864187 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 3 12:42:28.866539 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 3 12:42:28.867820 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 3 12:42:28.871345 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 3 12:42:28.873805 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 3 12:42:28.875146 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 3 12:42:28.882824 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 3 12:42:28.887270 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 3 12:42:28.905255 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (846)
Mar 3 12:42:28.905323 kernel: BTRFS info (device sda6): first mount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 3 12:42:28.908791 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 3 12:42:28.916813 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 12:42:28.916866 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 12:42:28.916876 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 12:42:28.922853 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 3 12:42:28.936400 coreos-metadata[848]: Mar 03 12:42:28.936 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 3 12:42:28.938251 coreos-metadata[848]: Mar 03 12:42:28.937 INFO Fetch successful
Mar 3 12:42:28.938251 coreos-metadata[848]: Mar 03 12:42:28.938 INFO wrote hostname ci-4459-2-4-8-fcaab3b7ef to /sysroot/etc/hostname
Mar 3 12:42:28.944240 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 3 12:42:28.954533 initrd-setup-root[874]: cut: /sysroot/etc/passwd: No such file or directory
Mar 3 12:42:28.962087 initrd-setup-root[881]: cut: /sysroot/etc/group: No such file or directory
Mar 3 12:42:28.966833 initrd-setup-root[888]: cut: /sysroot/etc/shadow: No such file or directory
Mar 3 12:42:28.971479 initrd-setup-root[895]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 3 12:42:29.073881 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 3 12:42:29.075855 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 3 12:42:29.077082 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 3 12:42:29.101141 kernel: BTRFS info (device sda6): last unmount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 3 12:42:29.119352 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 3 12:42:29.131953 ignition[965]: INFO : Ignition 2.22.0
Mar 3 12:42:29.134007 ignition[965]: INFO : Stage: mount
Mar 3 12:42:29.134007 ignition[965]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:29.134007 ignition[965]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:29.134007 ignition[965]: INFO : mount: mount passed
Mar 3 12:42:29.134007 ignition[965]: INFO : Ignition finished successfully
Mar 3 12:42:29.136762 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 3 12:42:29.138588 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 3 12:42:29.184945 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 3 12:42:29.187870 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 3 12:42:29.214617 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (975)
Mar 3 12:42:29.214690 kernel: BTRFS info (device sda6): first mount of filesystem 5bcc6201-9983-4e1f-9352-8a67e2a2e71d
Mar 3 12:42:29.214717 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 3 12:42:29.220288 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 12:42:29.220337 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 12:42:29.220360 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 12:42:29.224832 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 3 12:42:29.264513 ignition[993]: INFO : Ignition 2.22.0
Mar 3 12:42:29.265378 ignition[993]: INFO : Stage: files
Mar 3 12:42:29.266032 ignition[993]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:29.267942 ignition[993]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:29.267942 ignition[993]: DEBUG : files: compiled without relabeling support, skipping
Mar 3 12:42:29.269840 ignition[993]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 3 12:42:29.270614 ignition[993]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 3 12:42:29.273715 ignition[993]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 3 12:42:29.275487 ignition[993]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 3 12:42:29.277482 unknown[993]: wrote ssh authorized keys file for user: core
Mar 3 12:42:29.278840 ignition[993]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 3 12:42:29.280794 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 3 12:42:29.282084 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 3 12:42:29.378933 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 3 12:42:29.443352 systemd-networkd[812]: eth1: Gained IPv6LL
Mar 3 12:42:29.475556 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 3 12:42:29.477148 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 3 12:42:29.485992 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 3 12:42:29.485992 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 3 12:42:29.485992 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 3 12:42:29.485992 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 3 12:42:29.485992 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 3 12:42:29.485992 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1
Mar 3 12:42:29.848533 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 3 12:42:30.083518 systemd-networkd[812]: eth0: Gained IPv6LL
Mar 3 12:42:30.302932 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Mar 3 12:42:30.302932 ignition[993]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 3 12:42:30.306603 ignition[993]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 3 12:42:30.309317 ignition[993]: INFO : files: files passed
Mar 3 12:42:30.309317 ignition[993]: INFO : Ignition finished successfully
Mar 3 12:42:30.310453 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 3 12:42:30.313537 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 3 12:42:30.319366 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 3 12:42:30.342234 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 3 12:42:30.342470 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 3 12:42:30.349139 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 12:42:30.349139 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 12:42:30.352483 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 12:42:30.354564 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 3 12:42:30.355512 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 3 12:42:30.358268 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 3 12:42:30.439798 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 3 12:42:30.440008 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 3 12:42:30.442907 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 3 12:42:30.444094 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 3 12:42:30.445707 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 3 12:42:30.446895 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 3 12:42:30.483766 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 3 12:42:30.486463 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 3 12:42:30.514121 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 3 12:42:30.515653 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 12:42:30.516576 systemd[1]: Stopped target timers.target - Timer Units.
Mar 3 12:42:30.517786 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 3 12:42:30.517964 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 3 12:42:30.519545 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 3 12:42:30.520806 systemd[1]: Stopped target basic.target - Basic System.
Mar 3 12:42:30.521732 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 3 12:42:30.522748 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 3 12:42:30.523917 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 3 12:42:30.525024 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 3 12:42:30.526074 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 3 12:42:30.527079 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 3 12:42:30.528233 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 3 12:42:30.529347 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 3 12:42:30.530257 systemd[1]: Stopped target swap.target - Swaps.
Mar 3 12:42:30.531141 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 3 12:42:30.531324 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 3 12:42:30.532610 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 3 12:42:30.533742 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 3 12:42:30.534811 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 3 12:42:30.534923 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 3 12:42:30.535962 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 3 12:42:30.536150 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 3 12:42:30.537599 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 3 12:42:30.537757 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 3 12:42:30.538805 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 3 12:42:30.538941 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 3 12:42:30.539871 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 3 12:42:30.540008 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 3 12:42:30.542633 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 3 12:42:30.543702 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 3 12:42:30.545477 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 3 12:42:30.549089 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 3 12:42:30.549650 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 3 12:42:30.549813 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 3 12:42:30.551142 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 3 12:42:30.551285 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 3 12:42:30.560050 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 3 12:42:30.564339 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 3 12:42:30.578089 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 3 12:42:30.582519 ignition[1046]: INFO : Ignition 2.22.0
Mar 3 12:42:30.582519 ignition[1046]: INFO : Stage: umount
Mar 3 12:42:30.587920 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 12:42:30.587920 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 12:42:30.587920 ignition[1046]: INFO : umount: umount passed
Mar 3 12:42:30.587920 ignition[1046]: INFO : Ignition finished successfully
Mar 3 12:42:30.589032 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 3 12:42:30.589237 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 3 12:42:30.593898 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 3 12:42:30.593950 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 3 12:42:30.601547 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 3 12:42:30.601635 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 3 12:42:30.603002 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 3 12:42:30.603048 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 3 12:42:30.604301 systemd[1]: Stopped target network.target - Network.
Mar 3 12:42:30.605896 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 3 12:42:30.605952 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 3 12:42:30.607693 systemd[1]: Stopped target paths.target - Path Units.
Mar 3 12:42:30.608716 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 3 12:42:30.608765 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 3 12:42:30.614821 systemd[1]: Stopped target slices.target - Slice Units.
Mar 3 12:42:30.615787 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 3 12:42:30.617012 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 3 12:42:30.617133 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 3 12:42:30.619070 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 3 12:42:30.619103 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 3 12:42:30.620089 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 3 12:42:30.622201 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 3 12:42:30.623538 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 3 12:42:30.623587 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 3 12:42:30.624692 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 3 12:42:30.625461 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 3 12:42:30.626981 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 3 12:42:30.628174 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 3 12:42:30.629344 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 3 12:42:30.629445 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 3 12:42:30.637143 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 3 12:42:30.637412 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 3 12:42:30.643633 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 3 12:42:30.643925 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 3 12:42:30.644057 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 3 12:42:30.651232 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 3 12:42:30.652329 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 3 12:42:30.653035 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 3 12:42:30.653084 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 3 12:42:30.655445 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 3 12:42:30.657256 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 3 12:42:30.657321 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 3 12:42:30.658024 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 3 12:42:30.658066 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 3 12:42:30.661653 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 3 12:42:30.661700 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 3 12:42:30.662416 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 3 12:42:30.662457 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 3 12:42:30.663259 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 3 12:42:30.667641 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 3 12:42:30.667709 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 3 12:42:30.686882 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 3 12:42:30.687182 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 3 12:42:30.690139 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 3 12:42:30.690222 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 3 12:42:30.691552 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 3 12:42:30.691599 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 3 12:42:30.693051 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 3 12:42:30.693122 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 3 12:42:30.695049 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 3 12:42:30.695096 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 3 12:42:30.696641 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 3 12:42:30.696686 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 3 12:42:30.699047 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 3 12:42:30.702205 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 3 12:42:30.702274 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 3 12:42:30.705226 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 3 12:42:30.705289 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 3 12:42:30.707800 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 3 12:42:30.707854 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 3 12:42:30.711264 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 3 12:42:30.711318 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 3 12:42:30.711355 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 3 12:42:30.713235 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 3 12:42:30.713343 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 3 12:42:30.719186 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 3 12:42:30.719295 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 3 12:42:30.721041 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 3 12:42:30.723556 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 3 12:42:30.743354 systemd[1]: Switching root.
Mar 3 12:42:30.783469 systemd-journald[245]: Journal stopped
Mar 3 12:42:31.699414 systemd-journald[245]: Received SIGTERM from PID 1 (systemd).
Mar 3 12:42:31.699532 kernel: SELinux: policy capability network_peer_controls=1
Mar 3 12:42:31.699547 kernel: SELinux: policy capability open_perms=1
Mar 3 12:42:31.699560 kernel: SELinux: policy capability extended_socket_class=1
Mar 3 12:42:31.699572 kernel: SELinux: policy capability always_check_network=0
Mar 3 12:42:31.699581 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 3 12:42:31.699590 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 3 12:42:31.699599 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 3 12:42:31.699609 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 3 12:42:31.699618 kernel: SELinux: policy capability userspace_initial_context=0
Mar 3 12:42:31.699627 kernel: audit: type=1403 audit(1772541750.913:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 3 12:42:31.699639 systemd[1]: Successfully loaded SELinux policy in 59.700ms.
Mar 3 12:42:31.699663 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.069ms.
Mar 3 12:42:31.699674 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 3 12:42:31.699684 systemd[1]: Detected virtualization kvm.
Mar 3 12:42:31.699694 systemd[1]: Detected architecture arm64.
Mar 3 12:42:31.699703 systemd[1]: Detected first boot.
Mar 3 12:42:31.699718 systemd[1]: Hostname set to .
Mar 3 12:42:31.699728 systemd[1]: Initializing machine ID from VM UUID.
Mar 3 12:42:31.699738 zram_generator::config[1090]: No configuration found.
Mar 3 12:42:31.699751 kernel: NET: Registered PF_VSOCK protocol family
Mar 3 12:42:31.699760 systemd[1]: Populated /etc with preset unit settings.
Mar 3 12:42:31.699775 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 3 12:42:31.699788 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 3 12:42:31.699797 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 3 12:42:31.699808 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 3 12:42:31.699822 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 3 12:42:31.699833 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 3 12:42:31.699843 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 3 12:42:31.699853 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 3 12:42:31.699864 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 3 12:42:31.699874 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 3 12:42:31.699884 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 3 12:42:31.699895 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 3 12:42:31.699905 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 3 12:42:31.699916 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 3 12:42:31.699926 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 3 12:42:31.699936 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 3 12:42:31.699950 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 3 12:42:31.699960 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 3 12:42:31.699970 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 3 12:42:31.699981 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 3 12:42:31.699991 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 3 12:42:31.700001 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 3 12:42:31.700011 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 3 12:42:31.700021 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 3 12:42:31.700031 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 3 12:42:31.700041 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 12:42:31.700051 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 3 12:42:31.700063 systemd[1]: Reached target slices.target - Slice Units.
Mar 3 12:42:31.700073 systemd[1]: Reached target swap.target - Swaps.
Mar 3 12:42:31.700083 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 3 12:42:31.700093 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 3 12:42:31.700103 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 3 12:42:31.700128 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 3 12:42:31.700139 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 3 12:42:31.700149 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 3 12:42:31.700159 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 3 12:42:31.700171 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 3 12:42:31.700181 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 3 12:42:31.700190 systemd[1]: Mounting media.mount - External Media Directory...
Mar 3 12:42:31.700200 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 3 12:42:31.700210 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 3 12:42:31.700220 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 3 12:42:31.700231 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 3 12:42:31.700241 systemd[1]: Reached target machines.target - Containers.
Mar 3 12:42:31.700252 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 3 12:42:31.700262 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 12:42:31.700273 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 3 12:42:31.700282 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 3 12:42:31.700292 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 12:42:31.700302 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 3 12:42:31.700312 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 12:42:31.700322 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 3 12:42:31.700332 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 12:42:31.700343 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 3 12:42:31.700354 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 3 12:42:31.700364 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 3 12:42:31.700373 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 3 12:42:31.700383 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 3 12:42:31.700393 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 12:42:31.700403 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 3 12:42:31.700414 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 3 12:42:31.700425 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 3 12:42:31.700435 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 3 12:42:31.700445 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 3 12:42:31.700455 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 3 12:42:31.700474 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 3 12:42:31.700486 systemd[1]: Stopped verity-setup.service.
Mar 3 12:42:31.700496 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 3 12:42:31.700507 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 3 12:42:31.700517 systemd[1]: Mounted media.mount - External Media Directory.
Mar 3 12:42:31.700527 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 3 12:42:31.700539 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 3 12:42:31.700548 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 3 12:42:31.700558 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 3 12:42:31.700568 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 3 12:42:31.700578 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 3 12:42:31.700588 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 12:42:31.700598 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 12:42:31.700610 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 12:42:31.700621 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 12:42:31.700631 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 3 12:42:31.700641 kernel: fuse: init (API version 7.41)
Mar 3 12:42:31.700650 kernel: loop: module loaded
Mar 3 12:42:31.700660 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 3 12:42:31.700669 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 3 12:42:31.700679 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 3 12:42:31.700689 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 3 12:42:31.700699 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 3 12:42:31.700709 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 12:42:31.700720 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 12:42:31.703173 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 3 12:42:31.703239 systemd-journald[1154]: Collecting audit messages is disabled.
Mar 3 12:42:31.703271 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 3 12:42:31.703282 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 3 12:42:31.703292 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 3 12:42:31.703303 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 3 12:42:31.703318 systemd-journald[1154]: Journal started
Mar 3 12:42:31.703340 systemd-journald[1154]: Runtime Journal (/run/log/journal/45a9ff0c9e984dc188c47eba8bd45427) is 8M, max 76.5M, 68.5M free.
Mar 3 12:42:31.409710 systemd[1]: Queued start job for default target multi-user.target.
Mar 3 12:42:31.436228 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 3 12:42:31.436773 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 3 12:42:31.705629 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 3 12:42:31.705680 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 3 12:42:31.708173 kernel: ACPI: bus type drm_connector registered
Mar 3 12:42:31.708204 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 3 12:42:31.714137 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 3 12:42:31.717158 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 12:42:31.729158 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 3 12:42:31.729230 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 12:42:31.732629 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 3 12:42:31.732698 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 12:42:31.739134 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 3 12:42:31.739209 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 3 12:42:31.744954 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 3 12:42:31.745130 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 3 12:42:31.746396 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 3 12:42:31.748172 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 3 12:42:31.771723 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 3 12:42:31.777199 kernel: loop0: detected capacity change from 0 to 200864
Mar 3 12:42:31.794286 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 3 12:42:31.795996 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 3 12:42:31.806330 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 3 12:42:31.814926 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 3 12:42:31.815082 systemd-journald[1154]: Time spent on flushing to /var/log/journal/45a9ff0c9e984dc188c47eba8bd45427 is 100.627ms for 1177 entries.
Mar 3 12:42:31.815082 systemd-journald[1154]: System Journal (/var/log/journal/45a9ff0c9e984dc188c47eba8bd45427) is 8M, max 584.8M, 576.8M free.
Mar 3 12:42:31.929853 systemd-journald[1154]: Received client request to flush runtime journal.
Mar 3 12:42:31.929894 kernel: loop1: detected capacity change from 0 to 8
Mar 3 12:42:31.929908 kernel: loop2: detected capacity change from 0 to 100632
Mar 3 12:42:31.929919 kernel: loop3: detected capacity change from 0 to 119840
Mar 3 12:42:31.890571 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 3 12:42:31.896364 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 3 12:42:31.897965 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 3 12:42:31.926312 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 3 12:42:31.934280 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 3 12:42:31.964380 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Mar 3 12:42:31.964395 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Mar 3 12:42:31.970869 kernel: loop4: detected capacity change from 0 to 200864
Mar 3 12:42:31.971837 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 3 12:42:31.997725 kernel: loop5: detected capacity change from 0 to 8
Mar 3 12:42:32.001147 kernel: loop6: detected capacity change from 0 to 100632
Mar 3 12:42:32.021137 kernel: loop7: detected capacity change from 0 to 119840
Mar 3 12:42:32.033326 (sd-merge)[1233]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 3 12:42:32.036339 (sd-merge)[1233]: Merged extensions into '/usr'.
Mar 3 12:42:32.043946 systemd[1]: Reload requested from client PID 1191 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 3 12:42:32.044187 systemd[1]: Reloading...
Mar 3 12:42:32.153131 zram_generator::config[1270]: No configuration found.
Mar 3 12:42:32.293064 ldconfig[1187]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 3 12:42:32.335378 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 3 12:42:32.335716 systemd[1]: Reloading finished in 291 ms.
Mar 3 12:42:32.359146 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 3 12:42:32.360195 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 3 12:42:32.370317 systemd[1]: Starting ensure-sysext.service...
Mar 3 12:42:32.374291 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 3 12:42:32.401711 systemd[1]: Reload requested from client PID 1298 ('systemctl') (unit ensure-sysext.service)...
Mar 3 12:42:32.401732 systemd[1]: Reloading...
Mar 3 12:42:32.406278 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 3 12:42:32.406317 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 3 12:42:32.406614 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 3 12:42:32.406807 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 3 12:42:32.407449 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 3 12:42:32.407677 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Mar 3 12:42:32.407721 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Mar 3 12:42:32.417281 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Mar 3 12:42:32.417414 systemd-tmpfiles[1299]: Skipping /boot
Mar 3 12:42:32.426001 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Mar 3 12:42:32.426186 systemd-tmpfiles[1299]: Skipping /boot
Mar 3 12:42:32.470241 zram_generator::config[1326]: No configuration found.
Mar 3 12:42:32.633601 systemd[1]: Reloading finished in 231 ms.
Mar 3 12:42:32.659133 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 3 12:42:32.665047 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 3 12:42:32.675280 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 3 12:42:32.680371 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 3 12:42:32.685310 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 3 12:42:32.689285 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 3 12:42:32.694132 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 3 12:42:32.701379 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 3 12:42:32.705315 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 3 12:42:32.709921 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 3 12:42:32.719749 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 3 12:42:32.724239 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 12:42:32.725914 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 12:42:32.729310 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 12:42:32.732252 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 12:42:32.734280 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 12:42:32.734399 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 12:42:32.737879 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 12:42:32.738012 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 12:42:32.738086 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 12:42:32.744688 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 12:42:32.749002 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 3 12:42:32.750381 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 12:42:32.751247 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 12:42:32.756877 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 3 12:42:32.762391 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 3 12:42:32.764488 systemd[1]: Finished ensure-sysext.service.
Mar 3 12:42:32.779392 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 3 12:42:32.781062 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 3 12:42:32.791817 systemd-udevd[1370]: Using default interface naming scheme 'v255'.
Mar 3 12:42:32.803191 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 3 12:42:32.804264 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 3 12:42:32.811192 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 3 12:42:32.812952 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 12:42:32.813577 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 12:42:32.821842 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 12:42:32.826388 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 12:42:32.828505 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 3 12:42:32.830333 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 12:42:32.830816 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 12:42:32.833034 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 3 12:42:32.833252 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 3 12:42:32.844101 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 3 12:42:32.845561 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 12:42:32.845644 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 12:42:32.903284 augenrules[1436]: No rules
Mar 3 12:42:32.908665 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 3 12:42:32.909418 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 3 12:42:32.931935 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 3 12:42:32.958678 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 3 12:42:33.123141 kernel: mousedev: PS/2 mouse device common for all mice
Mar 3 12:42:33.129172 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 3 12:42:33.132182 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 3 12:42:33.180690 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 3 12:42:33.194639 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 3 12:42:33.195702 systemd[1]: Reached target time-set.target - System Time Set.
Mar 3 12:42:33.206484 systemd-networkd[1422]: lo: Link UP
Mar 3 12:42:33.206494 systemd-networkd[1422]: lo: Gained carrier
Mar 3 12:42:33.210643 systemd-networkd[1422]: Enumeration completed
Mar 3 12:42:33.210646 systemd-timesyncd[1387]: No network connectivity, watching for changes.
Mar 3 12:42:33.213239 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 3 12:42:33.213693 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:33.213698 systemd-networkd[1422]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 12:42:33.214940 systemd-networkd[1422]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:33.215015 systemd-networkd[1422]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 12:42:33.215651 systemd-networkd[1422]: eth0: Link UP
Mar 3 12:42:33.215898 systemd-networkd[1422]: eth0: Gained carrier
Mar 3 12:42:33.216162 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 3 12:42:33.217169 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:33.218277 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 3 12:42:33.223425 systemd-networkd[1422]: eth1: Link UP
Mar 3 12:42:33.224141 systemd-networkd[1422]: eth1: Gained carrier
Mar 3 12:42:33.224765 systemd-networkd[1422]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 12:42:33.233060 systemd-resolved[1369]: Positive Trust Anchors:
Mar 3 12:42:33.233082 systemd-resolved[1369]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 3 12:42:33.233124 systemd-resolved[1369]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 3 12:42:33.243510 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 3 12:42:33.245516 systemd-resolved[1369]: Using system hostname 'ci-4459-2-4-8-fcaab3b7ef'.
Mar 3 12:42:33.247764 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 3 12:42:33.249512 systemd[1]: Reached target network.target - Network.
Mar 3 12:42:33.250086 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 3 12:42:33.251198 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 3 12:42:33.251918 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 3 12:42:33.252769 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 3 12:42:33.253749 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 3 12:42:33.254591 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 3 12:42:33.255278 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 3 12:42:33.256482 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 3 12:42:33.256518 systemd[1]: Reached target paths.target - Path Units.
Mar 3 12:42:33.257161 systemd[1]: Reached target timers.target - Timer Units.
Mar 3 12:42:33.259389 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 3 12:42:33.262689 systemd-networkd[1422]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 3 12:42:33.264018 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 3 12:42:33.267212 systemd-networkd[1422]: eth0: DHCPv4 address 78.47.249.221/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 3 12:42:33.268092 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 3 12:42:33.270369 systemd-timesyncd[1387]: Network configuration changed, trying to establish connection.
Mar 3 12:42:33.272024 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 3 12:42:33.273028 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 3 12:42:33.278215 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 3 12:42:33.279226 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 3 12:42:33.281038 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 3 12:42:33.282871 systemd[1]: Reached target sockets.target - Socket Units.
Mar 3 12:42:33.283993 systemd[1]: Reached target basic.target - Basic System.
Mar 3 12:42:33.287164 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 3 12:42:33.287197 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 3 12:42:33.288563 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 3 12:42:33.291588 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 3 12:42:33.293801 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 3 12:42:33.299087 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 3 12:42:33.301914 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 3 12:42:33.306985 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 3 12:42:33.307675 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 3 12:42:33.315561 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 3 12:42:33.320407 jq[1485]: false
Mar 3 12:42:33.322552 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 3 12:42:33.326327 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 3 12:42:33.328558 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 3 12:42:33.342913 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 3 12:42:33.344561 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 3 12:42:33.345081 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 3 12:42:33.346210 systemd[1]: Starting update-engine.service - Update Engine...
Mar 3 12:42:33.346856 systemd-timesyncd[1387]: Contacted time server 85.215.166.214:123 (1.flatcar.pool.ntp.org). Mar 3 12:42:33.346921 systemd-timesyncd[1387]: Initial clock synchronization to Tue 2026-03-03 12:42:32.957892 UTC. Mar 3 12:42:33.351836 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 3 12:42:33.363208 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 3 12:42:33.364178 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 3 12:42:33.364359 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 3 12:42:33.366533 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 3 12:42:33.366706 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 3 12:42:33.370032 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Mar 3 12:42:33.383067 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 3 12:42:33.408493 jq[1495]: true Mar 3 12:42:33.411872 coreos-metadata[1482]: Mar 03 12:42:33.411 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 3 12:42:33.418737 coreos-metadata[1482]: Mar 03 12:42:33.418 INFO Fetch successful Mar 3 12:42:33.419385 coreos-metadata[1482]: Mar 03 12:42:33.419 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Mar 3 12:42:33.424132 coreos-metadata[1482]: Mar 03 12:42:33.423 INFO Fetch successful Mar 3 12:42:33.428653 extend-filesystems[1486]: Found /dev/sda6 Mar 3 12:42:33.439435 systemd[1]: motdgen.service: Deactivated successfully. Mar 3 12:42:33.439699 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Mar 3 12:42:33.446946 tar[1502]: linux-arm64/LICENSE Mar 3 12:42:33.446946 tar[1502]: linux-arm64/helm Mar 3 12:42:33.452371 jq[1518]: true Mar 3 12:42:33.460413 extend-filesystems[1486]: Found /dev/sda9 Mar 3 12:42:33.469529 update_engine[1494]: I20260303 12:42:33.461949 1494 main.cc:92] Flatcar Update Engine starting Mar 3 12:42:33.466954 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 3 12:42:33.466754 dbus-daemon[1483]: [system] SELinux support is enabled Mar 3 12:42:33.470614 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 3 12:42:33.470662 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 3 12:42:33.472289 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 3 12:42:33.472315 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 3 12:42:33.473272 (ntainerd)[1519]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 3 12:42:33.479162 extend-filesystems[1486]: Checking size of /dev/sda9 Mar 3 12:42:33.496755 systemd[1]: Started update-engine.service - Update Engine. Mar 3 12:42:33.499572 update_engine[1494]: I20260303 12:42:33.499487 1494 update_check_scheduler.cc:74] Next update check in 9m22s Mar 3 12:42:33.517165 extend-filesystems[1486]: Resized partition /dev/sda9 Mar 3 12:42:33.529493 extend-filesystems[1541]: resize2fs 1.47.3 (8-Jul-2025) Mar 3 12:42:33.533388 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Mar 3 12:42:33.539720 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Mar 3 12:42:33.650188 bash[1562]: Updated "/home/core/.ssh/authorized_keys" Mar 3 12:42:33.673729 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Mar 3 12:42:33.673799 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 3 12:42:33.673812 kernel: [drm] features: -context_init Mar 3 12:42:33.681694 containerd[1519]: time="2026-03-03T12:42:33Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 3 12:42:33.721304 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Mar 3 12:42:33.714895 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 3 12:42:33.721646 containerd[1519]: time="2026-03-03T12:42:33.682334640Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 3 12:42:33.716043 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 3 12:42:33.723734 extend-filesystems[1541]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 3 12:42:33.723734 extend-filesystems[1541]: old_desc_blocks = 1, new_desc_blocks = 5 Mar 3 12:42:33.723734 extend-filesystems[1541]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Mar 3 12:42:33.732239 extend-filesystems[1486]: Resized filesystem in /dev/sda9 Mar 3 12:42:33.740344 kernel: [drm] number of scanouts: 1 Mar 3 12:42:33.740369 kernel: [drm] number of cap sets: 0 Mar 3 12:42:33.725892 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 3 12:42:33.726346 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 3 12:42:33.729280 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Mar 3 12:42:33.731154 systemd[1]: Starting sshkeys.service... Mar 3 12:42:33.736394 systemd-logind[1493]: New seat seat0. Mar 3 12:42:33.741323 systemd[1]: Started systemd-logind.service - User Login Management. Mar 3 12:42:33.744340 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Mar 3 12:42:33.751128 containerd[1519]: time="2026-03-03T12:42:33.749885880Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.12µs" Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756402920Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756492000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756691720Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756715080Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756742760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756820560Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 3 12:42:33.756980 containerd[1519]: time="2026-03-03T12:42:33.756833320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 3 12:42:33.757728 containerd[1519]: time="2026-03-03T12:42:33.757699240Z" level=info msg="skip loading plugin" error="path 
/var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 3 12:42:33.758037 containerd[1519]: time="2026-03-03T12:42:33.758016600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 3 12:42:33.759008 containerd[1519]: time="2026-03-03T12:42:33.758654080Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 3 12:42:33.759008 containerd[1519]: time="2026-03-03T12:42:33.758681480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 3 12:42:33.763294 containerd[1519]: time="2026-03-03T12:42:33.759282240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 3 12:42:33.763294 containerd[1519]: time="2026-03-03T12:42:33.761405920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 3 12:42:33.764329 containerd[1519]: time="2026-03-03T12:42:33.763814640Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 3 12:42:33.764329 containerd[1519]: time="2026-03-03T12:42:33.763838640Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 3 12:42:33.764329 containerd[1519]: time="2026-03-03T12:42:33.763906280Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 3 12:42:33.764329 containerd[1519]: time="2026-03-03T12:42:33.764178400Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt 
type=io.containerd.metadata.v1 Mar 3 12:42:33.764329 containerd[1519]: time="2026-03-03T12:42:33.764270160Z" level=info msg="metadata content store policy set" policy=shared Mar 3 12:42:33.776215 containerd[1519]: time="2026-03-03T12:42:33.776172240Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.776947760Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.776976880Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.776991280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777007240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777029800Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777043120Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777056520Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777069680Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777080240Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 3 
12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777099240Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777237920Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 3 12:42:33.777596 containerd[1519]: time="2026-03-03T12:42:33.777497000Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777599360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777630000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777648000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777663720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777677600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777694440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777709640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777727240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777739480Z" level=info msg="loading plugin" 
id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 3 12:42:33.777855 containerd[1519]: time="2026-03-03T12:42:33.777754640Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 3 12:42:33.778013 containerd[1519]: time="2026-03-03T12:42:33.777939320Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 3 12:42:33.778013 containerd[1519]: time="2026-03-03T12:42:33.777959520Z" level=info msg="Start snapshots syncer" Mar 3 12:42:33.778013 containerd[1519]: time="2026-03-03T12:42:33.777981320Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 3 12:42:33.778387 containerd[1519]: time="2026-03-03T12:42:33.778337800Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":tr
ue,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 3 12:42:33.778512 containerd[1519]: time="2026-03-03T12:42:33.778418720Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 3 12:42:33.778537 containerd[1519]: time="2026-03-03T12:42:33.778509760Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 3 12:42:33.778669 containerd[1519]: time="2026-03-03T12:42:33.778639440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 3 12:42:33.778694 containerd[1519]: time="2026-03-03T12:42:33.778677400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 3 12:42:33.778726 containerd[1519]: time="2026-03-03T12:42:33.778695000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 3 12:42:33.778726 containerd[1519]: time="2026-03-03T12:42:33.778709640Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 3 12:42:33.778759 containerd[1519]: time="2026-03-03T12:42:33.778722480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 3 12:42:33.778759 containerd[1519]: time="2026-03-03T12:42:33.778737600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 
Mar 3 12:42:33.778791 containerd[1519]: time="2026-03-03T12:42:33.778776440Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778814440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778834440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778850440Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778893640Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778911480Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778924280Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778937280Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778949360Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778959960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.778974360Z" 
level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.779091080Z" level=info msg="runtime interface created" Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.779096920Z" level=info msg="created NRI interface" Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.780124120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.780157040Z" level=info msg="Connect containerd service" Mar 3 12:42:33.783965 containerd[1519]: time="2026-03-03T12:42:33.780194840Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 3 12:42:33.784307 containerd[1519]: time="2026-03-03T12:42:33.782492840Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 3 12:42:33.813149 kernel: Console: switching to colour frame buffer device 160x50 Mar 3 12:42:33.820187 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 3 12:42:33.842670 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 3 12:42:33.843473 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 3 12:42:33.894165 systemd-logind[1493]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Mar 3 12:42:33.905669 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 12:42:33.925461 systemd-logind[1493]: Watching system buttons on /dev/input/event0 (Power Button) Mar 3 12:42:33.945771 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Mar 3 12:42:33.945981 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 12:42:33.948522 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 12:42:33.986285 coreos-metadata[1587]: Mar 03 12:42:33.986 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Mar 3 12:42:33.990721 coreos-metadata[1587]: Mar 03 12:42:33.990 INFO Fetch successful Mar 3 12:42:33.993053 unknown[1587]: wrote ssh authorized keys file for user: core Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005452774Z" level=info msg="Start subscribing containerd event" Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005537676Z" level=info msg="Start recovering state" Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005631635Z" level=info msg="Start event monitor" Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005645716Z" level=info msg="Start cni network conf syncer for default" Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005653479Z" level=info msg="Start streaming server" Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005661776Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005668892Z" level=info msg="runtime interface starting up..." Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005674182Z" level=info msg="starting plugins..." Mar 3 12:42:34.005834 containerd[1519]: time="2026-03-03T12:42:34.005686474Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 3 12:42:34.008347 containerd[1519]: time="2026-03-03T12:42:34.008249826Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 3 12:42:34.008347 containerd[1519]: time="2026-03-03T12:42:34.008326127Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 3 12:42:34.008478 systemd[1]: Started containerd.service - containerd container runtime. Mar 3 12:42:34.010087 containerd[1519]: time="2026-03-03T12:42:34.009150032Z" level=info msg="containerd successfully booted in 0.328192s" Mar 3 12:42:34.061423 update-ssh-keys[1604]: Updated "/home/core/.ssh/authorized_keys" Mar 3 12:42:34.065817 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 3 12:42:34.070901 systemd[1]: Finished sshkeys.service. Mar 3 12:42:34.079239 locksmithd[1533]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 3 12:42:34.118174 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 12:42:34.285953 tar[1502]: linux-arm64/README.md Mar 3 12:42:34.303846 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 3 12:42:34.307212 systemd-networkd[1422]: eth1: Gained IPv6LL Mar 3 12:42:34.311486 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 3 12:42:34.314002 systemd[1]: Reached target network-online.target - Network is Online. Mar 3 12:42:34.317678 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:42:34.322428 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 3 12:42:34.371483 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 3 12:42:34.505220 sshd_keygen[1511]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 3 12:42:34.527644 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 3 12:42:34.534582 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 3 12:42:34.557825 systemd[1]: issuegen.service: Deactivated successfully. Mar 3 12:42:34.558099 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 3 12:42:34.564388 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Mar 3 12:42:34.582438 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 3 12:42:34.588159 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 3 12:42:34.591468 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 3 12:42:34.592592 systemd[1]: Reached target getty.target - Login Prompts. Mar 3 12:42:34.947300 systemd-networkd[1422]: eth0: Gained IPv6LL Mar 3 12:42:35.041613 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:42:35.043796 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 3 12:42:35.045530 systemd[1]: Startup finished in 2.350s (kernel) + 5.299s (initrd) + 4.191s (userspace) = 11.842s. Mar 3 12:42:35.052517 (kubelet)[1651]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 12:42:35.490555 kubelet[1651]: E0303 12:42:35.490455 1651 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 12:42:35.494273 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 12:42:35.494455 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 12:42:35.495124 systemd[1]: kubelet.service: Consumed 788ms CPU time, 249.1M memory peak. Mar 3 12:42:45.613398 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 3 12:42:45.616702 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:42:45.790141 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 3 12:42:45.806772 (kubelet)[1670]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 12:42:45.852125 kubelet[1670]: E0303 12:42:45.852059 1670 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 12:42:45.855466 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 12:42:45.855744 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 12:42:45.856309 systemd[1]: kubelet.service: Consumed 171ms CPU time, 105.4M memory peak. Mar 3 12:42:55.863770 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 3 12:42:55.867456 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:42:56.049837 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:42:56.059904 (kubelet)[1684]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 12:42:56.101449 kubelet[1684]: E0303 12:42:56.101401 1684 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 12:42:56.105453 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 12:42:56.105582 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 12:42:56.107217 systemd[1]: kubelet.service: Consumed 162ms CPU time, 106.6M memory peak. 
Mar 3 12:43:06.113659 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 3 12:43:06.118088 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:06.288591 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:06.300049 (kubelet)[1698]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 12:43:06.343635 kubelet[1698]: E0303 12:43:06.343570 1698 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 12:43:06.346262 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 12:43:06.346393 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 12:43:06.346980 systemd[1]: kubelet.service: Consumed 165ms CPU time, 104.9M memory peak. Mar 3 12:43:07.194557 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 3 12:43:07.196383 systemd[1]: Started sshd@0-78.47.249.221:22-20.161.92.111:49408.service - OpenSSH per-connection server daemon (20.161.92.111:49408). Mar 3 12:43:07.735695 sshd[1706]: Accepted publickey for core from 20.161.92.111 port 49408 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:07.739176 sshd-session[1706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:07.749260 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 3 12:43:07.750225 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 3 12:43:07.760572 systemd-logind[1493]: New session 1 of user core. 
Mar 3 12:43:07.773096 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 3 12:43:07.778059 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 3 12:43:07.797140 (systemd)[1711]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 3 12:43:07.801188 systemd-logind[1493]: New session c1 of user core. Mar 3 12:43:07.929713 systemd[1711]: Queued start job for default target default.target. Mar 3 12:43:07.941089 systemd[1711]: Created slice app.slice - User Application Slice. Mar 3 12:43:07.941371 systemd[1711]: Reached target paths.target - Paths. Mar 3 12:43:07.941580 systemd[1711]: Reached target timers.target - Timers. Mar 3 12:43:07.944143 systemd[1711]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 3 12:43:07.958419 systemd[1711]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 3 12:43:07.958535 systemd[1711]: Reached target sockets.target - Sockets. Mar 3 12:43:07.958588 systemd[1711]: Reached target basic.target - Basic System. Mar 3 12:43:07.958629 systemd[1711]: Reached target default.target - Main User Target. Mar 3 12:43:07.958660 systemd[1711]: Startup finished in 150ms. Mar 3 12:43:07.958921 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 3 12:43:07.966982 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 3 12:43:08.270843 systemd[1]: Started sshd@1-78.47.249.221:22-20.161.92.111:49410.service - OpenSSH per-connection server daemon (20.161.92.111:49410). Mar 3 12:43:08.801971 sshd[1722]: Accepted publickey for core from 20.161.92.111 port 49410 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:08.803932 sshd-session[1722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:08.810142 systemd-logind[1493]: New session 2 of user core. Mar 3 12:43:08.815391 systemd[1]: Started session-2.scope - Session 2 of User core. 
Mar 3 12:43:09.087304 sshd[1725]: Connection closed by 20.161.92.111 port 49410 Mar 3 12:43:09.089547 sshd-session[1722]: pam_unix(sshd:session): session closed for user core Mar 3 12:43:09.093781 systemd[1]: sshd@1-78.47.249.221:22-20.161.92.111:49410.service: Deactivated successfully. Mar 3 12:43:09.096372 systemd[1]: session-2.scope: Deactivated successfully. Mar 3 12:43:09.098036 systemd-logind[1493]: Session 2 logged out. Waiting for processes to exit. Mar 3 12:43:09.101181 systemd-logind[1493]: Removed session 2. Mar 3 12:43:09.195385 systemd[1]: Started sshd@2-78.47.249.221:22-20.161.92.111:49414.service - OpenSSH per-connection server daemon (20.161.92.111:49414). Mar 3 12:43:09.724984 sshd[1731]: Accepted publickey for core from 20.161.92.111 port 49414 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:09.727197 sshd-session[1731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:09.732348 systemd-logind[1493]: New session 3 of user core. Mar 3 12:43:09.739422 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 3 12:43:10.012429 sshd[1734]: Connection closed by 20.161.92.111 port 49414 Mar 3 12:43:10.013537 sshd-session[1731]: pam_unix(sshd:session): session closed for user core Mar 3 12:43:10.020190 systemd-logind[1493]: Session 3 logged out. Waiting for processes to exit. Mar 3 12:43:10.020496 systemd[1]: sshd@2-78.47.249.221:22-20.161.92.111:49414.service: Deactivated successfully. Mar 3 12:43:10.022057 systemd[1]: session-3.scope: Deactivated successfully. Mar 3 12:43:10.024873 systemd-logind[1493]: Removed session 3. Mar 3 12:43:10.125337 systemd[1]: Started sshd@3-78.47.249.221:22-20.161.92.111:49422.service - OpenSSH per-connection server daemon (20.161.92.111:49422). 
Mar 3 12:43:10.663300 sshd[1740]: Accepted publickey for core from 20.161.92.111 port 49422 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:10.666410 sshd-session[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:10.671633 systemd-logind[1493]: New session 4 of user core. Mar 3 12:43:10.680477 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 3 12:43:10.959228 sshd[1743]: Connection closed by 20.161.92.111 port 49422 Mar 3 12:43:10.960160 sshd-session[1740]: pam_unix(sshd:session): session closed for user core Mar 3 12:43:10.965747 systemd-logind[1493]: Session 4 logged out. Waiting for processes to exit. Mar 3 12:43:10.966214 systemd[1]: sshd@3-78.47.249.221:22-20.161.92.111:49422.service: Deactivated successfully. Mar 3 12:43:10.968559 systemd[1]: session-4.scope: Deactivated successfully. Mar 3 12:43:10.972155 systemd-logind[1493]: Removed session 4. Mar 3 12:43:11.070233 systemd[1]: Started sshd@4-78.47.249.221:22-20.161.92.111:48402.service - OpenSSH per-connection server daemon (20.161.92.111:48402). Mar 3 12:43:11.618441 sshd[1749]: Accepted publickey for core from 20.161.92.111 port 48402 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:11.620454 sshd-session[1749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:11.626548 systemd-logind[1493]: New session 5 of user core. Mar 3 12:43:11.635458 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 3 12:43:11.829862 sudo[1753]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 3 12:43:11.830317 sudo[1753]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 12:43:11.848201 sudo[1753]: pam_unix(sudo:session): session closed for user root Mar 3 12:43:11.946510 sshd[1752]: Connection closed by 20.161.92.111 port 48402 Mar 3 12:43:11.947997 sshd-session[1749]: pam_unix(sshd:session): session closed for user core Mar 3 12:43:11.954137 systemd[1]: sshd@4-78.47.249.221:22-20.161.92.111:48402.service: Deactivated successfully. Mar 3 12:43:11.956983 systemd[1]: session-5.scope: Deactivated successfully. Mar 3 12:43:11.958293 systemd-logind[1493]: Session 5 logged out. Waiting for processes to exit. Mar 3 12:43:11.960539 systemd-logind[1493]: Removed session 5. Mar 3 12:43:12.056419 systemd[1]: Started sshd@5-78.47.249.221:22-20.161.92.111:48406.service - OpenSSH per-connection server daemon (20.161.92.111:48406). Mar 3 12:43:12.596616 sshd[1759]: Accepted publickey for core from 20.161.92.111 port 48406 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:12.601465 sshd-session[1759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:12.609311 systemd-logind[1493]: New session 6 of user core. Mar 3 12:43:12.621447 systemd[1]: Started session-6.scope - Session 6 of User core. 
Mar 3 12:43:12.792728 sudo[1764]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 3 12:43:12.793005 sudo[1764]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 12:43:12.799055 sudo[1764]: pam_unix(sudo:session): session closed for user root Mar 3 12:43:12.805011 sudo[1763]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 3 12:43:12.805435 sudo[1763]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 12:43:12.817911 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 3 12:43:12.867453 augenrules[1786]: No rules Mar 3 12:43:12.869006 systemd[1]: audit-rules.service: Deactivated successfully. Mar 3 12:43:12.869273 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 3 12:43:12.870675 sudo[1763]: pam_unix(sudo:session): session closed for user root Mar 3 12:43:12.966827 sshd[1762]: Connection closed by 20.161.92.111 port 48406 Mar 3 12:43:12.966580 sshd-session[1759]: pam_unix(sshd:session): session closed for user core Mar 3 12:43:12.973459 systemd-logind[1493]: Session 6 logged out. Waiting for processes to exit. Mar 3 12:43:12.973812 systemd[1]: sshd@5-78.47.249.221:22-20.161.92.111:48406.service: Deactivated successfully. Mar 3 12:43:12.976128 systemd[1]: session-6.scope: Deactivated successfully. Mar 3 12:43:12.977959 systemd-logind[1493]: Removed session 6. Mar 3 12:43:13.076498 systemd[1]: Started sshd@6-78.47.249.221:22-20.161.92.111:48414.service - OpenSSH per-connection server daemon (20.161.92.111:48414). 
Mar 3 12:43:13.625173 sshd[1795]: Accepted publickey for core from 20.161.92.111 port 48414 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:43:13.626846 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:43:13.635191 systemd-logind[1493]: New session 7 of user core. Mar 3 12:43:13.648491 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 3 12:43:13.824748 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 3 12:43:13.825046 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 12:43:14.165379 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 3 12:43:14.176682 (dockerd)[1817]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 3 12:43:14.410291 dockerd[1817]: time="2026-03-03T12:43:14.410222785Z" level=info msg="Starting up" Mar 3 12:43:14.411153 dockerd[1817]: time="2026-03-03T12:43:14.411125101Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 3 12:43:14.424750 dockerd[1817]: time="2026-03-03T12:43:14.424424070Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 3 12:43:14.474593 dockerd[1817]: time="2026-03-03T12:43:14.474520451Z" level=info msg="Loading containers: start." Mar 3 12:43:14.488130 kernel: Initializing XFRM netlink socket Mar 3 12:43:14.751466 systemd-networkd[1422]: docker0: Link UP Mar 3 12:43:14.757222 dockerd[1817]: time="2026-03-03T12:43:14.757158953Z" level=info msg="Loading containers: done." Mar 3 12:43:14.781300 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck11014753-merged.mount: Deactivated successfully. 
Mar 3 12:43:14.785328 dockerd[1817]: time="2026-03-03T12:43:14.785274168Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 3 12:43:14.785443 dockerd[1817]: time="2026-03-03T12:43:14.785383596Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 3 12:43:14.785518 dockerd[1817]: time="2026-03-03T12:43:14.785498827Z" level=info msg="Initializing buildkit" Mar 3 12:43:14.816521 dockerd[1817]: time="2026-03-03T12:43:14.816433821Z" level=info msg="Completed buildkit initialization" Mar 3 12:43:14.826654 dockerd[1817]: time="2026-03-03T12:43:14.826589565Z" level=info msg="Daemon has completed initialization" Mar 3 12:43:14.827153 dockerd[1817]: time="2026-03-03T12:43:14.826765172Z" level=info msg="API listen on /run/docker.sock" Mar 3 12:43:14.826922 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 3 12:43:15.290261 containerd[1519]: time="2026-03-03T12:43:15.290190619Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\"" Mar 3 12:43:15.970008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount39913693.mount: Deactivated successfully. Mar 3 12:43:16.364011 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 3 12:43:16.368270 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:16.535073 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 3 12:43:16.548157 (kubelet)[2093]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 12:43:16.591800 kubelet[2093]: E0303 12:43:16.591723 2093 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 12:43:16.594297 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 12:43:16.594435 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 12:43:16.594725 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.9M memory peak. Mar 3 12:43:17.475012 containerd[1519]: time="2026-03-03T12:43:17.474914096Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:17.476892 containerd[1519]: time="2026-03-03T12:43:17.476813202Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=24583350" Mar 3 12:43:17.477755 containerd[1519]: time="2026-03-03T12:43:17.477706962Z" level=info msg="ImageCreate event name:\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:17.483541 containerd[1519]: time="2026-03-03T12:43:17.483427484Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:17.486452 containerd[1519]: time="2026-03-03T12:43:17.486388708Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id 
\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"24579851\" in 2.196147997s" Mar 3 12:43:17.486840 containerd[1519]: time="2026-03-03T12:43:17.486633683Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\"" Mar 3 12:43:17.487539 containerd[1519]: time="2026-03-03T12:43:17.487508439Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\"" Mar 3 12:43:18.800386 containerd[1519]: time="2026-03-03T12:43:18.800320669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:18.801717 containerd[1519]: time="2026-03-03T12:43:18.801558093Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=19139661" Mar 3 12:43:18.802715 containerd[1519]: time="2026-03-03T12:43:18.802668729Z" level=info msg="ImageCreate event name:\"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:18.805796 containerd[1519]: time="2026-03-03T12:43:18.805755186Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:18.808117 containerd[1519]: time="2026-03-03T12:43:18.807922288Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"20724045\" in 1.320376841s" Mar 3 12:43:18.808117 containerd[1519]: time="2026-03-03T12:43:18.807965377Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\"" Mar 3 12:43:18.809250 containerd[1519]: time="2026-03-03T12:43:18.809225486Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\"" Mar 3 12:43:19.207958 update_engine[1494]: I20260303 12:43:19.207175 1494 update_attempter.cc:509] Updating boot flags... Mar 3 12:43:19.854780 containerd[1519]: time="2026-03-03T12:43:19.854651268Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:19.856709 containerd[1519]: time="2026-03-03T12:43:19.856221426Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=14195564" Mar 3 12:43:19.857779 containerd[1519]: time="2026-03-03T12:43:19.857741333Z" level=info msg="ImageCreate event name:\"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:19.861597 containerd[1519]: time="2026-03-03T12:43:19.861561067Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:19.862998 containerd[1519]: time="2026-03-03T12:43:19.862964831Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest 
\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"15779966\" in 1.053611559s" Mar 3 12:43:19.863136 containerd[1519]: time="2026-03-03T12:43:19.863097658Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\"" Mar 3 12:43:19.865079 containerd[1519]: time="2026-03-03T12:43:19.865044972Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\"" Mar 3 12:43:20.830304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3571732932.mount: Deactivated successfully. Mar 3 12:43:21.071839 containerd[1519]: time="2026-03-03T12:43:21.071781663Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:21.073074 containerd[1519]: time="2026-03-03T12:43:21.073028852Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=22697114" Mar 3 12:43:21.074327 containerd[1519]: time="2026-03-03T12:43:21.074265799Z" level=info msg="ImageCreate event name:\"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:21.077147 containerd[1519]: time="2026-03-03T12:43:21.077079315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:21.078672 containerd[1519]: time="2026-03-03T12:43:21.078389395Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest 
\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"22696107\" in 1.213305656s" Mar 3 12:43:21.078672 containerd[1519]: time="2026-03-03T12:43:21.078423802Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\"" Mar 3 12:43:21.078885 containerd[1519]: time="2026-03-03T12:43:21.078863602Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Mar 3 12:43:21.691820 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2694773770.mount: Deactivated successfully. Mar 3 12:43:22.472137 containerd[1519]: time="2026-03-03T12:43:22.471526571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:22.473648 containerd[1519]: time="2026-03-03T12:43:22.473618497Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498" Mar 3 12:43:22.475064 containerd[1519]: time="2026-03-03T12:43:22.475037105Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:22.479020 containerd[1519]: time="2026-03-03T12:43:22.478963552Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:22.481129 containerd[1519]: time="2026-03-03T12:43:22.481057638Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.402160869s" Mar 3 12:43:22.481129 containerd[1519]: time="2026-03-03T12:43:22.481092684Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Mar 3 12:43:22.482148 containerd[1519]: time="2026-03-03T12:43:22.481942032Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 3 12:43:22.944460 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount499344414.mount: Deactivated successfully. Mar 3 12:43:22.952404 containerd[1519]: time="2026-03-03T12:43:22.952341850Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:22.953602 containerd[1519]: time="2026-03-03T12:43:22.953560943Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729" Mar 3 12:43:22.954701 containerd[1519]: time="2026-03-03T12:43:22.954400130Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:22.956815 containerd[1519]: time="2026-03-03T12:43:22.956720776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:22.957782 containerd[1519]: time="2026-03-03T12:43:22.957742755Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 475.769037ms" Mar 3 
12:43:22.957913 containerd[1519]: time="2026-03-03T12:43:22.957782642Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Mar 3 12:43:22.958746 containerd[1519]: time="2026-03-03T12:43:22.958558817Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\"" Mar 3 12:43:23.544692 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2092981130.mount: Deactivated successfully. Mar 3 12:43:24.255223 containerd[1519]: time="2026-03-03T12:43:24.255151041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:24.260873 containerd[1519]: time="2026-03-03T12:43:24.260803581Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21125601" Mar 3 12:43:24.261677 containerd[1519]: time="2026-03-03T12:43:24.261620751Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:24.265985 containerd[1519]: time="2026-03-03T12:43:24.265910714Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:24.267292 containerd[1519]: time="2026-03-03T12:43:24.267157392Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.308566569s" Mar 3 12:43:24.267292 containerd[1519]: time="2026-03-03T12:43:24.267194558Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference 
\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\"" Mar 3 12:43:26.613756 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 3 12:43:26.616532 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:26.767260 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:26.779303 (kubelet)[2290]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 12:43:26.823256 kubelet[2290]: E0303 12:43:26.823211 2290 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 12:43:26.826891 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 12:43:26.827022 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 12:43:26.829280 systemd[1]: kubelet.service: Consumed 159ms CPU time, 105.3M memory peak. Mar 3 12:43:28.208790 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:28.209360 systemd[1]: kubelet.service: Consumed 159ms CPU time, 105.3M memory peak. Mar 3 12:43:28.212396 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:28.254818 systemd[1]: Reload requested from client PID 2304 ('systemctl') (unit session-7.scope)... Mar 3 12:43:28.254832 systemd[1]: Reloading... Mar 3 12:43:28.381132 zram_generator::config[2354]: No configuration found. Mar 3 12:43:28.567141 systemd[1]: Reloading finished in 311 ms. Mar 3 12:43:28.627743 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 3 12:43:28.627846 systemd[1]: kubelet.service: Failed with result 'signal'. 
Mar 3 12:43:28.628200 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:28.628257 systemd[1]: kubelet.service: Consumed 110ms CPU time, 94.9M memory peak. Mar 3 12:43:28.630406 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:28.785191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:28.790524 (kubelet)[2396]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 3 12:43:28.832844 kubelet[2396]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 3 12:43:28.832844 kubelet[2396]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 3 12:43:28.833940 kubelet[2396]: I0303 12:43:28.833731 2396 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 3 12:43:29.561732 kubelet[2396]: I0303 12:43:29.561647 2396 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 3 12:43:29.561732 kubelet[2396]: I0303 12:43:29.561693 2396 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 3 12:43:29.561732 kubelet[2396]: I0303 12:43:29.561713 2396 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 3 12:43:29.561732 kubelet[2396]: I0303 12:43:29.561720 2396 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 3 12:43:29.562036 kubelet[2396]: I0303 12:43:29.561958 2396 server.go:956] "Client rotation is on, will bootstrap in background" Mar 3 12:43:29.570864 kubelet[2396]: E0303 12:43:29.570808 2396 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://78.47.249.221:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 3 12:43:29.573743 kubelet[2396]: I0303 12:43:29.573487 2396 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 3 12:43:29.577761 kubelet[2396]: I0303 12:43:29.577737 2396 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 3 12:43:29.580912 kubelet[2396]: I0303 12:43:29.580882 2396 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 3 12:43:29.581424 kubelet[2396]: I0303 12:43:29.581391 2396 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 3 12:43:29.581660 kubelet[2396]: I0303 12:43:29.581498 2396 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-8-fcaab3b7ef","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 3 12:43:29.581798 kubelet[2396]: I0303 12:43:29.581785 2396 topology_manager.go:138] "Creating topology manager with none policy" Mar 3 
12:43:29.581846 kubelet[2396]: I0303 12:43:29.581839 2396 container_manager_linux.go:306] "Creating device plugin manager" Mar 3 12:43:29.581998 kubelet[2396]: I0303 12:43:29.581982 2396 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 3 12:43:29.584826 kubelet[2396]: I0303 12:43:29.584794 2396 state_mem.go:36] "Initialized new in-memory state store" Mar 3 12:43:29.586804 kubelet[2396]: I0303 12:43:29.586776 2396 kubelet.go:475] "Attempting to sync node with API server" Mar 3 12:43:29.586916 kubelet[2396]: I0303 12:43:29.586904 2396 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 3 12:43:29.588166 kubelet[2396]: E0303 12:43:29.587345 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://78.47.249.221:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-8-fcaab3b7ef&limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 3 12:43:29.588166 kubelet[2396]: I0303 12:43:29.587761 2396 kubelet.go:387] "Adding apiserver pod source" Mar 3 12:43:29.588166 kubelet[2396]: I0303 12:43:29.587788 2396 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 3 12:43:29.590374 kubelet[2396]: E0303 12:43:29.590293 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://78.47.249.221:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 3 12:43:29.590899 kubelet[2396]: I0303 12:43:29.590864 2396 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 3 12:43:29.591879 kubelet[2396]: I0303 12:43:29.591846 2396 kubelet.go:940] "Not 
starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 3 12:43:29.592028 kubelet[2396]: I0303 12:43:29.591901 2396 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 3 12:43:29.592028 kubelet[2396]: W0303 12:43:29.591958 2396 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 3 12:43:29.596171 kubelet[2396]: I0303 12:43:29.594805 2396 server.go:1262] "Started kubelet" Mar 3 12:43:29.596645 kubelet[2396]: I0303 12:43:29.596618 2396 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 3 12:43:29.597567 kubelet[2396]: I0303 12:43:29.597547 2396 server.go:310] "Adding debug handlers to kubelet server" Mar 3 12:43:29.599506 kubelet[2396]: I0303 12:43:29.599180 2396 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 3 12:43:29.599506 kubelet[2396]: I0303 12:43:29.599269 2396 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 3 12:43:29.599595 kubelet[2396]: I0303 12:43:29.599546 2396 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 3 12:43:29.601215 kubelet[2396]: E0303 12:43:29.599670 2396 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://78.47.249.221:6443/api/v1/namespaces/default/events\": dial tcp 78.47.249.221:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-8-fcaab3b7ef.189955626e6cb9bf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-8-fcaab3b7ef,UID:ci-4459-2-4-8-fcaab3b7ef,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-8-fcaab3b7ef,},FirstTimestamp:2026-03-03 12:43:29.594775999 +0000 UTC m=+0.801074338,LastTimestamp:2026-03-03 12:43:29.594775999 +0000 UTC m=+0.801074338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-8-fcaab3b7ef,}" Mar 3 12:43:29.602549 kubelet[2396]: I0303 12:43:29.602526 2396 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 3 12:43:29.605253 kubelet[2396]: I0303 12:43:29.605218 2396 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 3 12:43:29.606301 kubelet[2396]: I0303 12:43:29.606270 2396 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 3 12:43:29.606385 kubelet[2396]: I0303 12:43:29.606369 2396 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 3 12:43:29.606425 kubelet[2396]: I0303 12:43:29.606419 2396 reconciler.go:29] "Reconciler: start to sync state" Mar 3 12:43:29.606847 kubelet[2396]: E0303 12:43:29.606775 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://78.47.249.221:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 3 12:43:29.606960 kubelet[2396]: E0303 12:43:29.606938 2396 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" Mar 3 12:43:29.607027 kubelet[2396]: E0303 12:43:29.607005 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.47.249.221:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-8-fcaab3b7ef?timeout=10s\": dial tcp 78.47.249.221:6443: connect: connection 
refused" interval="200ms" Mar 3 12:43:29.608232 kubelet[2396]: I0303 12:43:29.608077 2396 factory.go:223] Registration of the systemd container factory successfully Mar 3 12:43:29.608514 kubelet[2396]: I0303 12:43:29.608476 2396 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 3 12:43:29.610876 kubelet[2396]: E0303 12:43:29.610850 2396 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 3 12:43:29.610998 kubelet[2396]: I0303 12:43:29.610965 2396 factory.go:223] Registration of the containerd container factory successfully Mar 3 12:43:29.624813 kubelet[2396]: I0303 12:43:29.624761 2396 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 3 12:43:29.624813 kubelet[2396]: I0303 12:43:29.624781 2396 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 3 12:43:29.624813 kubelet[2396]: I0303 12:43:29.624799 2396 state_mem.go:36] "Initialized new in-memory state store" Mar 3 12:43:29.627878 kubelet[2396]: I0303 12:43:29.626986 2396 policy_none.go:49] "None policy: Start" Mar 3 12:43:29.627878 kubelet[2396]: I0303 12:43:29.627011 2396 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 3 12:43:29.627878 kubelet[2396]: I0303 12:43:29.627021 2396 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 3 12:43:29.629056 kubelet[2396]: I0303 12:43:29.629031 2396 policy_none.go:47] "Start" Mar 3 12:43:29.631393 kubelet[2396]: I0303 12:43:29.631360 2396 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 3 12:43:29.634173 kubelet[2396]: I0303 12:43:29.634003 2396 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 3 12:43:29.634173 kubelet[2396]: I0303 12:43:29.634027 2396 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 3 12:43:29.634173 kubelet[2396]: I0303 12:43:29.634051 2396 kubelet.go:2428] "Starting kubelet main sync loop" Mar 3 12:43:29.634173 kubelet[2396]: E0303 12:43:29.634091 2396 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 3 12:43:29.636814 kubelet[2396]: E0303 12:43:29.636738 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://78.47.249.221:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 3 12:43:29.641564 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 3 12:43:29.659178 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 3 12:43:29.664194 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 3 12:43:29.674047 kubelet[2396]: E0303 12:43:29.673987 2396 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 3 12:43:29.675300 kubelet[2396]: I0303 12:43:29.675191 2396 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 3 12:43:29.676284 kubelet[2396]: I0303 12:43:29.675260 2396 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 3 12:43:29.677859 kubelet[2396]: I0303 12:43:29.677656 2396 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 3 12:43:29.678579 kubelet[2396]: E0303 12:43:29.678412 2396 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 3 12:43:29.678579 kubelet[2396]: E0303 12:43:29.678457 2396 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-8-fcaab3b7ef\" not found" Mar 3 12:43:29.752081 systemd[1]: Created slice kubepods-burstable-pod96fb5cc848dc95025b91a102a915ef35.slice - libcontainer container kubepods-burstable-pod96fb5cc848dc95025b91a102a915ef35.slice. Mar 3 12:43:29.761658 kubelet[2396]: E0303 12:43:29.761606 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.767843 systemd[1]: Created slice kubepods-burstable-pod73b7b69b948033a92611cd07d3df2d5a.slice - libcontainer container kubepods-burstable-pod73b7b69b948033a92611cd07d3df2d5a.slice. 
Mar 3 12:43:29.772399 kubelet[2396]: E0303 12:43:29.772369 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.773077 systemd[1]: Created slice kubepods-burstable-pod5d467f8dcfbe6c00d6340d1d8184593d.slice - libcontainer container kubepods-burstable-pod5d467f8dcfbe6c00d6340d1d8184593d.slice. Mar 3 12:43:29.774816 kubelet[2396]: E0303 12:43:29.774794 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.778637 kubelet[2396]: I0303 12:43:29.778576 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.779338 kubelet[2396]: E0303 12:43:29.779301 2396 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://78.47.249.221:6443/api/v1/nodes\": dial tcp 78.47.249.221:6443: connect: connection refused" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.808252 kubelet[2396]: E0303 12:43:29.808058 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.47.249.221:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-8-fcaab3b7ef?timeout=10s\": dial tcp 78.47.249.221:6443: connect: connection refused" interval="400ms" Mar 3 12:43:29.907750 kubelet[2396]: I0303 12:43:29.907636 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.908595 kubelet[2396]: I0303 12:43:29.908341 2396 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5d467f8dcfbe6c00d6340d1d8184593d-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"5d467f8dcfbe6c00d6340d1d8184593d\") " pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.908595 kubelet[2396]: I0303 12:43:29.908441 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/96fb5cc848dc95025b91a102a915ef35-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"96fb5cc848dc95025b91a102a915ef35\") " pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.908595 kubelet[2396]: I0303 12:43:29.908477 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.908595 kubelet[2396]: I0303 12:43:29.908511 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.908595 kubelet[2396]: I0303 12:43:29.908543 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " 
pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.909063 kubelet[2396]: I0303 12:43:29.908606 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/96fb5cc848dc95025b91a102a915ef35-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"96fb5cc848dc95025b91a102a915ef35\") " pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.909063 kubelet[2396]: I0303 12:43:29.908680 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/96fb5cc848dc95025b91a102a915ef35-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"96fb5cc848dc95025b91a102a915ef35\") " pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.909063 kubelet[2396]: I0303 12:43:29.908726 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.981839 kubelet[2396]: I0303 12:43:29.981714 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:29.982447 kubelet[2396]: E0303 12:43:29.982409 2396 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://78.47.249.221:6443/api/v1/nodes\": dial tcp 78.47.249.221:6443: connect: connection refused" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:30.066242 containerd[1519]: time="2026-03-03T12:43:30.066156381Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-8-fcaab3b7ef,Uid:96fb5cc848dc95025b91a102a915ef35,Namespace:kube-system,Attempt:0,}" Mar 3 12:43:30.075401 containerd[1519]: time="2026-03-03T12:43:30.075330386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef,Uid:73b7b69b948033a92611cd07d3df2d5a,Namespace:kube-system,Attempt:0,}" Mar 3 12:43:30.077986 containerd[1519]: time="2026-03-03T12:43:30.077767245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-8-fcaab3b7ef,Uid:5d467f8dcfbe6c00d6340d1d8184593d,Namespace:kube-system,Attempt:0,}" Mar 3 12:43:30.209081 kubelet[2396]: E0303 12:43:30.208914 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.47.249.221:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-8-fcaab3b7ef?timeout=10s\": dial tcp 78.47.249.221:6443: connect: connection refused" interval="800ms" Mar 3 12:43:30.384908 kubelet[2396]: I0303 12:43:30.384846 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:30.385701 kubelet[2396]: E0303 12:43:30.385660 2396 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://78.47.249.221:6443/api/v1/nodes\": dial tcp 78.47.249.221:6443: connect: connection refused" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:30.522358 kubelet[2396]: E0303 12:43:30.521967 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://78.47.249.221:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 3 12:43:30.539543 kubelet[2396]: E0303 12:43:30.539422 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get 
\"https://78.47.249.221:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 3 12:43:30.583520 kubelet[2396]: E0303 12:43:30.583420 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://78.47.249.221:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-8-fcaab3b7ef&limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 3 12:43:30.605753 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4114578391.mount: Deactivated successfully. Mar 3 12:43:30.613170 containerd[1519]: time="2026-03-03T12:43:30.613093678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 12:43:30.616093 containerd[1519]: time="2026-03-03T12:43:30.616026438Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Mar 3 12:43:30.617828 containerd[1519]: time="2026-03-03T12:43:30.617742128Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 12:43:30.625380 containerd[1519]: time="2026-03-03T12:43:30.624880964Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 3 12:43:30.627463 containerd[1519]: time="2026-03-03T12:43:30.627175085Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 3 12:43:30.627463 containerd[1519]: time="2026-03-03T12:43:30.627291940Z" level=info msg="ImageCreate event 
name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 12:43:30.630322 containerd[1519]: time="2026-03-03T12:43:30.630219179Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 560.502721ms" Mar 3 12:43:30.630805 containerd[1519]: time="2026-03-03T12:43:30.630777327Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 12:43:30.632029 containerd[1519]: time="2026-03-03T12:43:30.631988876Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 552.809778ms" Mar 3 12:43:30.632772 containerd[1519]: time="2026-03-03T12:43:30.632739008Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 12:43:30.633294 containerd[1519]: time="2026-03-03T12:43:30.633268913Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 555.977326ms" Mar 3 12:43:30.666333 containerd[1519]: time="2026-03-03T12:43:30.666280643Z" level=info msg="connecting to shim c805280eb2a7584f6f6b34c1d0087f8a7c0d3e8a0973d2a7cc5a93be8d4b8150" address="unix:///run/containerd/s/ed889e1528e993a02ffd572af02fcb00953be0b9f2f8243fb401d5601ea87860" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:43:30.667141 containerd[1519]: time="2026-03-03T12:43:30.667081861Z" level=info msg="connecting to shim 83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7" address="unix:///run/containerd/s/df42b206f8201e7ae8ca83c9347846ee090bfbf9b2ba02326c061ff64c901682" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:43:30.687056 containerd[1519]: time="2026-03-03T12:43:30.686565571Z" level=info msg="connecting to shim d1755bc9764f97f682b748ecc4b3f5d1c50677188aa0aa35aee8a7e3d4761264" address="unix:///run/containerd/s/9403ad5ae06fd0e2791f2baa6efc806e34a189b34a006b8e92d56e5b216775bf" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:43:30.704691 systemd[1]: Started cri-containerd-83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7.scope - libcontainer container 83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7. Mar 3 12:43:30.709423 systemd[1]: Started cri-containerd-c805280eb2a7584f6f6b34c1d0087f8a7c0d3e8a0973d2a7cc5a93be8d4b8150.scope - libcontainer container c805280eb2a7584f6f6b34c1d0087f8a7c0d3e8a0973d2a7cc5a93be8d4b8150. Mar 3 12:43:30.740417 systemd[1]: Started cri-containerd-d1755bc9764f97f682b748ecc4b3f5d1c50677188aa0aa35aee8a7e3d4761264.scope - libcontainer container d1755bc9764f97f682b748ecc4b3f5d1c50677188aa0aa35aee8a7e3d4761264. 
Mar 3 12:43:30.775390 containerd[1519]: time="2026-03-03T12:43:30.775049306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef,Uid:73b7b69b948033a92611cd07d3df2d5a,Namespace:kube-system,Attempt:0,} returns sandbox id \"83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7\"" Mar 3 12:43:30.784512 containerd[1519]: time="2026-03-03T12:43:30.784463541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-8-fcaab3b7ef,Uid:96fb5cc848dc95025b91a102a915ef35,Namespace:kube-system,Attempt:0,} returns sandbox id \"c805280eb2a7584f6f6b34c1d0087f8a7c0d3e8a0973d2a7cc5a93be8d4b8150\"" Mar 3 12:43:30.805907 containerd[1519]: time="2026-03-03T12:43:30.805393469Z" level=info msg="CreateContainer within sandbox \"83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 3 12:43:30.812596 containerd[1519]: time="2026-03-03T12:43:30.812553427Z" level=info msg="CreateContainer within sandbox \"c805280eb2a7584f6f6b34c1d0087f8a7c0d3e8a0973d2a7cc5a93be8d4b8150\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 3 12:43:30.816347 containerd[1519]: time="2026-03-03T12:43:30.816307127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-8-fcaab3b7ef,Uid:5d467f8dcfbe6c00d6340d1d8184593d,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1755bc9764f97f682b748ecc4b3f5d1c50677188aa0aa35aee8a7e3d4761264\"" Mar 3 12:43:30.836642 containerd[1519]: time="2026-03-03T12:43:30.836089714Z" level=info msg="CreateContainer within sandbox \"d1755bc9764f97f682b748ecc4b3f5d1c50677188aa0aa35aee8a7e3d4761264\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 3 12:43:30.838069 containerd[1519]: time="2026-03-03T12:43:30.837784402Z" level=info msg="Container 073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220: CDI devices from CRI 
Config.CDIDevices: []" Mar 3 12:43:30.841795 containerd[1519]: time="2026-03-03T12:43:30.841764210Z" level=info msg="Container 746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:43:30.879068 containerd[1519]: time="2026-03-03T12:43:30.878973455Z" level=info msg="CreateContainer within sandbox \"83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220\"" Mar 3 12:43:30.880406 containerd[1519]: time="2026-03-03T12:43:30.880363226Z" level=info msg="StartContainer for \"073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220\"" Mar 3 12:43:30.882659 containerd[1519]: time="2026-03-03T12:43:30.882599260Z" level=info msg="connecting to shim 073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220" address="unix:///run/containerd/s/df42b206f8201e7ae8ca83c9347846ee090bfbf9b2ba02326c061ff64c901682" protocol=ttrpc version=3 Mar 3 12:43:30.888216 containerd[1519]: time="2026-03-03T12:43:30.888090734Z" level=info msg="CreateContainer within sandbox \"c805280eb2a7584f6f6b34c1d0087f8a7c0d3e8a0973d2a7cc5a93be8d4b8150\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657\"" Mar 3 12:43:30.888540 containerd[1519]: time="2026-03-03T12:43:30.888418054Z" level=info msg="Container ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:43:30.889016 containerd[1519]: time="2026-03-03T12:43:30.888989484Z" level=info msg="StartContainer for \"746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657\"" Mar 3 12:43:30.891719 containerd[1519]: time="2026-03-03T12:43:30.891673613Z" level=info msg="connecting to shim 746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657" 
address="unix:///run/containerd/s/ed889e1528e993a02ffd572af02fcb00953be0b9f2f8243fb401d5601ea87860" protocol=ttrpc version=3 Mar 3 12:43:30.903121 containerd[1519]: time="2026-03-03T12:43:30.903066051Z" level=info msg="CreateContainer within sandbox \"d1755bc9764f97f682b748ecc4b3f5d1c50677188aa0aa35aee8a7e3d4761264\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0\"" Mar 3 12:43:30.905289 containerd[1519]: time="2026-03-03T12:43:30.904097457Z" level=info msg="StartContainer for \"ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0\"" Mar 3 12:43:30.905289 containerd[1519]: time="2026-03-03T12:43:30.905208434Z" level=info msg="connecting to shim ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0" address="unix:///run/containerd/s/9403ad5ae06fd0e2791f2baa6efc806e34a189b34a006b8e92d56e5b216775bf" protocol=ttrpc version=3 Mar 3 12:43:30.907361 systemd[1]: Started cri-containerd-073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220.scope - libcontainer container 073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220. Mar 3 12:43:30.919978 kubelet[2396]: E0303 12:43:30.919869 2396 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://78.47.249.221:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 78.47.249.221:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 3 12:43:30.925334 systemd[1]: Started cri-containerd-746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657.scope - libcontainer container 746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657. 
Mar 3 12:43:30.941474 systemd[1]: Started cri-containerd-ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0.scope - libcontainer container ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0. Mar 3 12:43:30.994881 containerd[1519]: time="2026-03-03T12:43:30.994505828Z" level=info msg="StartContainer for \"073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220\" returns successfully" Mar 3 12:43:31.017944 kubelet[2396]: E0303 12:43:31.017642 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.47.249.221:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-8-fcaab3b7ef?timeout=10s\": dial tcp 78.47.249.221:6443: connect: connection refused" interval="1.6s" Mar 3 12:43:31.019638 containerd[1519]: time="2026-03-03T12:43:31.019594017Z" level=info msg="StartContainer for \"746d9d3d832a5182ddb2bf79a6866eac7f36665bcafec8e2bc950b339a6b7657\" returns successfully" Mar 3 12:43:31.035995 containerd[1519]: time="2026-03-03T12:43:31.035273745Z" level=info msg="StartContainer for \"ac8812efd4631c83df879fa68a2469b8f07f7eb61b6facbdbef674c72d2a4ca0\" returns successfully" Mar 3 12:43:31.190006 kubelet[2396]: I0303 12:43:31.189841 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:31.649571 kubelet[2396]: E0303 12:43:31.649296 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:31.652381 kubelet[2396]: E0303 12:43:31.652321 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:31.656489 kubelet[2396]: E0303 12:43:31.656339 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:32.657883 kubelet[2396]: E0303 12:43:32.657854 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:32.658710 kubelet[2396]: E0303 12:43:32.658509 2396 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:33.988195 kubelet[2396]: E0303 12:43:33.988144 2396 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-8-fcaab3b7ef\" not found" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.100216 kubelet[2396]: I0303 12:43:34.099186 2396 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.107639 kubelet[2396]: I0303 12:43:34.107595 2396 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.125199 kubelet[2396]: E0303 12:43:34.125134 2396 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-8-fcaab3b7ef\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.125199 kubelet[2396]: I0303 12:43:34.125190 2396 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.133009 kubelet[2396]: E0303 12:43:34.132782 2396 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.133009 kubelet[2396]: I0303 12:43:34.132814 2396 kubelet.go:3220] "Creating a mirror pod 
for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.136993 kubelet[2396]: E0303 12:43:34.136963 2396 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:34.590848 kubelet[2396]: I0303 12:43:34.590785 2396 apiserver.go:52] "Watching apiserver" Mar 3 12:43:34.607016 kubelet[2396]: I0303 12:43:34.606972 2396 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 3 12:43:35.903559 kubelet[2396]: I0303 12:43:35.903011 2396 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:36.254462 systemd[1]: Reload requested from client PID 2683 ('systemctl') (unit session-7.scope)... Mar 3 12:43:36.254477 systemd[1]: Reloading... Mar 3 12:43:36.376210 zram_generator::config[2733]: No configuration found. Mar 3 12:43:36.591441 systemd[1]: Reloading finished in 336 ms. Mar 3 12:43:36.617242 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:36.631363 systemd[1]: kubelet.service: Deactivated successfully. Mar 3 12:43:36.632661 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:36.632778 systemd[1]: kubelet.service: Consumed 1.245s CPU time, 121.2M memory peak. Mar 3 12:43:36.636324 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 12:43:36.820426 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 12:43:36.830842 (kubelet)[2772]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 3 12:43:36.881058 kubelet[2772]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Mar 3 12:43:36.882127 kubelet[2772]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 3 12:43:36.882127 kubelet[2772]: I0303 12:43:36.881481 2772 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 3 12:43:36.893000 kubelet[2772]: I0303 12:43:36.892970 2772 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 3 12:43:36.893193 kubelet[2772]: I0303 12:43:36.893178 2772 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 3 12:43:36.893371 kubelet[2772]: I0303 12:43:36.893355 2772 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 3 12:43:36.893453 kubelet[2772]: I0303 12:43:36.893438 2772 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 3 12:43:36.893845 kubelet[2772]: I0303 12:43:36.893807 2772 server.go:956] "Client rotation is on, will bootstrap in background" Mar 3 12:43:36.896294 kubelet[2772]: I0303 12:43:36.896205 2772 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 3 12:43:36.899352 kubelet[2772]: I0303 12:43:36.899326 2772 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 3 12:43:36.903692 kubelet[2772]: I0303 12:43:36.903661 2772 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 3 12:43:36.907038 kubelet[2772]: I0303 12:43:36.907013 2772 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 3 12:43:36.907364 kubelet[2772]: I0303 12:43:36.907297 2772 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 3 12:43:36.907494 kubelet[2772]: I0303 12:43:36.907347 2772 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-8-fcaab3b7ef","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 3 12:43:36.907581 kubelet[2772]: I0303 12:43:36.907495 2772 topology_manager.go:138] "Creating topology manager with none policy" Mar 3 
12:43:36.907581 kubelet[2772]: I0303 12:43:36.907504 2772 container_manager_linux.go:306] "Creating device plugin manager" Mar 3 12:43:36.907581 kubelet[2772]: I0303 12:43:36.907531 2772 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 3 12:43:36.907742 kubelet[2772]: I0303 12:43:36.907731 2772 state_mem.go:36] "Initialized new in-memory state store" Mar 3 12:43:36.908010 kubelet[2772]: I0303 12:43:36.907996 2772 kubelet.go:475] "Attempting to sync node with API server" Mar 3 12:43:36.908051 kubelet[2772]: I0303 12:43:36.908015 2772 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 3 12:43:36.908051 kubelet[2772]: I0303 12:43:36.908044 2772 kubelet.go:387] "Adding apiserver pod source" Mar 3 12:43:36.908095 kubelet[2772]: I0303 12:43:36.908069 2772 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 3 12:43:36.910919 kubelet[2772]: I0303 12:43:36.910882 2772 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 3 12:43:36.915124 kubelet[2772]: I0303 12:43:36.915008 2772 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 3 12:43:36.915124 kubelet[2772]: I0303 12:43:36.915061 2772 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 3 12:43:36.926377 kubelet[2772]: I0303 12:43:36.926354 2772 server.go:1262] "Started kubelet" Mar 3 12:43:36.927146 kubelet[2772]: I0303 12:43:36.927028 2772 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 3 12:43:36.927667 kubelet[2772]: I0303 12:43:36.927258 2772 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 3 12:43:36.927667 kubelet[2772]: I0303 12:43:36.927509 2772 server.go:249] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 3 12:43:36.932520 kubelet[2772]: I0303 12:43:36.932492 2772 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 3 12:43:36.943318 kubelet[2772]: I0303 12:43:36.943252 2772 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 3 12:43:36.945099 kubelet[2772]: I0303 12:43:36.945069 2772 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 3 12:43:36.947005 kubelet[2772]: I0303 12:43:36.946984 2772 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 3 12:43:36.947224 kubelet[2772]: E0303 12:43:36.947196 2772 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-8-fcaab3b7ef\" not found" Mar 3 12:43:36.948227 kubelet[2772]: I0303 12:43:36.948208 2772 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 3 12:43:36.950125 kubelet[2772]: I0303 12:43:36.948516 2772 reconciler.go:29] "Reconciler: start to sync state" Mar 3 12:43:36.950125 kubelet[2772]: I0303 12:43:36.949749 2772 server.go:310] "Adding debug handlers to kubelet server" Mar 3 12:43:36.953736 kubelet[2772]: I0303 12:43:36.953718 2772 factory.go:223] Registration of the systemd container factory successfully Mar 3 12:43:36.955885 kubelet[2772]: E0303 12:43:36.955856 2772 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 3 12:43:36.957299 kubelet[2772]: I0303 12:43:36.957272 2772 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 3 12:43:36.959978 kubelet[2772]: I0303 12:43:36.959948 2772 factory.go:223] Registration of the containerd container factory successfully Mar 3 12:43:36.969490 kubelet[2772]: I0303 12:43:36.969444 2772 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 3 12:43:36.971315 kubelet[2772]: I0303 12:43:36.971163 2772 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 3 12:43:36.971315 kubelet[2772]: I0303 12:43:36.971205 2772 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 3 12:43:36.971315 kubelet[2772]: I0303 12:43:36.971226 2772 kubelet.go:2428] "Starting kubelet main sync loop" Mar 3 12:43:36.971315 kubelet[2772]: E0303 12:43:36.971266 2772 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 3 12:43:37.014974 kubelet[2772]: I0303 12:43:37.014939 2772 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 3 12:43:37.014974 kubelet[2772]: I0303 12:43:37.014959 2772 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 3 12:43:37.014974 kubelet[2772]: I0303 12:43:37.014981 2772 state_mem.go:36] "Initialized new in-memory state store" Mar 3 12:43:37.016261 kubelet[2772]: I0303 12:43:37.015138 2772 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 3 12:43:37.016261 kubelet[2772]: I0303 12:43:37.015148 2772 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 3 12:43:37.016261 kubelet[2772]: I0303 12:43:37.015164 2772 policy_none.go:49] "None policy: Start" Mar 3 12:43:37.016261 kubelet[2772]: I0303 
12:43:37.015173 2772 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 3 12:43:37.016261 kubelet[2772]: I0303 12:43:37.015182 2772 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 3 12:43:37.016261 kubelet[2772]: I0303 12:43:37.015303 2772 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 3 12:43:37.016261 kubelet[2772]: I0303 12:43:37.015311 2772 policy_none.go:47] "Start" Mar 3 12:43:37.021501 kubelet[2772]: E0303 12:43:37.021461 2772 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 3 12:43:37.021815 kubelet[2772]: I0303 12:43:37.021789 2772 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 3 12:43:37.021894 kubelet[2772]: I0303 12:43:37.021806 2772 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 3 12:43:37.023381 kubelet[2772]: I0303 12:43:37.023348 2772 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 3 12:43:37.026932 kubelet[2772]: E0303 12:43:37.026874 2772 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 3 12:43:37.072949 kubelet[2772]: I0303 12:43:37.072892 2772 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.074033 kubelet[2772]: I0303 12:43:37.072907 2772 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.074985 kubelet[2772]: I0303 12:43:37.073017 2772 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.083697 kubelet[2772]: E0303 12:43:37.083657 2772 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-8-fcaab3b7ef\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.125744 kubelet[2772]: I0303 12:43:37.125371 2772 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.137094 kubelet[2772]: I0303 12:43:37.136931 2772 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.137094 kubelet[2772]: I0303 12:43:37.137022 2772 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151173 kubelet[2772]: I0303 12:43:37.150961 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/96fb5cc848dc95025b91a102a915ef35-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"96fb5cc848dc95025b91a102a915ef35\") " pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151173 kubelet[2772]: I0303 12:43:37.151016 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/96fb5cc848dc95025b91a102a915ef35-usr-share-ca-certificates\") pod 
\"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"96fb5cc848dc95025b91a102a915ef35\") " pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151173 kubelet[2772]: I0303 12:43:37.151039 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151173 kubelet[2772]: I0303 12:43:37.151058 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/96fb5cc848dc95025b91a102a915ef35-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"96fb5cc848dc95025b91a102a915ef35\") " pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151173 kubelet[2772]: I0303 12:43:37.151081 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151616 kubelet[2772]: I0303 12:43:37.151099 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151616 kubelet[2772]: I0303 12:43:37.151151 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151616 kubelet[2772]: I0303 12:43:37.151173 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/73b7b69b948033a92611cd07d3df2d5a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"73b7b69b948033a92611cd07d3df2d5a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.151616 kubelet[2772]: I0303 12:43:37.151193 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5d467f8dcfbe6c00d6340d1d8184593d-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-8-fcaab3b7ef\" (UID: \"5d467f8dcfbe6c00d6340d1d8184593d\") " pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:37.909694 kubelet[2772]: I0303 12:43:37.909639 2772 apiserver.go:52] "Watching apiserver" Mar 3 12:43:37.950697 kubelet[2772]: I0303 12:43:37.950620 2772 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 3 12:43:37.994950 kubelet[2772]: I0303 12:43:37.994545 2772 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:38.006321 kubelet[2772]: E0303 12:43:38.006285 2772 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-8-fcaab3b7ef\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:43:38.047465 kubelet[2772]: I0303 12:43:38.047388 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-ci-4459-2-4-8-fcaab3b7ef" podStartSLOduration=1.047368693 podStartE2EDuration="1.047368693s" podCreationTimestamp="2026-03-03 12:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 12:43:38.028863239 +0000 UTC m=+1.193796398" watchObservedRunningTime="2026-03-03 12:43:38.047368693 +0000 UTC m=+1.212301852" Mar 3 12:43:38.063208 kubelet[2772]: I0303 12:43:38.063044 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-8-fcaab3b7ef" podStartSLOduration=3.063025886 podStartE2EDuration="3.063025886s" podCreationTimestamp="2026-03-03 12:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 12:43:38.048971879 +0000 UTC m=+1.213905078" watchObservedRunningTime="2026-03-03 12:43:38.063025886 +0000 UTC m=+1.227959045" Mar 3 12:43:38.076430 kubelet[2772]: I0303 12:43:38.076242 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-8-fcaab3b7ef" podStartSLOduration=1.076223254 podStartE2EDuration="1.076223254s" podCreationTimestamp="2026-03-03 12:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 12:43:38.063457525 +0000 UTC m=+1.228390684" watchObservedRunningTime="2026-03-03 12:43:38.076223254 +0000 UTC m=+1.241156413" Mar 3 12:43:41.228745 kubelet[2772]: I0303 12:43:41.228492 2772 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 3 12:43:41.229372 containerd[1519]: time="2026-03-03T12:43:41.229034527Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 3 12:43:41.230607 kubelet[2772]: I0303 12:43:41.230001 2772 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 3 12:43:42.424400 systemd[1]: Created slice kubepods-besteffort-pod568b6726_d168_4793_9af4_7d1742734e17.slice - libcontainer container kubepods-besteffort-pod568b6726_d168_4793_9af4_7d1742734e17.slice. Mar 3 12:43:42.488146 kubelet[2772]: I0303 12:43:42.487858 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/568b6726-d168-4793-9af4-7d1742734e17-xtables-lock\") pod \"kube-proxy-bx6cb\" (UID: \"568b6726-d168-4793-9af4-7d1742734e17\") " pod="kube-system/kube-proxy-bx6cb" Mar 3 12:43:42.488469 kubelet[2772]: I0303 12:43:42.488331 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/568b6726-d168-4793-9af4-7d1742734e17-kube-proxy\") pod \"kube-proxy-bx6cb\" (UID: \"568b6726-d168-4793-9af4-7d1742734e17\") " pod="kube-system/kube-proxy-bx6cb" Mar 3 12:43:42.488469 kubelet[2772]: I0303 12:43:42.488359 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/568b6726-d168-4793-9af4-7d1742734e17-lib-modules\") pod \"kube-proxy-bx6cb\" (UID: \"568b6726-d168-4793-9af4-7d1742734e17\") " pod="kube-system/kube-proxy-bx6cb" Mar 3 12:43:42.488469 kubelet[2772]: I0303 12:43:42.488389 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xzlg\" (UniqueName: \"kubernetes.io/projected/568b6726-d168-4793-9af4-7d1742734e17-kube-api-access-8xzlg\") pod \"kube-proxy-bx6cb\" (UID: \"568b6726-d168-4793-9af4-7d1742734e17\") " pod="kube-system/kube-proxy-bx6cb" Mar 3 12:43:42.494217 systemd[1]: Created slice kubepods-besteffort-pod448fb59c_eea1_45c0_8d13_b5ee8109e22e.slice - 
libcontainer container kubepods-besteffort-pod448fb59c_eea1_45c0_8d13_b5ee8109e22e.slice. Mar 3 12:43:42.589459 kubelet[2772]: I0303 12:43:42.589385 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2gd\" (UniqueName: \"kubernetes.io/projected/448fb59c-eea1-45c0-8d13-b5ee8109e22e-kube-api-access-pc2gd\") pod \"tigera-operator-5588576f44-6chcz\" (UID: \"448fb59c-eea1-45c0-8d13-b5ee8109e22e\") " pod="tigera-operator/tigera-operator-5588576f44-6chcz" Mar 3 12:43:42.589902 kubelet[2772]: I0303 12:43:42.589875 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/448fb59c-eea1-45c0-8d13-b5ee8109e22e-var-lib-calico\") pod \"tigera-operator-5588576f44-6chcz\" (UID: \"448fb59c-eea1-45c0-8d13-b5ee8109e22e\") " pod="tigera-operator/tigera-operator-5588576f44-6chcz" Mar 3 12:43:42.736620 containerd[1519]: time="2026-03-03T12:43:42.736497613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bx6cb,Uid:568b6726-d168-4793-9af4-7d1742734e17,Namespace:kube-system,Attempt:0,}" Mar 3 12:43:42.760688 containerd[1519]: time="2026-03-03T12:43:42.760588166Z" level=info msg="connecting to shim 0c00c2d0ed9955c0f1eeddc2d4d2e00c90285f7152476bfbfb08c143075b2127" address="unix:///run/containerd/s/a21681ae95c6777c789994fee7c4c7c94c8f6911ecf40f4c79c6b62556e510d6" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:43:42.790318 systemd[1]: Started cri-containerd-0c00c2d0ed9955c0f1eeddc2d4d2e00c90285f7152476bfbfb08c143075b2127.scope - libcontainer container 0c00c2d0ed9955c0f1eeddc2d4d2e00c90285f7152476bfbfb08c143075b2127. 
Mar 3 12:43:42.803733 containerd[1519]: time="2026-03-03T12:43:42.803672937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-6chcz,Uid:448fb59c-eea1-45c0-8d13-b5ee8109e22e,Namespace:tigera-operator,Attempt:0,}" Mar 3 12:43:42.826406 containerd[1519]: time="2026-03-03T12:43:42.826352135Z" level=info msg="connecting to shim c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6" address="unix:///run/containerd/s/71587103546140843c8514e03a1a174366315456d97ad834502fe29c05a99de6" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:43:42.833489 containerd[1519]: time="2026-03-03T12:43:42.833436910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bx6cb,Uid:568b6726-d168-4793-9af4-7d1742734e17,Namespace:kube-system,Attempt:0,} returns sandbox id \"0c00c2d0ed9955c0f1eeddc2d4d2e00c90285f7152476bfbfb08c143075b2127\"" Mar 3 12:43:42.841978 containerd[1519]: time="2026-03-03T12:43:42.841731462Z" level=info msg="CreateContainer within sandbox \"0c00c2d0ed9955c0f1eeddc2d4d2e00c90285f7152476bfbfb08c143075b2127\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 3 12:43:42.854208 containerd[1519]: time="2026-03-03T12:43:42.854160269Z" level=info msg="Container c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:43:42.854688 systemd[1]: Started cri-containerd-c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6.scope - libcontainer container c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6. 
Mar 3 12:43:42.873085 containerd[1519]: time="2026-03-03T12:43:42.873008357Z" level=info msg="CreateContainer within sandbox \"0c00c2d0ed9955c0f1eeddc2d4d2e00c90285f7152476bfbfb08c143075b2127\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa\"" Mar 3 12:43:42.874225 containerd[1519]: time="2026-03-03T12:43:42.874067603Z" level=info msg="StartContainer for \"c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa\"" Mar 3 12:43:42.876133 containerd[1519]: time="2026-03-03T12:43:42.876057364Z" level=info msg="connecting to shim c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa" address="unix:///run/containerd/s/a21681ae95c6777c789994fee7c4c7c94c8f6911ecf40f4c79c6b62556e510d6" protocol=ttrpc version=3 Mar 3 12:43:42.900329 systemd[1]: Started cri-containerd-c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa.scope - libcontainer container c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa. 
Mar 3 12:43:42.915143 containerd[1519]: time="2026-03-03T12:43:42.915043963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-6chcz,Uid:448fb59c-eea1-45c0-8d13-b5ee8109e22e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6\"" Mar 3 12:43:42.918967 containerd[1519]: time="2026-03-03T12:43:42.918841431Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 3 12:43:42.977831 containerd[1519]: time="2026-03-03T12:43:42.977771447Z" level=info msg="StartContainer for \"c6c13c4bb8b45df3a486bb5655121a5536affc0440877eadaa5ff5744bcd1aaa\" returns successfully" Mar 3 12:43:43.027839 kubelet[2772]: I0303 12:43:43.027544 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bx6cb" podStartSLOduration=1.027527541 podStartE2EDuration="1.027527541s" podCreationTimestamp="2026-03-03 12:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 12:43:43.027413732 +0000 UTC m=+6.192346891" watchObservedRunningTime="2026-03-03 12:43:43.027527541 +0000 UTC m=+6.192460700" Mar 3 12:43:44.666102 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2214451210.mount: Deactivated successfully. 
Mar 3 12:43:45.446823 containerd[1519]: time="2026-03-03T12:43:45.446771538Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:45.447919 containerd[1519]: time="2026-03-03T12:43:45.447874660Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 3 12:43:45.449601 containerd[1519]: time="2026-03-03T12:43:45.449365692Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:45.452735 containerd[1519]: time="2026-03-03T12:43:45.452699341Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:43:45.453622 containerd[1519]: time="2026-03-03T12:43:45.453590928Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.534430471s" Mar 3 12:43:45.453792 containerd[1519]: time="2026-03-03T12:43:45.453717177Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 3 12:43:45.460668 containerd[1519]: time="2026-03-03T12:43:45.460607133Z" level=info msg="CreateContainer within sandbox \"c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 3 12:43:45.474196 containerd[1519]: time="2026-03-03T12:43:45.474137064Z" level=info msg="Container 
7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:43:45.482590 containerd[1519]: time="2026-03-03T12:43:45.482518411Z" level=info msg="CreateContainer within sandbox \"c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a\"" Mar 3 12:43:45.485211 containerd[1519]: time="2026-03-03T12:43:45.484335307Z" level=info msg="StartContainer for \"7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a\"" Mar 3 12:43:45.486922 containerd[1519]: time="2026-03-03T12:43:45.486771369Z" level=info msg="connecting to shim 7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a" address="unix:///run/containerd/s/71587103546140843c8514e03a1a174366315456d97ad834502fe29c05a99de6" protocol=ttrpc version=3 Mar 3 12:43:45.509406 systemd[1]: Started cri-containerd-7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a.scope - libcontainer container 7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a. 
Mar 3 12:43:45.546843 containerd[1519]: time="2026-03-03T12:43:45.546782177Z" level=info msg="StartContainer for \"7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a\" returns successfully" Mar 3 12:43:48.719485 kubelet[2772]: I0303 12:43:48.719391 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-6chcz" podStartSLOduration=4.181896372 podStartE2EDuration="6.719366723s" podCreationTimestamp="2026-03-03 12:43:42 +0000 UTC" firstStartedPulling="2026-03-03 12:43:42.917049286 +0000 UTC m=+6.081982445" lastFinishedPulling="2026-03-03 12:43:45.454519677 +0000 UTC m=+8.619452796" observedRunningTime="2026-03-03 12:43:46.040480662 +0000 UTC m=+9.205413901" watchObservedRunningTime="2026-03-03 12:43:48.719366723 +0000 UTC m=+11.884299882" Mar 3 12:43:51.660481 sudo[1799]: pam_unix(sudo:session): session closed for user root Mar 3 12:43:51.757399 sshd[1798]: Connection closed by 20.161.92.111 port 48414 Mar 3 12:43:51.757220 sshd-session[1795]: pam_unix(sshd:session): session closed for user core Mar 3 12:43:51.763849 systemd[1]: sshd@6-78.47.249.221:22-20.161.92.111:48414.service: Deactivated successfully. Mar 3 12:43:51.765412 systemd-logind[1493]: Session 7 logged out. Waiting for processes to exit. Mar 3 12:43:51.769620 systemd[1]: session-7.scope: Deactivated successfully. Mar 3 12:43:51.770533 systemd[1]: session-7.scope: Consumed 6.135s CPU time, 218.6M memory peak. Mar 3 12:43:51.774551 systemd-logind[1493]: Removed session 7. Mar 3 12:43:58.701985 systemd[1]: Created slice kubepods-besteffort-podb38cc3bb_6b73_485a_bd63_ac1721ba34f5.slice - libcontainer container kubepods-besteffort-podb38cc3bb_6b73_485a_bd63_ac1721ba34f5.slice. 
Mar 3 12:43:58.798565 kubelet[2772]: I0303 12:43:58.798512 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8bf4\" (UniqueName: \"kubernetes.io/projected/b38cc3bb-6b73-485a-bd63-ac1721ba34f5-kube-api-access-p8bf4\") pod \"calico-typha-54c8d886f7-nt25q\" (UID: \"b38cc3bb-6b73-485a-bd63-ac1721ba34f5\") " pod="calico-system/calico-typha-54c8d886f7-nt25q" Mar 3 12:43:58.798565 kubelet[2772]: I0303 12:43:58.798571 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b38cc3bb-6b73-485a-bd63-ac1721ba34f5-tigera-ca-bundle\") pod \"calico-typha-54c8d886f7-nt25q\" (UID: \"b38cc3bb-6b73-485a-bd63-ac1721ba34f5\") " pod="calico-system/calico-typha-54c8d886f7-nt25q" Mar 3 12:43:58.799042 kubelet[2772]: I0303 12:43:58.798590 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b38cc3bb-6b73-485a-bd63-ac1721ba34f5-typha-certs\") pod \"calico-typha-54c8d886f7-nt25q\" (UID: \"b38cc3bb-6b73-485a-bd63-ac1721ba34f5\") " pod="calico-system/calico-typha-54c8d886f7-nt25q" Mar 3 12:43:58.922247 systemd[1]: Created slice kubepods-besteffort-podb92cc6ee_deb8_4e84_ae35_6d71a47771dc.slice - libcontainer container kubepods-besteffort-podb92cc6ee_deb8_4e84_ae35_6d71a47771dc.slice. 
Mar 3 12:43:59.000460 kubelet[2772]: I0303 12:43:58.999523 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-bpffs\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.000460 kubelet[2772]: I0303 12:43:59.000384 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-cni-net-dir\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.000460 kubelet[2772]: I0303 12:43:59.000414 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-var-run-calico\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001181 kubelet[2772]: I0303 12:43:59.000674 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5x2h\" (UniqueName: \"kubernetes.io/projected/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-kube-api-access-h5x2h\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001181 kubelet[2772]: I0303 12:43:59.001142 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-flexvol-driver-host\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001469 kubelet[2772]: I0303 12:43:59.001387 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-cni-bin-dir\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001708 kubelet[2772]: I0303 12:43:59.001648 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-node-certs\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001708 kubelet[2772]: I0303 12:43:59.001677 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-cni-log-dir\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001945 kubelet[2772]: I0303 12:43:59.001905 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-lib-modules\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.001945 kubelet[2772]: I0303 12:43:59.001929 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-policysync\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.002207 kubelet[2772]: I0303 12:43:59.002189 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-sys-fs\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.002437 kubelet[2772]: I0303 12:43:59.002413 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-nodeproc\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.002662 kubelet[2772]: I0303 12:43:59.002490 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-tigera-ca-bundle\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.002662 kubelet[2772]: I0303 12:43:59.002525 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-var-lib-calico\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.002799 kubelet[2772]: I0303 12:43:59.002781 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b92cc6ee-deb8-4e84-ae35-6d71a47771dc-xtables-lock\") pod \"calico-node-d8vgh\" (UID: \"b92cc6ee-deb8-4e84-ae35-6d71a47771dc\") " pod="calico-system/calico-node-d8vgh"
Mar 3 12:43:59.010462 containerd[1519]: time="2026-03-03T12:43:59.010321636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54c8d886f7-nt25q,Uid:b38cc3bb-6b73-485a-bd63-ac1721ba34f5,Namespace:calico-system,Attempt:0,}"
Mar 3 12:43:59.021928 kubelet[2772]: E0303 12:43:59.021852 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:43:59.040551 containerd[1519]: time="2026-03-03T12:43:59.040320156Z" level=info msg="connecting to shim aa8d49333254af3f178ea9af8f6b552cdaef516a0e3c4306fe75a5cc440b0a9b" address="unix:///run/containerd/s/84d0aa6e23e34b7cc3f104626fa62bf1bdcc2e91d9086f330a0ee628b7bab391" namespace=k8s.io protocol=ttrpc version=3
Mar 3 12:43:59.083734 systemd[1]: Started cri-containerd-aa8d49333254af3f178ea9af8f6b552cdaef516a0e3c4306fe75a5cc440b0a9b.scope - libcontainer container aa8d49333254af3f178ea9af8f6b552cdaef516a0e3c4306fe75a5cc440b0a9b.
Mar 3 12:43:59.104384 kubelet[2772]: I0303 12:43:59.104330 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af4d9ea5-01b2-4705-ac48-8ebf47211342-kubelet-dir\") pod \"csi-node-driver-5rc66\" (UID: \"af4d9ea5-01b2-4705-ac48-8ebf47211342\") " pod="calico-system/csi-node-driver-5rc66"
Mar 3 12:43:59.105134 kubelet[2772]: I0303 12:43:59.104755 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnkr\" (UniqueName: \"kubernetes.io/projected/af4d9ea5-01b2-4705-ac48-8ebf47211342-kube-api-access-nbnkr\") pod \"csi-node-driver-5rc66\" (UID: \"af4d9ea5-01b2-4705-ac48-8ebf47211342\") " pod="calico-system/csi-node-driver-5rc66"
Mar 3 12:43:59.105134 kubelet[2772]: I0303 12:43:59.104814 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af4d9ea5-01b2-4705-ac48-8ebf47211342-registration-dir\") pod \"csi-node-driver-5rc66\" (UID: \"af4d9ea5-01b2-4705-ac48-8ebf47211342\") " pod="calico-system/csi-node-driver-5rc66"
Mar 3 12:43:59.105134 kubelet[2772]: I0303 12:43:59.104849 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af4d9ea5-01b2-4705-ac48-8ebf47211342-socket-dir\") pod \"csi-node-driver-5rc66\" (UID: \"af4d9ea5-01b2-4705-ac48-8ebf47211342\") " pod="calico-system/csi-node-driver-5rc66"
Mar 3 12:43:59.105134 kubelet[2772]: I0303 12:43:59.104953 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/af4d9ea5-01b2-4705-ac48-8ebf47211342-varrun\") pod \"csi-node-driver-5rc66\" (UID: \"af4d9ea5-01b2-4705-ac48-8ebf47211342\") " pod="calico-system/csi-node-driver-5rc66"
Mar 3 12:43:59.112137 kubelet[2772]: E0303 12:43:59.110322 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 12:43:59.112680 kubelet[2772]: W0303 12:43:59.112617 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 12:43:59.112680 kubelet[2772]: E0303 12:43:59.112649 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 12:43:59.115302 kubelet[2772]: E0303 12:43:59.115261 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 12:43:59.115302 kubelet[2772]: W0303 12:43:59.115281 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 12:43:59.115460 kubelet[2772]: E0303 12:43:59.115423 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 12:43:59.138538 kubelet[2772]: E0303 12:43:59.138446 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 12:43:59.138538 kubelet[2772]: W0303 12:43:59.138475 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 12:43:59.138538 kubelet[2772]: E0303 12:43:59.138495 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 12:43:59.170497 containerd[1519]: time="2026-03-03T12:43:59.170412454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54c8d886f7-nt25q,Uid:b38cc3bb-6b73-485a-bd63-ac1721ba34f5,Namespace:calico-system,Attempt:0,} returns sandbox id \"aa8d49333254af3f178ea9af8f6b552cdaef516a0e3c4306fe75a5cc440b0a9b\""
Mar 3 12:43:59.175139 containerd[1519]: time="2026-03-03T12:43:59.174037662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 3 12:43:59.205986 kubelet[2772]: E0303 12:43:59.205941 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 12:43:59.206217 kubelet[2772]: W0303 12:43:59.206019 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 12:43:59.206217 kubelet[2772]: E0303 12:43:59.206061 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 12:43:59.206573 kubelet[2772]: E0303 12:43:59.206551 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 12:43:59.206573 kubelet[2772]: W0303 12:43:59.206572 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 12:43:59.206702 kubelet[2772]: E0303 12:43:59.206592 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 3 12:43:59.231083 containerd[1519]: time="2026-03-03T12:43:59.230964325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-d8vgh,Uid:b92cc6ee-deb8-4e84-ae35-6d71a47771dc,Namespace:calico-system,Attempt:0,}"
Mar 3 12:43:59.256476 containerd[1519]: time="2026-03-03T12:43:59.254268901Z" level=info msg="connecting to shim 35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b" address="unix:///run/containerd/s/b4b67af332636bf1a36eb6c6ba2015a0677a48706ffc21384ccea5e62fcf3dc9" namespace=k8s.io protocol=ttrpc version=3
Mar 3 12:43:59.277363 systemd[1]: Started cri-containerd-35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b.scope - libcontainer container 35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b.
Mar 3 12:43:59.306709 containerd[1519]: time="2026-03-03T12:43:59.306652745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-d8vgh,Uid:b92cc6ee-deb8-4e84-ae35-6d71a47771dc,Namespace:calico-system,Attempt:0,} returns sandbox id \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\""
Mar 3 12:44:00.811814 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2839597162.mount: Deactivated successfully.
Mar 3 12:44:00.972273 kubelet[2772]: E0303 12:44:00.972195 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:01.424886 containerd[1519]: time="2026-03-03T12:44:01.424824701Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:01.426149 containerd[1519]: time="2026-03-03T12:44:01.426082971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174"
Mar 3 12:44:01.427829 containerd[1519]: time="2026-03-03T12:44:01.426897457Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:01.429671 containerd[1519]: time="2026-03-03T12:44:01.429615089Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:01.430376 containerd[1519]: time="2026-03-03T12:44:01.430236163Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.254873025s"
Mar 3 12:44:01.430376 containerd[1519]: time="2026-03-03T12:44:01.430275326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\""
Mar 3 12:44:01.432884 containerd[1519]: time="2026-03-03T12:44:01.432836029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 3 12:44:01.452828 containerd[1519]: time="2026-03-03T12:44:01.452635255Z" level=info msg="CreateContainer within sandbox \"aa8d49333254af3f178ea9af8f6b552cdaef516a0e3c4306fe75a5cc440b0a9b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 3 12:44:01.466404 containerd[1519]: time="2026-03-03T12:44:01.466173612Z" level=info msg="Container 0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0: CDI devices from CRI Config.CDIDevices: []"
Mar 3 12:44:01.479399 containerd[1519]: time="2026-03-03T12:44:01.479343388Z" level=info msg="CreateContainer within sandbox \"aa8d49333254af3f178ea9af8f6b552cdaef516a0e3c4306fe75a5cc440b0a9b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0\""
Mar 3 12:44:01.487283 containerd[1519]: time="2026-03-03T12:44:01.487246150Z" level=info msg="StartContainer for \"0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0\""
Mar 3 12:44:01.488858 containerd[1519]: time="2026-03-03T12:44:01.488824158Z" level=info msg="connecting to shim 0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0" address="unix:///run/containerd/s/84d0aa6e23e34b7cc3f104626fa62bf1bdcc2e91d9086f330a0ee628b7bab391" protocol=ttrpc version=3
Mar 3 12:44:01.512357 systemd[1]: Started cri-containerd-0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0.scope - libcontainer container 0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0.
Mar 3 12:44:01.564136 containerd[1519]: time="2026-03-03T12:44:01.563985839Z" level=info msg="StartContainer for \"0d9ea51197a1614c146509e096f1295ffb3f7c5523813e7593301ae00eefcfd0\" returns successfully" Mar 3 12:44:02.089333 kubelet[2772]: I0303 12:44:02.088571 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54c8d886f7-nt25q" podStartSLOduration=1.830117191 podStartE2EDuration="4.088552578s" podCreationTimestamp="2026-03-03 12:43:58 +0000 UTC" firstStartedPulling="2026-03-03 12:43:59.173602197 +0000 UTC m=+22.338535396" lastFinishedPulling="2026-03-03 12:44:01.432037544 +0000 UTC m=+24.596970783" observedRunningTime="2026-03-03 12:44:02.088010228 +0000 UTC m=+25.252943387" watchObservedRunningTime="2026-03-03 12:44:02.088552578 +0000 UTC m=+25.253485737" Mar 3 12:44:02.101345 kubelet[2772]: E0303 12:44:02.101146 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:02.101345 kubelet[2772]: W0303 12:44:02.101180 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:02.101345 kubelet[2772]: E0303 12:44:02.101228 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:02.973153 kubelet[2772]: E0303 12:44:02.972609 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342" Mar 3 12:44:03.077328 kubelet[2772]: I0303 12:44:03.077272 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 12:44:03.111658 containerd[1519]: time="2026-03-03T12:44:03.111133514Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:03.112881 containerd[1519]: time="2026-03-03T12:44:03.112810725Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 3 12:44:03.113741 containerd[1519]: time="2026-03-03T12:44:03.113672092Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:03.116360 containerd[1519]: time="2026-03-03T12:44:03.116315517Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:03.117424 containerd[1519]: time="2026-03-03T12:44:03.117372734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size
\"5855167\" in 1.684500744s" Mar 3 12:44:03.117424 containerd[1519]: time="2026-03-03T12:44:03.117411737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 3 12:44:03.118807 kubelet[2772]: E0303 12:44:03.118691 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.118807 kubelet[2772]: W0303 12:44:03.118717 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.118807 kubelet[2772]: E0303 12:44:03.118740 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.120371 kubelet[2772]: E0303 12:44:03.119710 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.120371 kubelet[2772]: W0303 12:44:03.119730 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.120371 kubelet[2772]: E0303 12:44:03.119792 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.127713 containerd[1519]: time="2026-03-03T12:44:03.125692909Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 3 12:44:03.127818 kubelet[2772]: E0303 12:44:03.127576 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.127818 kubelet[2772]: W0303 12:44:03.127585 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.127818 kubelet[2772]: E0303 12:44:03.127594 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 3 12:44:03.128102 kubelet[2772]: E0303 12:44:03.127997 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.128102 kubelet[2772]: W0303 12:44:03.128011 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.128102 kubelet[2772]: E0303 12:44:03.128022 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.129056 kubelet[2772]: E0303 12:44:03.129037 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.129160 kubelet[2772]: W0303 12:44:03.129145 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.129217 kubelet[2772]: E0303 12:44:03.129205 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.129506 kubelet[2772]: E0303 12:44:03.129492 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.129597 kubelet[2772]: W0303 12:44:03.129584 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.129658 kubelet[2772]: E0303 12:44:03.129647 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.137523 containerd[1519]: time="2026-03-03T12:44:03.137464912Z" level=info msg="Container 8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:03.143076 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3856908020.mount: Deactivated successfully. Mar 3 12:44:03.144312 kubelet[2772]: E0303 12:44:03.143345 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.144312 kubelet[2772]: W0303 12:44:03.143366 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.144312 kubelet[2772]: E0303 12:44:03.143385 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.144590 kubelet[2772]: E0303 12:44:03.144570 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.144590 kubelet[2772]: W0303 12:44:03.144587 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.144669 kubelet[2772]: E0303 12:44:03.144602 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.144927 kubelet[2772]: E0303 12:44:03.144911 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.144927 kubelet[2772]: W0303 12:44:03.144925 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.145333 kubelet[2772]: E0303 12:44:03.145306 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.146343 kubelet[2772]: E0303 12:44:03.146322 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.146535 kubelet[2772]: W0303 12:44:03.146516 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.146607 kubelet[2772]: E0303 12:44:03.146538 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.147131 kubelet[2772]: E0303 12:44:03.147088 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.147199 kubelet[2772]: W0303 12:44:03.147182 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.147230 kubelet[2772]: E0303 12:44:03.147201 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.147450 kubelet[2772]: E0303 12:44:03.147435 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.147450 kubelet[2772]: W0303 12:44:03.147449 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.147511 kubelet[2772]: E0303 12:44:03.147459 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.147831 kubelet[2772]: E0303 12:44:03.147809 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.147831 kubelet[2772]: W0303 12:44:03.147829 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.147898 kubelet[2772]: E0303 12:44:03.147841 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.148082 kubelet[2772]: E0303 12:44:03.148069 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.148182 kubelet[2772]: W0303 12:44:03.148081 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.148218 kubelet[2772]: E0303 12:44:03.148188 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.148470 kubelet[2772]: E0303 12:44:03.148455 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.148470 kubelet[2772]: W0303 12:44:03.148469 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.148523 kubelet[2772]: E0303 12:44:03.148482 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.148914 kubelet[2772]: E0303 12:44:03.148896 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.148914 kubelet[2772]: W0303 12:44:03.148913 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.148984 kubelet[2772]: E0303 12:44:03.148924 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.149391 kubelet[2772]: E0303 12:44:03.149375 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.149391 kubelet[2772]: W0303 12:44:03.149389 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.149481 kubelet[2772]: E0303 12:44:03.149419 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.149692 kubelet[2772]: E0303 12:44:03.149676 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.149692 kubelet[2772]: W0303 12:44:03.149690 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.149754 kubelet[2772]: E0303 12:44:03.149700 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.150019 kubelet[2772]: E0303 12:44:03.150004 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.150019 kubelet[2772]: W0303 12:44:03.150018 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.150072 kubelet[2772]: E0303 12:44:03.150028 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.151206 kubelet[2772]: E0303 12:44:03.151179 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.151206 kubelet[2772]: W0303 12:44:03.151199 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.151805 kubelet[2772]: E0303 12:44:03.151765 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.152573 containerd[1519]: time="2026-03-03T12:44:03.152385767Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee\"" Mar 3 12:44:03.153236 containerd[1519]: time="2026-03-03T12:44:03.153206852Z" level=info msg="StartContainer for \"8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee\"" Mar 3 12:44:03.154362 kubelet[2772]: E0303 12:44:03.154340 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.154475 kubelet[2772]: W0303 12:44:03.154447 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.154475 kubelet[2772]: E0303 12:44:03.154468 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.155350 kubelet[2772]: E0303 12:44:03.155328 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.155350 kubelet[2772]: W0303 12:44:03.155347 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.155705 containerd[1519]: time="2026-03-03T12:44:03.155605063Z" level=info msg="connecting to shim 8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee" address="unix:///run/containerd/s/b4b67af332636bf1a36eb6c6ba2015a0677a48706ffc21384ccea5e62fcf3dc9" protocol=ttrpc version=3 Mar 3 12:44:03.155755 kubelet[2772]: E0303 12:44:03.155676 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.156961 kubelet[2772]: E0303 12:44:03.156592 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.157138 kubelet[2772]: W0303 12:44:03.157044 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.157851 kubelet[2772]: E0303 12:44:03.157300 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 12:44:03.158249 kubelet[2772]: E0303 12:44:03.158225 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 12:44:03.158249 kubelet[2772]: W0303 12:44:03.158242 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 12:44:03.158328 kubelet[2772]: E0303 12:44:03.158267 2772 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 12:44:03.183390 systemd[1]: Started cri-containerd-8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee.scope - libcontainer container 8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee. Mar 3 12:44:03.256877 containerd[1519]: time="2026-03-03T12:44:03.256100032Z" level=info msg="StartContainer for \"8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee\" returns successfully" Mar 3 12:44:03.353195 systemd[1]: cri-containerd-8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee.scope: Deactivated successfully. Mar 3 12:44:03.358962 containerd[1519]: time="2026-03-03T12:44:03.358864726Z" level=info msg="received container exit event container_id:\"8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee\" id:\"8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee\" pid:3436 exited_at:{seconds:1772541843 nanos:358502066}" Mar 3 12:44:03.383561 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8914898675d0270fbee8d486500d223f0ac196b6ea1fa826dce95ea6ed8dddee-rootfs.mount: Deactivated successfully. 
Mar 3 12:44:04.086476 containerd[1519]: time="2026-03-03T12:44:04.086284131Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Mar 3 12:44:04.973193 kubelet[2772]: E0303 12:44:04.972128 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:06.972697 kubelet[2772]: E0303 12:44:06.972386 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:08.971817 kubelet[2772]: E0303 12:44:08.971770 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:10.599846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount407175427.mount: Deactivated successfully.
Mar 3 12:44:10.629241 containerd[1519]: time="2026-03-03T12:44:10.629175325Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:10.630528 containerd[1519]: time="2026-03-03T12:44:10.630260726Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674"
Mar 3 12:44:10.631406 containerd[1519]: time="2026-03-03T12:44:10.631348248Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:10.633816 containerd[1519]: time="2026-03-03T12:44:10.633728290Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:10.634779 containerd[1519]: time="2026-03-03T12:44:10.634718451Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 6.548267432s"
Mar 3 12:44:10.634779 containerd[1519]: time="2026-03-03T12:44:10.634772451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\""
Mar 3 12:44:10.639606 containerd[1519]: time="2026-03-03T12:44:10.639560736Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 3 12:44:10.653032 containerd[1519]: time="2026-03-03T12:44:10.652931549Z" level=info msg="Container 9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be: CDI devices from CRI Config.CDIDevices: []"
Mar 3 12:44:10.654903 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1946050950.mount: Deactivated successfully.
Mar 3 12:44:10.663905 containerd[1519]: time="2026-03-03T12:44:10.663838160Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be\""
Mar 3 12:44:10.664722 containerd[1519]: time="2026-03-03T12:44:10.664672481Z" level=info msg="StartContainer for \"9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be\""
Mar 3 12:44:10.673629 containerd[1519]: time="2026-03-03T12:44:10.673553730Z" level=info msg="connecting to shim 9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be" address="unix:///run/containerd/s/b4b67af332636bf1a36eb6c6ba2015a0677a48706ffc21384ccea5e62fcf3dc9" protocol=ttrpc version=3
Mar 3 12:44:10.703444 systemd[1]: Started cri-containerd-9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be.scope - libcontainer container 9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be.
Mar 3 12:44:10.802139 containerd[1519]: time="2026-03-03T12:44:10.802023300Z" level=info msg="StartContainer for \"9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be\" returns successfully"
Mar 3 12:44:10.912475 systemd[1]: cri-containerd-9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be.scope: Deactivated successfully.
Mar 3 12:44:10.917061 containerd[1519]: time="2026-03-03T12:44:10.916999137Z" level=info msg="received container exit event container_id:\"9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be\" id:\"9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be\" pid:3491 exited_at:{seconds:1772541850 nanos:916066576}"
Mar 3 12:44:10.972977 kubelet[2772]: E0303 12:44:10.972852 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:11.106759 containerd[1519]: time="2026-03-03T12:44:11.106624539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 3 12:44:11.599948 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9bc466215f6a713b90e9f000a51df1e75aed07b002cc131bc14349dec6a088be-rootfs.mount: Deactivated successfully.
Mar 3 12:44:12.973434 kubelet[2772]: E0303 12:44:12.972779 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:14.873557 containerd[1519]: time="2026-03-03T12:44:14.873472300Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:14.876009 containerd[1519]: time="2026-03-03T12:44:14.875953754Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216"
Mar 3 12:44:14.876996 containerd[1519]: time="2026-03-03T12:44:14.876942760Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:14.880336 containerd[1519]: time="2026-03-03T12:44:14.880277139Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 12:44:14.881119 containerd[1519]: time="2026-03-03T12:44:14.881047463Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 3.774341044s"
Mar 3 12:44:14.881119 containerd[1519]: time="2026-03-03T12:44:14.881085784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\""
Mar 3 12:44:14.887878 containerd[1519]: time="2026-03-03T12:44:14.887826982Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 3 12:44:14.901475 containerd[1519]: time="2026-03-03T12:44:14.900974577Z" level=info msg="Container 86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576: CDI devices from CRI Config.CDIDevices: []"
Mar 3 12:44:14.914979 containerd[1519]: time="2026-03-03T12:44:14.914863417Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576\""
Mar 3 12:44:14.916550 containerd[1519]: time="2026-03-03T12:44:14.916384586Z" level=info msg="StartContainer for \"86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576\""
Mar 3 12:44:14.919712 containerd[1519]: time="2026-03-03T12:44:14.919679804Z" level=info msg="connecting to shim 86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576" address="unix:///run/containerd/s/b4b67af332636bf1a36eb6c6ba2015a0677a48706ffc21384ccea5e62fcf3dc9" protocol=ttrpc version=3
Mar 3 12:44:14.943348 systemd[1]: Started cri-containerd-86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576.scope - libcontainer container 86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576.
Mar 3 12:44:14.974048 kubelet[2772]: E0303 12:44:14.974000 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rc66" podUID="af4d9ea5-01b2-4705-ac48-8ebf47211342"
Mar 3 12:44:15.025729 containerd[1519]: time="2026-03-03T12:44:15.025693078Z" level=info msg="StartContainer for \"86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576\" returns successfully"
Mar 3 12:44:15.542739 containerd[1519]: time="2026-03-03T12:44:15.542690518Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 3 12:44:15.546175 systemd[1]: cri-containerd-86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576.scope: Deactivated successfully.
Mar 3 12:44:15.546532 systemd[1]: cri-containerd-86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576.scope: Consumed 500ms CPU time, 194.5M memory peak, 171.3M written to disk.
Mar 3 12:44:15.547845 containerd[1519]: time="2026-03-03T12:44:15.547607752Z" level=info msg="received container exit event container_id:\"86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576\" id:\"86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576\" pid:3551 exited_at:{seconds:1772541855 nanos:547259309}"
Mar 3 12:44:15.573590 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-86ea5e362c2515c97974c5ce30048541ba0e91dbb3e4baadf302aaeac400b576-rootfs.mount: Deactivated successfully.
Mar 3 12:44:15.642967 kubelet[2772]: I0303 12:44:15.642907 2772 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Mar 3 12:44:15.699273 systemd[1]: Created slice kubepods-burstable-pod3a3236b5_09aa_4b76_9ecf_67f6d695afd2.slice - libcontainer container kubepods-burstable-pod3a3236b5_09aa_4b76_9ecf_67f6d695afd2.slice.
Mar 3 12:44:15.717381 systemd[1]: Created slice kubepods-burstable-podd6331020_20e1_4302_b0cb_7c30f36149f0.slice - libcontainer container kubepods-burstable-podd6331020_20e1_4302_b0cb_7c30f36149f0.slice.
Mar 3 12:44:15.730544 systemd[1]: Created slice kubepods-besteffort-podeccd9c30_d55c_4dea_9b0c_0692a8266820.slice - libcontainer container kubepods-besteffort-podeccd9c30_d55c_4dea_9b0c_0692a8266820.slice.
Mar 3 12:44:15.743455 kubelet[2772]: I0303 12:44:15.743401 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8fj\" (UniqueName: \"kubernetes.io/projected/3a3236b5-09aa-4b76-9ecf-67f6d695afd2-kube-api-access-9b8fj\") pod \"coredns-66bc5c9577-k647t\" (UID: \"3a3236b5-09aa-4b76-9ecf-67f6d695afd2\") " pod="kube-system/coredns-66bc5c9577-k647t"
Mar 3 12:44:15.743584 kubelet[2772]: I0303 12:44:15.743492 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a3236b5-09aa-4b76-9ecf-67f6d695afd2-config-volume\") pod \"coredns-66bc5c9577-k647t\" (UID: \"3a3236b5-09aa-4b76-9ecf-67f6d695afd2\") " pod="kube-system/coredns-66bc5c9577-k647t"
Mar 3 12:44:15.743584 kubelet[2772]: I0303 12:44:15.743518 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc26s\" (UniqueName: \"kubernetes.io/projected/d6331020-20e1-4302-b0cb-7c30f36149f0-kube-api-access-xc26s\") pod \"coredns-66bc5c9577-xhfv4\" (UID: \"d6331020-20e1-4302-b0cb-7c30f36149f0\") " pod="kube-system/coredns-66bc5c9577-xhfv4"
Mar 3 12:44:15.743584 kubelet[2772]: I0303 12:44:15.743541 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/eccd9c30-d55c-4dea-9b0c-0692a8266820-calico-apiserver-certs\") pod \"calico-apiserver-f8cd6759c-wdvrv\" (UID: \"eccd9c30-d55c-4dea-9b0c-0692a8266820\") " pod="calico-system/calico-apiserver-f8cd6759c-wdvrv"
Mar 3 12:44:15.743584 kubelet[2772]: I0303 12:44:15.743566 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djww\" (UniqueName: \"kubernetes.io/projected/eccd9c30-d55c-4dea-9b0c-0692a8266820-kube-api-access-4djww\") pod \"calico-apiserver-f8cd6759c-wdvrv\" (UID: \"eccd9c30-d55c-4dea-9b0c-0692a8266820\") " pod="calico-system/calico-apiserver-f8cd6759c-wdvrv"
Mar 3 12:44:15.743689 kubelet[2772]: I0303 12:44:15.743586 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhj7z\" (UniqueName: \"kubernetes.io/projected/5babebd3-6e97-412e-b6a1-5908db32c25f-kube-api-access-qhj7z\") pod \"calico-kube-controllers-569b58c775-xc222\" (UID: \"5babebd3-6e97-412e-b6a1-5908db32c25f\") " pod="calico-system/calico-kube-controllers-569b58c775-xc222"
Mar 3 12:44:15.743689 kubelet[2772]: I0303 12:44:15.743608 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5babebd3-6e97-412e-b6a1-5908db32c25f-tigera-ca-bundle\") pod \"calico-kube-controllers-569b58c775-xc222\" (UID: \"5babebd3-6e97-412e-b6a1-5908db32c25f\") " pod="calico-system/calico-kube-controllers-569b58c775-xc222"
Mar 3 12:44:15.743689 kubelet[2772]: I0303 12:44:15.743641 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-nginx-config\") pod \"whisker-5885656486-zr9s9\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " pod="calico-system/whisker-5885656486-zr9s9"
Mar 3 12:44:15.743689 kubelet[2772]: I0303 12:44:15.743665 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-backend-key-pair\") pod \"whisker-5885656486-zr9s9\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " pod="calico-system/whisker-5885656486-zr9s9"
Mar 3 12:44:15.743777 kubelet[2772]: I0303 12:44:15.743693 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6331020-20e1-4302-b0cb-7c30f36149f0-config-volume\") pod \"coredns-66bc5c9577-xhfv4\" (UID: \"d6331020-20e1-4302-b0cb-7c30f36149f0\") " pod="kube-system/coredns-66bc5c9577-xhfv4"
Mar 3 12:44:15.743777 kubelet[2772]: I0303 12:44:15.743712 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9dvl\" (UniqueName: \"kubernetes.io/projected/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-kube-api-access-c9dvl\") pod \"whisker-5885656486-zr9s9\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " pod="calico-system/whisker-5885656486-zr9s9"
Mar 3 12:44:15.743777 kubelet[2772]: I0303 12:44:15.743737 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-ca-bundle\") pod \"whisker-5885656486-zr9s9\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " pod="calico-system/whisker-5885656486-zr9s9"
Mar 3 12:44:15.745716 systemd[1]: Created slice kubepods-besteffort-pod5babebd3_6e97_412e_b6a1_5908db32c25f.slice - libcontainer container kubepods-besteffort-pod5babebd3_6e97_412e_b6a1_5908db32c25f.slice.
Mar 3 12:44:15.764062 systemd[1]: Created slice kubepods-besteffort-podebcf1a5f_a1af_487c_91d1_62e2a1c97f7e.slice - libcontainer container kubepods-besteffort-podebcf1a5f_a1af_487c_91d1_62e2a1c97f7e.slice. Mar 3 12:44:15.773091 systemd[1]: Created slice kubepods-besteffort-pod5676c205_9bdf_4ef6_9fc8_1d7ffb5e1843.slice - libcontainer container kubepods-besteffort-pod5676c205_9bdf_4ef6_9fc8_1d7ffb5e1843.slice. Mar 3 12:44:15.781845 systemd[1]: Created slice kubepods-besteffort-pod3115d2ff_25aa_4838_aef5_faf4076ac816.slice - libcontainer container kubepods-besteffort-pod3115d2ff_25aa_4838_aef5_faf4076ac816.slice. Mar 3 12:44:15.845008 kubelet[2772]: I0303 12:44:15.844734 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3115d2ff-25aa-4838-aef5-faf4076ac816-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-ldzpx\" (UID: \"3115d2ff-25aa-4838-aef5-faf4076ac816\") " pod="calico-system/goldmane-cccfbd5cf-ldzpx" Mar 3 12:44:15.845008 kubelet[2772]: I0303 12:44:15.844785 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3115d2ff-25aa-4838-aef5-faf4076ac816-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-ldzpx\" (UID: \"3115d2ff-25aa-4838-aef5-faf4076ac816\") " pod="calico-system/goldmane-cccfbd5cf-ldzpx" Mar 3 12:44:15.845008 kubelet[2772]: I0303 12:44:15.844970 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3115d2ff-25aa-4838-aef5-faf4076ac816-config\") pod \"goldmane-cccfbd5cf-ldzpx\" (UID: \"3115d2ff-25aa-4838-aef5-faf4076ac816\") " pod="calico-system/goldmane-cccfbd5cf-ldzpx" Mar 3 12:44:15.845008 kubelet[2772]: I0303 12:44:15.844996 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89hc5\" 
(UniqueName: \"kubernetes.io/projected/3115d2ff-25aa-4838-aef5-faf4076ac816-kube-api-access-89hc5\") pod \"goldmane-cccfbd5cf-ldzpx\" (UID: \"3115d2ff-25aa-4838-aef5-faf4076ac816\") " pod="calico-system/goldmane-cccfbd5cf-ldzpx" Mar 3 12:44:15.845410 kubelet[2772]: I0303 12:44:15.845015 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843-calico-apiserver-certs\") pod \"calico-apiserver-f8cd6759c-9wqps\" (UID: \"5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843\") " pod="calico-system/calico-apiserver-f8cd6759c-9wqps" Mar 3 12:44:15.845410 kubelet[2772]: I0303 12:44:15.845033 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fw29\" (UniqueName: \"kubernetes.io/projected/5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843-kube-api-access-2fw29\") pod \"calico-apiserver-f8cd6759c-9wqps\" (UID: \"5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843\") " pod="calico-system/calico-apiserver-f8cd6759c-9wqps" Mar 3 12:44:16.013207 containerd[1519]: time="2026-03-03T12:44:16.013086974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-k647t,Uid:3a3236b5-09aa-4b76-9ecf-67f6d695afd2,Namespace:kube-system,Attempt:0,}" Mar 3 12:44:16.026244 containerd[1519]: time="2026-03-03T12:44:16.025954155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xhfv4,Uid:d6331020-20e1-4302-b0cb-7c30f36149f0,Namespace:kube-system,Attempt:0,}" Mar 3 12:44:16.050333 containerd[1519]: time="2026-03-03T12:44:16.049770462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-wdvrv,Uid:eccd9c30-d55c-4dea-9b0c-0692a8266820,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:16.060576 containerd[1519]: time="2026-03-03T12:44:16.060428666Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-569b58c775-xc222,Uid:5babebd3-6e97-412e-b6a1-5908db32c25f,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:16.073324 containerd[1519]: time="2026-03-03T12:44:16.073279967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5885656486-zr9s9,Uid:ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:16.083028 containerd[1519]: time="2026-03-03T12:44:16.082939923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-9wqps,Uid:5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:16.095998 containerd[1519]: time="2026-03-03T12:44:16.095881265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ldzpx,Uid:3115d2ff-25aa-4838-aef5-faf4076ac816,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:16.162793 containerd[1519]: time="2026-03-03T12:44:16.162741230Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 3 12:44:16.199641 containerd[1519]: time="2026-03-03T12:44:16.199516800Z" level=info msg="Container ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:16.248192 containerd[1519]: time="2026-03-03T12:44:16.248144622Z" level=error msg="Failed to destroy network for sandbox \"701a256e49dd4c78cd20133f045c3f76d40a01a6dcff7131c74c0116a48dad7a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.248915 containerd[1519]: time="2026-03-03T12:44:16.248808227Z" level=error msg="Failed to destroy network for sandbox \"ef7ea60f2aeb0f531282be0a590de7aa88ce6a28c1dab1450670acb39d981183\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.251860 containerd[1519]: time="2026-03-03T12:44:16.251814091Z" level=error msg="Failed to destroy network for sandbox \"de82cdc9b13fa11fa78b90af8058210edfb79aef66d94c9c3553297d5e4c9f18\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.252542 containerd[1519]: time="2026-03-03T12:44:16.252319375Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-wdvrv,Uid:eccd9c30-d55c-4dea-9b0c-0692a8266820,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"701a256e49dd4c78cd20133f045c3f76d40a01a6dcff7131c74c0116a48dad7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.252920 kubelet[2772]: E0303 12:44:16.252885 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"701a256e49dd4c78cd20133f045c3f76d40a01a6dcff7131c74c0116a48dad7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.253710 kubelet[2772]: E0303 12:44:16.253332 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"701a256e49dd4c78cd20133f045c3f76d40a01a6dcff7131c74c0116a48dad7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-f8cd6759c-wdvrv" Mar 3 12:44:16.253710 kubelet[2772]: E0303 12:44:16.253372 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"701a256e49dd4c78cd20133f045c3f76d40a01a6dcff7131c74c0116a48dad7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-f8cd6759c-wdvrv" Mar 3 12:44:16.253710 kubelet[2772]: E0303 12:44:16.253458 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f8cd6759c-wdvrv_calico-system(eccd9c30-d55c-4dea-9b0c-0692a8266820)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f8cd6759c-wdvrv_calico-system(eccd9c30-d55c-4dea-9b0c-0692a8266820)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"701a256e49dd4c78cd20133f045c3f76d40a01a6dcff7131c74c0116a48dad7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-f8cd6759c-wdvrv" podUID="eccd9c30-d55c-4dea-9b0c-0692a8266820" Mar 3 12:44:16.254701 containerd[1519]: time="2026-03-03T12:44:16.254663513Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-k647t,Uid:3a3236b5-09aa-4b76-9ecf-67f6d695afd2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef7ea60f2aeb0f531282be0a590de7aa88ce6a28c1dab1450670acb39d981183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.257518 kubelet[2772]: E0303 
12:44:16.257427 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef7ea60f2aeb0f531282be0a590de7aa88ce6a28c1dab1450670acb39d981183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.258308 kubelet[2772]: E0303 12:44:16.258032 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef7ea60f2aeb0f531282be0a590de7aa88ce6a28c1dab1450670acb39d981183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-k647t" Mar 3 12:44:16.258308 kubelet[2772]: E0303 12:44:16.258064 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef7ea60f2aeb0f531282be0a590de7aa88ce6a28c1dab1450670acb39d981183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-k647t" Mar 3 12:44:16.258308 kubelet[2772]: E0303 12:44:16.258243 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-k647t_kube-system(3a3236b5-09aa-4b76-9ecf-67f6d695afd2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-k647t_kube-system(3a3236b5-09aa-4b76-9ecf-67f6d695afd2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ef7ea60f2aeb0f531282be0a590de7aa88ce6a28c1dab1450670acb39d981183\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-k647t" podUID="3a3236b5-09aa-4b76-9ecf-67f6d695afd2" Mar 3 12:44:16.260507 containerd[1519]: time="2026-03-03T12:44:16.260167196Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xhfv4,Uid:d6331020-20e1-4302-b0cb-7c30f36149f0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"de82cdc9b13fa11fa78b90af8058210edfb79aef66d94c9c3553297d5e4c9f18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.260795 containerd[1519]: time="2026-03-03T12:44:16.259954795Z" level=info msg="CreateContainer within sandbox \"35f395acba69ae2ca3bd02988cebf88b6b24e22f72320c07240518540504e31b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073\"" Mar 3 12:44:16.261130 kubelet[2772]: E0303 12:44:16.260431 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de82cdc9b13fa11fa78b90af8058210edfb79aef66d94c9c3553297d5e4c9f18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.261130 kubelet[2772]: E0303 12:44:16.260615 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de82cdc9b13fa11fa78b90af8058210edfb79aef66d94c9c3553297d5e4c9f18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-66bc5c9577-xhfv4" Mar 3 12:44:16.261130 kubelet[2772]: E0303 12:44:16.260651 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de82cdc9b13fa11fa78b90af8058210edfb79aef66d94c9c3553297d5e4c9f18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-xhfv4" Mar 3 12:44:16.261759 kubelet[2772]: E0303 12:44:16.260712 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-xhfv4_kube-system(d6331020-20e1-4302-b0cb-7c30f36149f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-xhfv4_kube-system(d6331020-20e1-4302-b0cb-7c30f36149f0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de82cdc9b13fa11fa78b90af8058210edfb79aef66d94c9c3553297d5e4c9f18\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-xhfv4" podUID="d6331020-20e1-4302-b0cb-7c30f36149f0" Mar 3 12:44:16.265487 containerd[1519]: time="2026-03-03T12:44:16.265368357Z" level=info msg="StartContainer for \"ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073\"" Mar 3 12:44:16.275653 containerd[1519]: time="2026-03-03T12:44:16.275066034Z" level=info msg="connecting to shim ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073" address="unix:///run/containerd/s/b4b67af332636bf1a36eb6c6ba2015a0677a48706ffc21384ccea5e62fcf3dc9" protocol=ttrpc version=3 Mar 3 12:44:16.326327 systemd[1]: Started cri-containerd-ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073.scope - libcontainer container 
ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073. Mar 3 12:44:16.327791 containerd[1519]: time="2026-03-03T12:44:16.327748648Z" level=error msg="Failed to destroy network for sandbox \"bf21d4bb4e4df50b6e8b21cfb70e2d591e9575b7bd12cbbc4f0d27571ae36b9b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.330667 containerd[1519]: time="2026-03-03T12:44:16.330609990Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-569b58c775-xc222,Uid:5babebd3-6e97-412e-b6a1-5908db32c25f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf21d4bb4e4df50b6e8b21cfb70e2d591e9575b7bd12cbbc4f0d27571ae36b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.331075 kubelet[2772]: E0303 12:44:16.330997 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf21d4bb4e4df50b6e8b21cfb70e2d591e9575b7bd12cbbc4f0d27571ae36b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.331075 kubelet[2772]: E0303 12:44:16.331062 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf21d4bb4e4df50b6e8b21cfb70e2d591e9575b7bd12cbbc4f0d27571ae36b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-569b58c775-xc222" 
Mar 3 12:44:16.331272 kubelet[2772]: E0303 12:44:16.331087 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf21d4bb4e4df50b6e8b21cfb70e2d591e9575b7bd12cbbc4f0d27571ae36b9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-569b58c775-xc222" Mar 3 12:44:16.331934 kubelet[2772]: E0303 12:44:16.331573 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-569b58c775-xc222_calico-system(5babebd3-6e97-412e-b6a1-5908db32c25f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-569b58c775-xc222_calico-system(5babebd3-6e97-412e-b6a1-5908db32c25f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf21d4bb4e4df50b6e8b21cfb70e2d591e9575b7bd12cbbc4f0d27571ae36b9b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-569b58c775-xc222" podUID="5babebd3-6e97-412e-b6a1-5908db32c25f" Mar 3 12:44:16.338251 containerd[1519]: time="2026-03-03T12:44:16.338155010Z" level=error msg="Failed to destroy network for sandbox \"a55a641e6c54189b271eb4da40251d2d0669ccfc307e445794e068466f0aa7ff\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.342406 containerd[1519]: time="2026-03-03T12:44:16.342233282Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5885656486-zr9s9,Uid:ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e,Namespace:calico-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55a641e6c54189b271eb4da40251d2d0669ccfc307e445794e068466f0aa7ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.342875 kubelet[2772]: E0303 12:44:16.342842 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55a641e6c54189b271eb4da40251d2d0669ccfc307e445794e068466f0aa7ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.342948 kubelet[2772]: E0303 12:44:16.342894 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55a641e6c54189b271eb4da40251d2d0669ccfc307e445794e068466f0aa7ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5885656486-zr9s9" Mar 3 12:44:16.342948 kubelet[2772]: E0303 12:44:16.342917 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a55a641e6c54189b271eb4da40251d2d0669ccfc307e445794e068466f0aa7ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5885656486-zr9s9" Mar 3 12:44:16.343247 kubelet[2772]: E0303 12:44:16.343200 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5885656486-zr9s9_calico-system(ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5885656486-zr9s9_calico-system(ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a55a641e6c54189b271eb4da40251d2d0669ccfc307e445794e068466f0aa7ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5885656486-zr9s9" podUID="ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e" Mar 3 12:44:16.363414 containerd[1519]: time="2026-03-03T12:44:16.363250007Z" level=error msg="Failed to destroy network for sandbox \"97e9bd0030df91ef047ad2178e64bda3259ec5c42ac48a3e7af0d3c322d02ad1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.365366 containerd[1519]: time="2026-03-03T12:44:16.365199422Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ldzpx,Uid:3115d2ff-25aa-4838-aef5-faf4076ac816,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"97e9bd0030df91ef047ad2178e64bda3259ec5c42ac48a3e7af0d3c322d02ad1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.368379 kubelet[2772]: E0303 12:44:16.367944 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97e9bd0030df91ef047ad2178e64bda3259ec5c42ac48a3e7af0d3c322d02ad1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.368379 
kubelet[2772]: E0303 12:44:16.368007 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97e9bd0030df91ef047ad2178e64bda3259ec5c42ac48a3e7af0d3c322d02ad1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-ldzpx" Mar 3 12:44:16.368379 kubelet[2772]: E0303 12:44:16.368028 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97e9bd0030df91ef047ad2178e64bda3259ec5c42ac48a3e7af0d3c322d02ad1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-ldzpx" Mar 3 12:44:16.368592 kubelet[2772]: E0303 12:44:16.368078 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-ldzpx_calico-system(3115d2ff-25aa-4838-aef5-faf4076ac816)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-ldzpx_calico-system(3115d2ff-25aa-4838-aef5-faf4076ac816)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"97e9bd0030df91ef047ad2178e64bda3259ec5c42ac48a3e7af0d3c322d02ad1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-ldzpx" podUID="3115d2ff-25aa-4838-aef5-faf4076ac816" Mar 3 12:44:16.369242 containerd[1519]: time="2026-03-03T12:44:16.369182814Z" level=error msg="Failed to destroy network for sandbox \"fd083c3a9c9b76c3a9513a755f02d91492159d3d8d6f9ac83be1c27bfe29d981\"" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.380607 containerd[1519]: time="2026-03-03T12:44:16.380536583Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-9wqps,Uid:5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd083c3a9c9b76c3a9513a755f02d91492159d3d8d6f9ac83be1c27bfe29d981\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.381653 kubelet[2772]: E0303 12:44:16.381307 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd083c3a9c9b76c3a9513a755f02d91492159d3d8d6f9ac83be1c27bfe29d981\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 12:44:16.382488 kubelet[2772]: E0303 12:44:16.381873 2772 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd083c3a9c9b76c3a9513a755f02d91492159d3d8d6f9ac83be1c27bfe29d981\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-f8cd6759c-9wqps" Mar 3 12:44:16.382488 kubelet[2772]: E0303 12:44:16.382218 2772 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd083c3a9c9b76c3a9513a755f02d91492159d3d8d6f9ac83be1c27bfe29d981\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-f8cd6759c-9wqps" Mar 3 12:44:16.382488 kubelet[2772]: E0303 12:44:16.382358 2772 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f8cd6759c-9wqps_calico-system(5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f8cd6759c-9wqps_calico-system(5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd083c3a9c9b76c3a9513a755f02d91492159d3d8d6f9ac83be1c27bfe29d981\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-f8cd6759c-9wqps" podUID="5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843" Mar 3 12:44:16.412387 containerd[1519]: time="2026-03-03T12:44:16.412333073Z" level=info msg="StartContainer for \"ef59015dbc8c62ca0f87b28ca85ab8ba3bf1a18d2c9f785561acc40a764f2073\" returns successfully" Mar 3 12:44:16.984301 systemd[1]: Created slice kubepods-besteffort-podaf4d9ea5_01b2_4705_ac48_8ebf47211342.slice - libcontainer container kubepods-besteffort-podaf4d9ea5_01b2_4705_ac48_8ebf47211342.slice. 
Mar 3 12:44:16.990032 containerd[1519]: time="2026-03-03T12:44:16.989704293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5rc66,Uid:af4d9ea5-01b2-4705-ac48-8ebf47211342,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:17.197529 systemd-networkd[1422]: cali5b89270fd3d: Link UP Mar 3 12:44:17.198772 kubelet[2772]: I0303 12:44:17.197360 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-d8vgh" podStartSLOduration=3.622047841 podStartE2EDuration="19.197339085s" podCreationTimestamp="2026-03-03 12:43:58 +0000 UTC" firstStartedPulling="2026-03-03 12:43:59.308203113 +0000 UTC m=+22.473136312" lastFinishedPulling="2026-03-03 12:44:14.883494397 +0000 UTC m=+38.048427556" observedRunningTime="2026-03-03 12:44:17.194698462 +0000 UTC m=+40.359631621" watchObservedRunningTime="2026-03-03 12:44:17.197339085 +0000 UTC m=+40.362272284" Mar 3 12:44:17.198221 systemd-networkd[1422]: cali5b89270fd3d: Gained carrier Mar 3 12:44:17.225566 containerd[1519]: 2026-03-03 12:44:17.019 [ERROR][3828] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 3 12:44:17.225566 containerd[1519]: 2026-03-03 12:44:17.049 [INFO][3828] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0 csi-node-driver- calico-system af4d9ea5-01b2-4705-ac48-8ebf47211342 700 0 2026-03-03 12:43:59 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef csi-node-driver-5rc66 eth0 csi-node-driver [] [] [kns.calico-system 
ksa.calico-system.csi-node-driver] cali5b89270fd3d [] [] }} ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-" Mar 3 12:44:17.225566 containerd[1519]: 2026-03-03 12:44:17.049 [INFO][3828] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.225566 containerd[1519]: 2026-03-03 12:44:17.096 [INFO][3840] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" HandleID="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.111 [INFO][3840] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" HandleID="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbed0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"csi-node-driver-5rc66", "timestamp":"2026-03-03 12:44:17.096317628 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004e9080)} Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.112 [INFO][3840] 
ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.112 [INFO][3840] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.112 [INFO][3840] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.117 [INFO][3840] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.129 [INFO][3840] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.135 [INFO][3840] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.139 [INFO][3840] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.225975 containerd[1519]: 2026-03-03 12:44:17.143 [INFO][3840] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.143 [INFO][3840] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.147 [INFO][3840] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.155 [INFO][3840] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 
handle="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.166 [INFO][3840] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.6.193/26] block=192.168.6.192/26 handle="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.167 [INFO][3840] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.193/26] handle="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.167 [INFO][3840] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:17.226354 containerd[1519]: 2026-03-03 12:44:17.167 [INFO][3840] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.193/26] IPv6=[] ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" HandleID="k8s-pod-network.cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.226684 containerd[1519]: 2026-03-03 12:44:17.178 [INFO][3828] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"af4d9ea5-01b2-4705-ac48-8ebf47211342", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 59, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"csi-node-driver-5rc66", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.6.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5b89270fd3d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:17.226745 containerd[1519]: 2026-03-03 12:44:17.178 [INFO][3828] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.193/32] ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.226745 containerd[1519]: 2026-03-03 12:44:17.178 [INFO][3828] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5b89270fd3d ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.226745 containerd[1519]: 2026-03-03 12:44:17.196 [INFO][3828] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.226799 containerd[1519]: 2026-03-03 12:44:17.197 [INFO][3828] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"af4d9ea5-01b2-4705-ac48-8ebf47211342", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b", Pod:"csi-node-driver-5rc66", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.6.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5b89270fd3d", MAC:"ae:af:2e:b2:18:58", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:17.228273 containerd[1519]: 2026-03-03 12:44:17.219 [INFO][3828] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" Namespace="calico-system" Pod="csi-node-driver-5rc66" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-csi--node--driver--5rc66-eth0" Mar 3 12:44:17.251399 containerd[1519]: time="2026-03-03T12:44:17.250255635Z" level=info msg="connecting to shim cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b" address="unix:///run/containerd/s/a521125734249ba4c5cfcc955cddef6e4301514c539992d3f0d2c441a2828972" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:17.256421 kubelet[2772]: I0303 12:44:17.256358 2772 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-ca-bundle\") pod \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " Mar 3 12:44:17.256421 kubelet[2772]: I0303 12:44:17.256424 2772 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-nginx-config\") pod \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " Mar 3 12:44:17.256757 kubelet[2772]: I0303 12:44:17.256459 2772 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-backend-key-pair\") pod \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " Mar 3 12:44:17.256757 kubelet[2772]: I0303 
12:44:17.256479 2772 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9dvl\" (UniqueName: \"kubernetes.io/projected/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-kube-api-access-c9dvl\") pod \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\" (UID: \"ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e\") " Mar 3 12:44:17.261126 kubelet[2772]: I0303 12:44:17.258289 2772 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e" (UID: "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 3 12:44:17.261126 kubelet[2772]: I0303 12:44:17.259039 2772 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e" (UID: "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 3 12:44:17.266884 kubelet[2772]: I0303 12:44:17.266843 2772 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-kube-api-access-c9dvl" (OuterVolumeSpecName: "kube-api-access-c9dvl") pod "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e" (UID: "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e"). InnerVolumeSpecName "kube-api-access-c9dvl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 3 12:44:17.268533 systemd[1]: var-lib-kubelet-pods-ebcf1a5f\x2da1af\x2d487c\x2d91d1\x2d62e2a1c97f7e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dc9dvl.mount: Deactivated successfully. 
Mar 3 12:44:17.274364 kubelet[2772]: I0303 12:44:17.274322 2772 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e" (UID: "ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 3 12:44:17.293353 systemd[1]: Started cri-containerd-cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b.scope - libcontainer container cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b. Mar 3 12:44:17.321232 containerd[1519]: time="2026-03-03T12:44:17.321193985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5rc66,Uid:af4d9ea5-01b2-4705-ac48-8ebf47211342,Namespace:calico-system,Attempt:0,} returns sandbox id \"cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b\"" Mar 3 12:44:17.324223 containerd[1519]: time="2026-03-03T12:44:17.324182852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 3 12:44:17.357328 kubelet[2772]: I0303 12:44:17.357278 2772 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-nginx-config\") on node \"ci-4459-2-4-8-fcaab3b7ef\" DevicePath \"\"" Mar 3 12:44:17.357328 kubelet[2772]: I0303 12:44:17.357312 2772 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-backend-key-pair\") on node \"ci-4459-2-4-8-fcaab3b7ef\" DevicePath \"\"" Mar 3 12:44:17.357328 kubelet[2772]: I0303 12:44:17.357325 2772 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9dvl\" (UniqueName: \"kubernetes.io/projected/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-kube-api-access-c9dvl\") on node 
\"ci-4459-2-4-8-fcaab3b7ef\" DevicePath \"\"" Mar 3 12:44:17.357328 kubelet[2772]: I0303 12:44:17.357335 2772 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e-whisker-ca-bundle\") on node \"ci-4459-2-4-8-fcaab3b7ef\" DevicePath \"\"" Mar 3 12:44:17.900048 systemd[1]: var-lib-kubelet-pods-ebcf1a5f\x2da1af\x2d487c\x2d91d1\x2d62e2a1c97f7e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 3 12:44:18.167588 systemd[1]: Removed slice kubepods-besteffort-podebcf1a5f_a1af_487c_91d1_62e2a1c97f7e.slice - libcontainer container kubepods-besteffort-podebcf1a5f_a1af_487c_91d1_62e2a1c97f7e.slice. Mar 3 12:44:18.243010 systemd[1]: Created slice kubepods-besteffort-pod32402fb1_0d20_4ce7_b4d4_51f761a289ca.slice - libcontainer container kubepods-besteffort-pod32402fb1_0d20_4ce7_b4d4_51f761a289ca.slice. Mar 3 12:44:18.309168 systemd-networkd[1422]: cali5b89270fd3d: Gained IPv6LL Mar 3 12:44:18.372666 kubelet[2772]: I0303 12:44:18.372610 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/32402fb1-0d20-4ce7-b4d4-51f761a289ca-nginx-config\") pod \"whisker-5b5c8d9967-tg2vr\" (UID: \"32402fb1-0d20-4ce7-b4d4-51f761a289ca\") " pod="calico-system/whisker-5b5c8d9967-tg2vr" Mar 3 12:44:18.373358 kubelet[2772]: I0303 12:44:18.373259 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5nz\" (UniqueName: \"kubernetes.io/projected/32402fb1-0d20-4ce7-b4d4-51f761a289ca-kube-api-access-2k5nz\") pod \"whisker-5b5c8d9967-tg2vr\" (UID: \"32402fb1-0d20-4ce7-b4d4-51f761a289ca\") " pod="calico-system/whisker-5b5c8d9967-tg2vr" Mar 3 12:44:18.373358 kubelet[2772]: I0303 12:44:18.373333 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/32402fb1-0d20-4ce7-b4d4-51f761a289ca-whisker-backend-key-pair\") pod \"whisker-5b5c8d9967-tg2vr\" (UID: \"32402fb1-0d20-4ce7-b4d4-51f761a289ca\") " pod="calico-system/whisker-5b5c8d9967-tg2vr" Mar 3 12:44:18.373589 kubelet[2772]: I0303 12:44:18.373371 2772 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32402fb1-0d20-4ce7-b4d4-51f761a289ca-whisker-ca-bundle\") pod \"whisker-5b5c8d9967-tg2vr\" (UID: \"32402fb1-0d20-4ce7-b4d4-51f761a289ca\") " pod="calico-system/whisker-5b5c8d9967-tg2vr" Mar 3 12:44:18.551155 containerd[1519]: time="2026-03-03T12:44:18.550998413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b5c8d9967-tg2vr,Uid:32402fb1-0d20-4ce7-b4d4-51f761a289ca,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:18.698294 systemd-networkd[1422]: cali930e1021034: Link UP Mar 3 12:44:18.699210 systemd-networkd[1422]: cali930e1021034: Gained carrier Mar 3 12:44:18.719179 containerd[1519]: 2026-03-03 12:44:18.578 [ERROR][4052] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 3 12:44:18.719179 containerd[1519]: 2026-03-03 12:44:18.606 [INFO][4052] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0 whisker-5b5c8d9967- calico-system 32402fb1-0d20-4ce7-b4d4-51f761a289ca 896 0 2026-03-03 12:44:18 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5b5c8d9967 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef whisker-5b5c8d9967-tg2vr eth0 whisker [] [] [kns.calico-system 
ksa.calico-system.whisker] cali930e1021034 [] [] }} ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-" Mar 3 12:44:18.719179 containerd[1519]: 2026-03-03 12:44:18.606 [INFO][4052] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.719179 containerd[1519]: 2026-03-03 12:44:18.637 [INFO][4062] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" HandleID="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.653 [INFO][4062] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" HandleID="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002733e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"whisker-5b5c8d9967-tg2vr", "timestamp":"2026-03-03 12:44:18.637203144 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003331e0)} Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.653 
[INFO][4062] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.653 [INFO][4062] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.653 [INFO][4062] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.657 [INFO][4062] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.663 [INFO][4062] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.669 [INFO][4062] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.672 [INFO][4062] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719470 containerd[1519]: 2026-03-03 12:44:18.675 [INFO][4062] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.675 [INFO][4062] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.677 [INFO][4062] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8 Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.682 [INFO][4062] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 
handle="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.692 [INFO][4062] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.6.194/26] block=192.168.6.192/26 handle="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.693 [INFO][4062] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.194/26] handle="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.693 [INFO][4062] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:18.719704 containerd[1519]: 2026-03-03 12:44:18.693 [INFO][4062] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.194/26] IPv6=[] ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" HandleID="k8s-pod-network.fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.719880 containerd[1519]: 2026-03-03 12:44:18.695 [INFO][4052] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0", GenerateName:"whisker-5b5c8d9967-", Namespace:"calico-system", SelfLink:"", UID:"32402fb1-0d20-4ce7-b4d4-51f761a289ca", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 44, 18, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b5c8d9967", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"whisker-5b5c8d9967-tg2vr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.6.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali930e1021034", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:18.719880 containerd[1519]: 2026-03-03 12:44:18.695 [INFO][4052] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.194/32] ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.719965 containerd[1519]: 2026-03-03 12:44:18.695 [INFO][4052] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali930e1021034 ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.719965 containerd[1519]: 2026-03-03 12:44:18.699 [INFO][4052] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" 
Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.720017 containerd[1519]: 2026-03-03 12:44:18.700 [INFO][4052] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0", GenerateName:"whisker-5b5c8d9967-", Namespace:"calico-system", SelfLink:"", UID:"32402fb1-0d20-4ce7-b4d4-51f761a289ca", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 44, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b5c8d9967", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8", Pod:"whisker-5b5c8d9967-tg2vr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.6.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali930e1021034", MAC:"4e:b2:3e:00:bd:12", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), 
QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:18.720077 containerd[1519]: 2026-03-03 12:44:18.715 [INFO][4052] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" Namespace="calico-system" Pod="whisker-5b5c8d9967-tg2vr" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-whisker--5b5c8d9967--tg2vr-eth0" Mar 3 12:44:18.747835 containerd[1519]: time="2026-03-03T12:44:18.747268111Z" level=info msg="connecting to shim fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8" address="unix:///run/containerd/s/7be081002947cfa4c8cfd35f2d431de4e3857fa00c0f14b7d974c4ff8c2f504a" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:18.777582 systemd[1]: Started cri-containerd-fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8.scope - libcontainer container fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8. Mar 3 12:44:18.823352 containerd[1519]: time="2026-03-03T12:44:18.823219540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b5c8d9967-tg2vr,Uid:32402fb1-0d20-4ce7-b4d4-51f761a289ca,Namespace:calico-system,Attempt:0,} returns sandbox id \"fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8\"" Mar 3 12:44:18.976438 kubelet[2772]: I0303 12:44:18.976383 2772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e" path="/var/lib/kubelet/pods/ebcf1a5f-a1af-487c-91d1-62e2a1c97f7e/volumes" Mar 3 12:44:19.972254 systemd-networkd[1422]: cali930e1021034: Gained IPv6LL Mar 3 12:44:20.303065 containerd[1519]: time="2026-03-03T12:44:20.302927790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:20.304179 containerd[1519]: time="2026-03-03T12:44:20.304122684Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 3 
12:44:20.305190 containerd[1519]: time="2026-03-03T12:44:20.305163217Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:20.308924 containerd[1519]: time="2026-03-03T12:44:20.308874580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:20.309843 containerd[1519]: time="2026-03-03T12:44:20.309731190Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 2.985502218s" Mar 3 12:44:20.309843 containerd[1519]: time="2026-03-03T12:44:20.309761351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 3 12:44:20.312486 containerd[1519]: time="2026-03-03T12:44:20.312452862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 3 12:44:20.321590 containerd[1519]: time="2026-03-03T12:44:20.321377727Z" level=info msg="CreateContainer within sandbox \"cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 3 12:44:20.333595 containerd[1519]: time="2026-03-03T12:44:20.333530630Z" level=info msg="Container bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:20.348098 containerd[1519]: time="2026-03-03T12:44:20.348026360Z" level=info msg="CreateContainer within sandbox 
\"cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b\"" Mar 3 12:44:20.350902 containerd[1519]: time="2026-03-03T12:44:20.349019572Z" level=info msg="StartContainer for \"bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b\"" Mar 3 12:44:20.352168 containerd[1519]: time="2026-03-03T12:44:20.352135209Z" level=info msg="connecting to shim bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b" address="unix:///run/containerd/s/a521125734249ba4c5cfcc955cddef6e4301514c539992d3f0d2c441a2828972" protocol=ttrpc version=3 Mar 3 12:44:20.389384 systemd[1]: Started cri-containerd-bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b.scope - libcontainer container bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b. Mar 3 12:44:20.470964 containerd[1519]: time="2026-03-03T12:44:20.470902525Z" level=info msg="StartContainer for \"bc1488f622d5d55d52f5c7364250e9a5f84b544160caa50dc435768a2dd1368b\" returns successfully" Mar 3 12:44:22.517420 containerd[1519]: time="2026-03-03T12:44:22.517330434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:22.518927 containerd[1519]: time="2026-03-03T12:44:22.518862495Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 3 12:44:22.519619 containerd[1519]: time="2026-03-03T12:44:22.519574025Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:22.522865 containerd[1519]: time="2026-03-03T12:44:22.522809428Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:22.523766 containerd[1519]: time="2026-03-03T12:44:22.523248874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 2.210761571s" Mar 3 12:44:22.523766 containerd[1519]: time="2026-03-03T12:44:22.523280475Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 3 12:44:22.525097 containerd[1519]: time="2026-03-03T12:44:22.525068899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 3 12:44:22.529679 containerd[1519]: time="2026-03-03T12:44:22.529639921Z" level=info msg="CreateContainer within sandbox \"fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 3 12:44:22.538697 containerd[1519]: time="2026-03-03T12:44:22.538212477Z" level=info msg="Container f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:22.551134 containerd[1519]: time="2026-03-03T12:44:22.550525563Z" level=info msg="CreateContainer within sandbox \"fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332\"" Mar 3 12:44:22.553732 containerd[1519]: time="2026-03-03T12:44:22.552362588Z" level=info msg="StartContainer for 
\"f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332\"" Mar 3 12:44:22.558103 containerd[1519]: time="2026-03-03T12:44:22.558018705Z" level=info msg="connecting to shim f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332" address="unix:///run/containerd/s/7be081002947cfa4c8cfd35f2d431de4e3857fa00c0f14b7d974c4ff8c2f504a" protocol=ttrpc version=3 Mar 3 12:44:22.585364 systemd[1]: Started cri-containerd-f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332.scope - libcontainer container f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332. Mar 3 12:44:22.647944 containerd[1519]: time="2026-03-03T12:44:22.647879080Z" level=info msg="StartContainer for \"f04392e8d855c47de09d42212a8f67c523454e187b450137b78b91c77be2e332\" returns successfully" Mar 3 12:44:23.587537 kubelet[2772]: I0303 12:44:23.587464 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 12:44:24.478250 containerd[1519]: time="2026-03-03T12:44:24.478199749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:24.480577 containerd[1519]: time="2026-03-03T12:44:24.480531864Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 3 12:44:24.482229 containerd[1519]: time="2026-03-03T12:44:24.482192930Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:24.485904 containerd[1519]: time="2026-03-03T12:44:24.485856585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:24.486678 containerd[1519]: 
time="2026-03-03T12:44:24.486615797Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.960445203s" Mar 3 12:44:24.486678 containerd[1519]: time="2026-03-03T12:44:24.486650997Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 3 12:44:24.489819 containerd[1519]: time="2026-03-03T12:44:24.489754004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 3 12:44:24.493802 containerd[1519]: time="2026-03-03T12:44:24.493762905Z" level=info msg="CreateContainer within sandbox \"cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 3 12:44:24.506529 containerd[1519]: time="2026-03-03T12:44:24.506223134Z" level=info msg="Container c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:24.542284 containerd[1519]: time="2026-03-03T12:44:24.541772154Z" level=info msg="CreateContainer within sandbox \"cedc72072581e364d0bb5b1a81a9ae0f76d772e06ec10df45a4870909c6c4c6b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab\"" Mar 3 12:44:24.544507 containerd[1519]: time="2026-03-03T12:44:24.543642703Z" level=info msg="StartContainer for \"c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab\"" Mar 3 12:44:24.548693 containerd[1519]: time="2026-03-03T12:44:24.548607738Z" level=info msg="connecting to 
shim c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab" address="unix:///run/containerd/s/a521125734249ba4c5cfcc955cddef6e4301514c539992d3f0d2c441a2828972" protocol=ttrpc version=3 Mar 3 12:44:24.579556 systemd[1]: Started cri-containerd-c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab.scope - libcontainer container c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab. Mar 3 12:44:24.691101 containerd[1519]: time="2026-03-03T12:44:24.691010340Z" level=info msg="StartContainer for \"c61958a270c701746c6c75694d6251fdf472dc784380ee7c1b6f4fd134e1cbab\" returns successfully" Mar 3 12:44:25.055783 kubelet[2772]: I0303 12:44:25.055745 2772 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 3 12:44:25.062432 kubelet[2772]: I0303 12:44:25.062370 2772 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 3 12:44:25.190864 systemd-networkd[1422]: vxlan.calico: Link UP Mar 3 12:44:25.190873 systemd-networkd[1422]: vxlan.calico: Gained carrier Mar 3 12:44:25.260622 kubelet[2772]: I0303 12:44:25.260553 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5rc66" podStartSLOduration=19.095489018 podStartE2EDuration="26.260519194s" podCreationTimestamp="2026-03-03 12:43:59 +0000 UTC" firstStartedPulling="2026-03-03 12:44:17.322895841 +0000 UTC m=+40.487829000" lastFinishedPulling="2026-03-03 12:44:24.487926017 +0000 UTC m=+47.652859176" observedRunningTime="2026-03-03 12:44:25.257612907 +0000 UTC m=+48.422546066" watchObservedRunningTime="2026-03-03 12:44:25.260519194 +0000 UTC m=+48.425452353" Mar 3 12:44:26.826629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1801629597.mount: Deactivated successfully. 
Mar 3 12:44:26.845482 containerd[1519]: time="2026-03-03T12:44:26.844153740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:26.845482 containerd[1519]: time="2026-03-03T12:44:26.845431361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 3 12:44:26.846072 containerd[1519]: time="2026-03-03T12:44:26.846025811Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:26.848542 containerd[1519]: time="2026-03-03T12:44:26.848490693Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:26.849524 containerd[1519]: time="2026-03-03T12:44:26.849485309Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.359691144s" Mar 3 12:44:26.849524 containerd[1519]: time="2026-03-03T12:44:26.849521470Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 3 12:44:26.857707 containerd[1519]: time="2026-03-03T12:44:26.857615525Z" level=info msg="CreateContainer within sandbox \"fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 3 12:44:26.871365 
containerd[1519]: time="2026-03-03T12:44:26.869132758Z" level=info msg="Container b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:26.884960 containerd[1519]: time="2026-03-03T12:44:26.884838101Z" level=info msg="CreateContainer within sandbox \"fe9d69d0aa692276e61d0f6e025354041f7bd8d6e254a0813b5b99b336fe23a8\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c\"" Mar 3 12:44:26.888817 containerd[1519]: time="2026-03-03T12:44:26.888332400Z" level=info msg="StartContainer for \"b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c\"" Mar 3 12:44:26.891364 containerd[1519]: time="2026-03-03T12:44:26.891312570Z" level=info msg="connecting to shim b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c" address="unix:///run/containerd/s/7be081002947cfa4c8cfd35f2d431de4e3857fa00c0f14b7d974c4ff8c2f504a" protocol=ttrpc version=3 Mar 3 12:44:26.920384 systemd[1]: Started cri-containerd-b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c.scope - libcontainer container b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c. 
Mar 3 12:44:26.973037 containerd[1519]: time="2026-03-03T12:44:26.972993817Z" level=info msg="StartContainer for \"b31b1c3fbe7727224e08047efa0d28eb9fc668e6789573ab4e63849329ea3c5c\" returns successfully" Mar 3 12:44:27.012228 systemd-networkd[1422]: vxlan.calico: Gained IPv6LL Mar 3 12:44:27.975471 containerd[1519]: time="2026-03-03T12:44:27.975368083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xhfv4,Uid:d6331020-20e1-4302-b0cb-7c30f36149f0,Namespace:kube-system,Attempt:0,}" Mar 3 12:44:28.125637 systemd-networkd[1422]: cali9d273b39982: Link UP Mar 3 12:44:28.127090 systemd-networkd[1422]: cali9d273b39982: Gained carrier Mar 3 12:44:28.144026 kubelet[2772]: I0303 12:44:28.143498 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5b5c8d9967-tg2vr" podStartSLOduration=2.118510878 podStartE2EDuration="10.143475605s" podCreationTimestamp="2026-03-03 12:44:18 +0000 UTC" firstStartedPulling="2026-03-03 12:44:18.826071968 +0000 UTC m=+41.991005127" lastFinishedPulling="2026-03-03 12:44:26.851036695 +0000 UTC m=+50.015969854" observedRunningTime="2026-03-03 12:44:27.267057258 +0000 UTC m=+50.431990417" watchObservedRunningTime="2026-03-03 12:44:28.143475605 +0000 UTC m=+51.308408804" Mar 3 12:44:28.148664 containerd[1519]: 2026-03-03 12:44:28.033 [INFO][4525] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0 coredns-66bc5c9577- kube-system d6331020-20e1-4302-b0cb-7c30f36149f0 826 0 2026-03-03 12:43:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef coredns-66bc5c9577-xhfv4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9d273b39982 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 
9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-" Mar 3 12:44:28.148664 containerd[1519]: 2026-03-03 12:44:28.033 [INFO][4525] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.148664 containerd[1519]: 2026-03-03 12:44:28.063 [INFO][4536] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" HandleID="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.075 [INFO][4536] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" HandleID="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"coredns-66bc5c9577-xhfv4", "timestamp":"2026-03-03 12:44:28.063952517 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030f080)} Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 
12:44:28.075 [INFO][4536] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.075 [INFO][4536] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.075 [INFO][4536] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.079 [INFO][4536] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.085 [INFO][4536] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.092 [INFO][4536] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.095 [INFO][4536] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.148902 containerd[1519]: 2026-03-03 12:44:28.100 [INFO][4536] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.100 [INFO][4536] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.102 [INFO][4536] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137 Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.107 [INFO][4536] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 
handle="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.115 [INFO][4536] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.6.195/26] block=192.168.6.192/26 handle="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.116 [INFO][4536] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.195/26] handle="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.116 [INFO][4536] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:28.149593 containerd[1519]: 2026-03-03 12:44:28.116 [INFO][4536] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.195/26] IPv6=[] ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" HandleID="k8s-pod-network.79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.149749 containerd[1519]: 2026-03-03 12:44:28.120 [INFO][4525] cni-plugin/k8s.go 418: Populated endpoint ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"d6331020-20e1-4302-b0cb-7c30f36149f0", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 42, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"coredns-66bc5c9577-xhfv4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d273b39982", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:28.149749 containerd[1519]: 2026-03-03 12:44:28.120 [INFO][4525] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.195/32] ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" 
WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.149749 containerd[1519]: 2026-03-03 12:44:28.120 [INFO][4525] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d273b39982 ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.149749 containerd[1519]: 2026-03-03 12:44:28.127 [INFO][4525] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.149749 containerd[1519]: 2026-03-03 12:44:28.128 [INFO][4525] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"d6331020-20e1-4302-b0cb-7c30f36149f0", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137", Pod:"coredns-66bc5c9577-xhfv4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d273b39982", MAC:"52:60:77:cc:a3:9f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:28.149947 containerd[1519]: 2026-03-03 12:44:28.145 [INFO][4525] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" Namespace="kube-system" Pod="coredns-66bc5c9577-xhfv4" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--xhfv4-eth0" Mar 3 12:44:28.186487 containerd[1519]: time="2026-03-03T12:44:28.186354986Z" level=info msg="connecting to shim 79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137" 
address="unix:///run/containerd/s/844943abb260343bcacb97cd9e0129652ad4080a963633024a2d7b269513809d" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:28.221516 systemd[1]: Started cri-containerd-79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137.scope - libcontainer container 79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137. Mar 3 12:44:28.281900 containerd[1519]: time="2026-03-03T12:44:28.281848284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xhfv4,Uid:d6331020-20e1-4302-b0cb-7c30f36149f0,Namespace:kube-system,Attempt:0,} returns sandbox id \"79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137\"" Mar 3 12:44:28.290148 containerd[1519]: time="2026-03-03T12:44:28.289762908Z" level=info msg="CreateContainer within sandbox \"79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 3 12:44:28.304492 containerd[1519]: time="2026-03-03T12:44:28.304196451Z" level=info msg="Container 8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:28.313959 containerd[1519]: time="2026-03-03T12:44:28.313917108Z" level=info msg="CreateContainer within sandbox \"79358fbc1ca48a3846e07c0c4ccc4b47ac7337e9bb2b35ab493d408311a8a137\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d\"" Mar 3 12:44:28.315966 containerd[1519]: time="2026-03-03T12:44:28.315380574Z" level=info msg="StartContainer for \"8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d\"" Mar 3 12:44:28.317159 containerd[1519]: time="2026-03-03T12:44:28.317082685Z" level=info msg="connecting to shim 8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d" address="unix:///run/containerd/s/844943abb260343bcacb97cd9e0129652ad4080a963633024a2d7b269513809d" protocol=ttrpc version=3 Mar 3 12:44:28.335369 
systemd[1]: Started cri-containerd-8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d.scope - libcontainer container 8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d. Mar 3 12:44:28.367085 containerd[1519]: time="2026-03-03T12:44:28.366911553Z" level=info msg="StartContainer for \"8613c27eee880032b6afe0e1a583fcc70f1e7a29b4c8335c34052c4cae51494d\" returns successfully" Mar 3 12:44:28.975546 containerd[1519]: time="2026-03-03T12:44:28.975358309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-wdvrv,Uid:eccd9c30-d55c-4dea-9b0c-0692a8266820,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:28.981354 containerd[1519]: time="2026-03-03T12:44:28.980877370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-9wqps,Uid:5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:28.982863 containerd[1519]: time="2026-03-03T12:44:28.982824925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-k647t,Uid:3a3236b5-09aa-4b76-9ecf-67f6d695afd2,Namespace:kube-system,Attempt:0,}" Mar 3 12:44:29.174547 systemd-networkd[1422]: calica86bca8fab: Link UP Mar 3 12:44:29.175933 systemd-networkd[1422]: calica86bca8fab: Gained carrier Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.057 [INFO][4653] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0 coredns-66bc5c9577- kube-system 3a3236b5-09aa-4b76-9ecf-67f6d695afd2 822 0 2026-03-03 12:43:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef coredns-66bc5c9577-k647t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calica86bca8fab [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics 
TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.057 [INFO][4653] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.102 [INFO][4678] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" HandleID="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.119 [INFO][4678] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" HandleID="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f9e80), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"coredns-66bc5c9577-k647t", "timestamp":"2026-03-03 12:44:29.102605416 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001866e0)} Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 
12:44:29.119 [INFO][4678] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.119 [INFO][4678] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.119 [INFO][4678] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.124 [INFO][4678] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.141 [INFO][4678] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.147 [INFO][4678] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.149 [INFO][4678] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.153 [INFO][4678] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.153 [INFO][4678] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.155 [INFO][4678] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.160 [INFO][4678] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 
handle="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.168 [INFO][4678] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.6.196/26] block=192.168.6.192/26 handle="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.168 [INFO][4678] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.196/26] handle="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.168 [INFO][4678] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:29.196487 containerd[1519]: 2026-03-03 12:44:29.168 [INFO][4678] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.196/26] IPv6=[] ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" HandleID="k8s-pod-network.5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.197259 containerd[1519]: 2026-03-03 12:44:29.172 [INFO][4653] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"3a3236b5-09aa-4b76-9ecf-67f6d695afd2", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 42, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"coredns-66bc5c9577-k647t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calica86bca8fab", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:29.197259 containerd[1519]: 2026-03-03 12:44:29.172 [INFO][4653] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.196/32] ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" 
WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.197259 containerd[1519]: 2026-03-03 12:44:29.172 [INFO][4653] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica86bca8fab ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.197259 containerd[1519]: 2026-03-03 12:44:29.175 [INFO][4653] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.197259 containerd[1519]: 2026-03-03 12:44:29.176 [INFO][4653] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"3a3236b5-09aa-4b76-9ecf-67f6d695afd2", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e", Pod:"coredns-66bc5c9577-k647t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calica86bca8fab", MAC:"b2:a2:69:6c:09:31", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:29.198958 containerd[1519]: 2026-03-03 12:44:29.195 [INFO][4653] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" Namespace="kube-system" Pod="coredns-66bc5c9577-k647t" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-coredns--66bc5c9577--k647t-eth0" Mar 3 12:44:29.242707 containerd[1519]: time="2026-03-03T12:44:29.242235576Z" level=info msg="connecting to shim 5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e" 
address="unix:///run/containerd/s/1b4836dcf33cd5a3f867a677f67dfc6e9f499b0a4b1c1f148f2aa79caecc5147" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:29.292407 kubelet[2772]: I0303 12:44:29.292189 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-xhfv4" podStartSLOduration=47.29217 podStartE2EDuration="47.29217s" podCreationTimestamp="2026-03-03 12:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 12:44:29.281773483 +0000 UTC m=+52.446706642" watchObservedRunningTime="2026-03-03 12:44:29.29217 +0000 UTC m=+52.457103159" Mar 3 12:44:29.294310 systemd[1]: Started cri-containerd-5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e.scope - libcontainer container 5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e. Mar 3 12:44:29.313620 systemd-networkd[1422]: cali324c1bd5333: Link UP Mar 3 12:44:29.316446 systemd-networkd[1422]: cali324c1bd5333: Gained carrier Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.064 [INFO][4642] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0 calico-apiserver-f8cd6759c- calico-system eccd9c30-d55c-4dea-9b0c-0692a8266820 831 0 2026-03-03 12:43:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f8cd6759c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef calico-apiserver-f8cd6759c-wdvrv eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali324c1bd5333 [] [] }} ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" 
Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.066 [INFO][4642] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.126 [INFO][4683] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" HandleID="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.141 [INFO][4683] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" HandleID="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003638c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"calico-apiserver-f8cd6759c-wdvrv", "timestamp":"2026-03-03 12:44:29.126368506 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000133080)} Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.141 [INFO][4683] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.168 [INFO][4683] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.168 [INFO][4683] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.224 [INFO][4683] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.244 [INFO][4683] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.262 [INFO][4683] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.266 [INFO][4683] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.270 [INFO][4683] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.270 [INFO][4683] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.274 [INFO][4683] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7 Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.287 [INFO][4683] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 handle="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" 
host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.307 [INFO][4683] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.6.197/26] block=192.168.6.192/26 handle="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.307 [INFO][4683] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.197/26] handle="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.307 [INFO][4683] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:29.351129 containerd[1519]: 2026-03-03 12:44:29.307 [INFO][4683] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.197/26] IPv6=[] ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" HandleID="k8s-pod-network.97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.353001 containerd[1519]: 2026-03-03 12:44:29.309 [INFO][4642] cni-plugin/k8s.go 418: Populated endpoint ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0", GenerateName:"calico-apiserver-f8cd6759c-", Namespace:"calico-system", SelfLink:"", UID:"eccd9c30-d55c-4dea-9b0c-0692a8266820", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 56, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f8cd6759c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"calico-apiserver-f8cd6759c-wdvrv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali324c1bd5333", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:29.353001 containerd[1519]: 2026-03-03 12:44:29.309 [INFO][4642] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.197/32] ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.353001 containerd[1519]: 2026-03-03 12:44:29.309 [INFO][4642] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali324c1bd5333 ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.353001 containerd[1519]: 2026-03-03 12:44:29.315 [INFO][4642] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.353001 containerd[1519]: 2026-03-03 12:44:29.316 [INFO][4642] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0", GenerateName:"calico-apiserver-f8cd6759c-", Namespace:"calico-system", SelfLink:"", UID:"eccd9c30-d55c-4dea-9b0c-0692a8266820", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f8cd6759c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7", Pod:"calico-apiserver-f8cd6759c-wdvrv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali324c1bd5333", MAC:"16:5a:1b:13:16:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:29.353001 containerd[1519]: 2026-03-03 12:44:29.347 [INFO][4642] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-wdvrv" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--wdvrv-eth0" Mar 3 12:44:29.379733 systemd-networkd[1422]: cali9d273b39982: Gained IPv6LL Mar 3 12:44:29.410868 containerd[1519]: time="2026-03-03T12:44:29.410824762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-k647t,Uid:3a3236b5-09aa-4b76-9ecf-67f6d695afd2,Namespace:kube-system,Attempt:0,} returns sandbox id \"5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e\"" Mar 3 12:44:29.422981 containerd[1519]: time="2026-03-03T12:44:29.422811469Z" level=info msg="CreateContainer within sandbox \"5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 3 12:44:29.437902 containerd[1519]: time="2026-03-03T12:44:29.437866153Z" level=info msg="Container cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:29.439613 systemd-networkd[1422]: cali1f294af521e: Link UP Mar 3 12:44:29.440023 systemd-networkd[1422]: cali1f294af521e: Gained carrier Mar 3 12:44:29.455182 containerd[1519]: time="2026-03-03T12:44:29.453751854Z" level=info msg="connecting to shim 97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7" address="unix:///run/containerd/s/cf239178012c918482d5476a36755d79de40fd2254b8a6de3577dbd1d6fdc3bf" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:29.457564 
containerd[1519]: time="2026-03-03T12:44:29.457508365Z" level=info msg="CreateContainer within sandbox \"5b0dfc86f1406b377f2a553a7a168072617b1b861111c2e1a73993e15073490e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981\"" Mar 3 12:44:29.462525 containerd[1519]: time="2026-03-03T12:44:29.462375457Z" level=info msg="StartContainer for \"cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981\"" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.079 [INFO][4651] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0 calico-apiserver-f8cd6759c- calico-system 5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843 832 0 2026-03-03 12:43:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f8cd6759c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef calico-apiserver-f8cd6759c-9wqps eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1f294af521e [] [] }} ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.080 [INFO][4651] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.146 [INFO][4689] ipam/ipam_plugin.go 
235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" HandleID="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.163 [INFO][4689] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" HandleID="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004da10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"calico-apiserver-f8cd6759c-9wqps", "timestamp":"2026-03-03 12:44:29.146631089 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002e31e0)} Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.163 [INFO][4689] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.307 [INFO][4689] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.307 [INFO][4689] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.336 [INFO][4689] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.362 [INFO][4689] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.385 [INFO][4689] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.393 [INFO][4689] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.398 [INFO][4689] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.399 [INFO][4689] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.403 [INFO][4689] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045 Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.413 [INFO][4689] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 handle="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.427 [INFO][4689] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.6.198/26] block=192.168.6.192/26 handle="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.427 [INFO][4689] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.198/26] handle="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.427 [INFO][4689] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:29.468009 containerd[1519]: 2026-03-03 12:44:29.427 [INFO][4689] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.198/26] IPv6=[] ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" HandleID="k8s-pod-network.b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.470057 containerd[1519]: 2026-03-03 12:44:29.433 [INFO][4651] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0", GenerateName:"calico-apiserver-f8cd6759c-", Namespace:"calico-system", SelfLink:"", UID:"5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"f8cd6759c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"calico-apiserver-f8cd6759c-9wqps", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1f294af521e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:29.470057 containerd[1519]: 2026-03-03 12:44:29.433 [INFO][4651] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.198/32] ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.470057 containerd[1519]: 2026-03-03 12:44:29.433 [INFO][4651] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f294af521e ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.470057 containerd[1519]: 2026-03-03 12:44:29.439 [INFO][4651] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" 
WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.470057 containerd[1519]: 2026-03-03 12:44:29.442 [INFO][4651] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0", GenerateName:"calico-apiserver-f8cd6759c-", Namespace:"calico-system", SelfLink:"", UID:"5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f8cd6759c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045", Pod:"calico-apiserver-f8cd6759c-9wqps", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1f294af521e", MAC:"ce:22:ed:0b:aa:a4", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:29.470057 containerd[1519]: 2026-03-03 12:44:29.462 [INFO][4651] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" Namespace="calico-system" Pod="calico-apiserver-f8cd6759c-9wqps" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--apiserver--f8cd6759c--9wqps-eth0" Mar 3 12:44:29.470057 containerd[1519]: time="2026-03-03T12:44:29.468026924Z" level=info msg="connecting to shim cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981" address="unix:///run/containerd/s/1b4836dcf33cd5a3f867a677f67dfc6e9f499b0a4b1c1f148f2aa79caecc5147" protocol=ttrpc version=3 Mar 3 12:44:29.499094 systemd[1]: Started cri-containerd-cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981.scope - libcontainer container cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981. Mar 3 12:44:29.520441 systemd[1]: Started cri-containerd-97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7.scope - libcontainer container 97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7. Mar 3 12:44:29.527273 containerd[1519]: time="2026-03-03T12:44:29.527159681Z" level=info msg="connecting to shim b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045" address="unix:///run/containerd/s/c61e4ddbcb2bdb3198e4d45ee999c5de71626bcf9b986a8fe663568978df085d" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:29.574307 systemd[1]: Started cri-containerd-b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045.scope - libcontainer container b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045. 
Mar 3 12:44:29.581557 containerd[1519]: time="2026-03-03T12:44:29.581240784Z" level=info msg="StartContainer for \"cd32634051e804c97362cb5279ba2e1758e28d14b7f8d8f81a00fad3e2e8c981\" returns successfully" Mar 3 12:44:29.654938 containerd[1519]: time="2026-03-03T12:44:29.654887856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-9wqps,Uid:5676c205-9bdf-4ef6-9fc8-1d7ffb5e1843,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045\"" Mar 3 12:44:29.662703 containerd[1519]: time="2026-03-03T12:44:29.662389637Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 3 12:44:29.690042 containerd[1519]: time="2026-03-03T12:44:29.689397428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f8cd6759c-wdvrv,Uid:eccd9c30-d55c-4dea-9b0c-0692a8266820,Namespace:calico-system,Attempt:0,} returns sandbox id \"97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7\"" Mar 3 12:44:30.283044 kubelet[2772]: I0303 12:44:30.282948 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-k647t" podStartSLOduration=48.282930197 podStartE2EDuration="48.282930197s" podCreationTimestamp="2026-03-03 12:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 12:44:30.282023219 +0000 UTC m=+53.446956458" watchObservedRunningTime="2026-03-03 12:44:30.282930197 +0000 UTC m=+53.447863356" Mar 3 12:44:30.915343 systemd-networkd[1422]: calica86bca8fab: Gained IPv6LL Mar 3 12:44:30.918671 systemd-networkd[1422]: cali324c1bd5333: Gained IPv6LL Mar 3 12:44:30.975618 containerd[1519]: time="2026-03-03T12:44:30.975144108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ldzpx,Uid:3115d2ff-25aa-4838-aef5-faf4076ac816,Namespace:calico-system,Attempt:0,}" Mar 3 
12:44:31.118940 systemd-networkd[1422]: cali3551d82233f: Link UP Mar 3 12:44:31.119317 systemd-networkd[1422]: cali3551d82233f: Gained carrier Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.021 [INFO][4944] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0 goldmane-cccfbd5cf- calico-system 3115d2ff-25aa-4838-aef5-faf4076ac816 830 0 2026-03-03 12:43:56 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef goldmane-cccfbd5cf-ldzpx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3551d82233f [] [] }} ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.021 [INFO][4944] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.052 [INFO][4954] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" HandleID="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.066 [INFO][4954] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" HandleID="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbdd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-8-fcaab3b7ef", "pod":"goldmane-cccfbd5cf-ldzpx", "timestamp":"2026-03-03 12:44:31.052285332 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002e58c0)} Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.066 [INFO][4954] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.066 [INFO][4954] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.066 [INFO][4954] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.069 [INFO][4954] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.075 [INFO][4954] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.083 [INFO][4954] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.085 [INFO][4954] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.088 [INFO][4954] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.089 [INFO][4954] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 handle="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.091 [INFO][4954] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317 Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.097 [INFO][4954] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 handle="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.111 [INFO][4954] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.6.199/26] block=192.168.6.192/26 handle="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.111 [INFO][4954] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.199/26] handle="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.111 [INFO][4954] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 12:44:31.138845 containerd[1519]: 2026-03-03 12:44:31.111 [INFO][4954] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.199/26] IPv6=[] ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" HandleID="k8s-pod-network.9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.143851 containerd[1519]: 2026-03-03 12:44:31.114 [INFO][4944] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"3115d2ff-25aa-4838-aef5-faf4076ac816", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"goldmane-cccfbd5cf-ldzpx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.6.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3551d82233f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:31.143851 containerd[1519]: 2026-03-03 12:44:31.115 [INFO][4944] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.199/32] ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.143851 containerd[1519]: 2026-03-03 12:44:31.115 [INFO][4944] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3551d82233f ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.143851 containerd[1519]: 2026-03-03 12:44:31.119 [INFO][4944] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.143851 containerd[1519]: 2026-03-03 12:44:31.120 [INFO][4944] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"3115d2ff-25aa-4838-aef5-faf4076ac816", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317", Pod:"goldmane-cccfbd5cf-ldzpx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.6.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3551d82233f", MAC:"56:b7:77:85:b1:4c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:31.143851 containerd[1519]: 2026-03-03 12:44:31.134 [INFO][4944] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" Namespace="calico-system" 
Pod="goldmane-cccfbd5cf-ldzpx" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-goldmane--cccfbd5cf--ldzpx-eth0" Mar 3 12:44:31.181697 containerd[1519]: time="2026-03-03T12:44:31.179999876Z" level=info msg="connecting to shim 9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317" address="unix:///run/containerd/s/e5787cbaa399ff59b7f4a90cf4ad0458aaa471a38d9a9c3b7be7e079a04ccfa9" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:31.220182 systemd[1]: Started cri-containerd-9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317.scope - libcontainer container 9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317. Mar 3 12:44:31.285528 containerd[1519]: time="2026-03-03T12:44:31.285312446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ldzpx,Uid:3115d2ff-25aa-4838-aef5-faf4076ac816,Namespace:calico-system,Attempt:0,} returns sandbox id \"9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317\"" Mar 3 12:44:31.363833 systemd-networkd[1422]: cali1f294af521e: Gained IPv6LL Mar 3 12:44:31.975837 containerd[1519]: time="2026-03-03T12:44:31.975776415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-569b58c775-xc222,Uid:5babebd3-6e97-412e-b6a1-5908db32c25f,Namespace:calico-system,Attempt:0,}" Mar 3 12:44:32.163504 systemd-networkd[1422]: cali43f4a5558fc: Link UP Mar 3 12:44:32.164693 systemd-networkd[1422]: cali43f4a5558fc: Gained carrier Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.023 [INFO][5032] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0 calico-kube-controllers-569b58c775- calico-system 5babebd3-6e97-412e-b6a1-5908db32c25f 828 0 2026-03-03 12:43:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:569b58c775 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-8-fcaab3b7ef calico-kube-controllers-569b58c775-xc222 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali43f4a5558fc [] [] }} ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.024 [INFO][5032] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.056 [INFO][5045] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" HandleID="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.067 [INFO][5045] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" HandleID="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f3280), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-8-fcaab3b7ef", 
"pod":"calico-kube-controllers-569b58c775-xc222", "timestamp":"2026-03-03 12:44:32.056572925 +0000 UTC"}, Hostname:"ci-4459-2-4-8-fcaab3b7ef", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002a8f20)} Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.067 [INFO][5045] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.068 [INFO][5045] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.068 [INFO][5045] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-8-fcaab3b7ef' Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.073 [INFO][5045] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.081 [INFO][5045] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.106 [INFO][5045] ipam/ipam.go 526: Trying affinity for 192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.111 [INFO][5045] ipam/ipam.go 160: Attempting to load block cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.115 [INFO][5045] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.6.192/26 host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.115 [INFO][5045] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.6.192/26 
handle="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.118 [INFO][5045] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.127 [INFO][5045] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.6.192/26 handle="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.144 [INFO][5045] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.6.200/26] block=192.168.6.192/26 handle="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.144 [INFO][5045] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.6.200/26] handle="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" host="ci-4459-2-4-8-fcaab3b7ef" Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.144 [INFO][5045] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 3 12:44:32.184069 containerd[1519]: 2026-03-03 12:44:32.144 [INFO][5045] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.6.200/26] IPv6=[] ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" HandleID="k8s-pod-network.f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Workload="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.184769 containerd[1519]: 2026-03-03 12:44:32.150 [INFO][5032] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0", GenerateName:"calico-kube-controllers-569b58c775-", Namespace:"calico-system", SelfLink:"", UID:"5babebd3-6e97-412e-b6a1-5908db32c25f", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"569b58c775", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"", Pod:"calico-kube-controllers-569b58c775-xc222", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.6.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali43f4a5558fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:32.184769 containerd[1519]: 2026-03-03 12:44:32.151 [INFO][5032] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.6.200/32] ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.184769 containerd[1519]: 2026-03-03 12:44:32.151 [INFO][5032] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43f4a5558fc ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.184769 containerd[1519]: 2026-03-03 12:44:32.168 [INFO][5032] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.184769 containerd[1519]: 2026-03-03 12:44:32.169 [INFO][5032] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" Pod="calico-kube-controllers-569b58c775-xc222" 
WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0", GenerateName:"calico-kube-controllers-569b58c775-", Namespace:"calico-system", SelfLink:"", UID:"5babebd3-6e97-412e-b6a1-5908db32c25f", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 12, 43, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"569b58c775", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-8-fcaab3b7ef", ContainerID:"f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd", Pod:"calico-kube-controllers-569b58c775-xc222", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.6.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali43f4a5558fc", MAC:"aa:db:dc:03:67:4e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 12:44:32.184769 containerd[1519]: 2026-03-03 12:44:32.180 [INFO][5032] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" Namespace="calico-system" 
Pod="calico-kube-controllers-569b58c775-xc222" WorkloadEndpoint="ci--4459--2--4--8--fcaab3b7ef-k8s-calico--kube--controllers--569b58c775--xc222-eth0" Mar 3 12:44:32.232460 containerd[1519]: time="2026-03-03T12:44:32.232254430Z" level=info msg="connecting to shim f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd" address="unix:///run/containerd/s/f2420c969f91c13fac0f07d98ef8f56534b81a1c19f9cbbb0a390a1be6ceb08f" namespace=k8s.io protocol=ttrpc version=3 Mar 3 12:44:32.273409 systemd[1]: Started cri-containerd-f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd.scope - libcontainer container f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd. Mar 3 12:44:32.367505 containerd[1519]: time="2026-03-03T12:44:32.367461251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-569b58c775-xc222,Uid:5babebd3-6e97-412e-b6a1-5908db32c25f,Namespace:calico-system,Attempt:0,} returns sandbox id \"f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd\"" Mar 3 12:44:32.707410 systemd-networkd[1422]: cali3551d82233f: Gained IPv6LL Mar 3 12:44:32.873011 containerd[1519]: time="2026-03-03T12:44:32.872945958Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:32.874792 containerd[1519]: time="2026-03-03T12:44:32.874723355Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 3 12:44:32.875249 containerd[1519]: time="2026-03-03T12:44:32.875202685Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:32.878358 containerd[1519]: time="2026-03-03T12:44:32.878303630Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:32.879133 containerd[1519]: time="2026-03-03T12:44:32.879090286Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.216500125s" Mar 3 12:44:32.879312 containerd[1519]: time="2026-03-03T12:44:32.879218169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 3 12:44:32.881189 containerd[1519]: time="2026-03-03T12:44:32.881083808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 3 12:44:32.884785 containerd[1519]: time="2026-03-03T12:44:32.884619401Z" level=info msg="CreateContainer within sandbox \"b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 3 12:44:32.896528 containerd[1519]: time="2026-03-03T12:44:32.896348446Z" level=info msg="Container 069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:32.907929 containerd[1519]: time="2026-03-03T12:44:32.907888087Z" level=info msg="CreateContainer within sandbox \"b4b051b6a981b4ea1a7901f48d2dec95ba966f3e879dd9cf08f6dc4f39a48045\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203\"" Mar 3 12:44:32.910193 containerd[1519]: time="2026-03-03T12:44:32.910147934Z" level=info msg="StartContainer for 
\"069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203\"" Mar 3 12:44:32.912385 containerd[1519]: time="2026-03-03T12:44:32.912338940Z" level=info msg="connecting to shim 069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203" address="unix:///run/containerd/s/c61e4ddbcb2bdb3198e4d45ee999c5de71626bcf9b986a8fe663568978df085d" protocol=ttrpc version=3 Mar 3 12:44:32.940314 systemd[1]: Started cri-containerd-069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203.scope - libcontainer container 069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203. Mar 3 12:44:32.995277 containerd[1519]: time="2026-03-03T12:44:32.994760739Z" level=info msg="StartContainer for \"069f4570ddacb7cd4cc4668a87425d1229a54328ac94d3efab8fffd735908203\" returns successfully" Mar 3 12:44:33.282842 containerd[1519]: time="2026-03-03T12:44:33.282318112Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:33.284182 containerd[1519]: time="2026-03-03T12:44:33.284150631Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 3 12:44:33.289589 containerd[1519]: time="2026-03-03T12:44:33.289547707Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 407.339276ms" Mar 3 12:44:33.289848 containerd[1519]: time="2026-03-03T12:44:33.289766552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 3 12:44:33.290728 containerd[1519]: time="2026-03-03T12:44:33.290600690Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 3 12:44:33.295070 containerd[1519]: time="2026-03-03T12:44:33.294695538Z" level=info msg="CreateContainer within sandbox \"97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 3 12:44:33.311142 kubelet[2772]: I0303 12:44:33.309474 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-f8cd6759c-9wqps" podStartSLOduration=34.088765528 podStartE2EDuration="37.309455415s" podCreationTimestamp="2026-03-03 12:43:56 +0000 UTC" firstStartedPulling="2026-03-03 12:44:29.659719947 +0000 UTC m=+52.824653106" lastFinishedPulling="2026-03-03 12:44:32.880409834 +0000 UTC m=+56.045342993" observedRunningTime="2026-03-03 12:44:33.304352785 +0000 UTC m=+56.469285904" watchObservedRunningTime="2026-03-03 12:44:33.309455415 +0000 UTC m=+56.474388614" Mar 3 12:44:33.311566 containerd[1519]: time="2026-03-03T12:44:33.309663419Z" level=info msg="Container fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:33.328396 containerd[1519]: time="2026-03-03T12:44:33.328002533Z" level=info msg="CreateContainer within sandbox \"97728d2605b9e5beac3c926d8b76b44ca66e67d4519dc11ed8fe75ea47885ad7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae\"" Mar 3 12:44:33.329788 containerd[1519]: time="2026-03-03T12:44:33.329753891Z" level=info msg="StartContainer for \"fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae\"" Mar 3 12:44:33.333029 containerd[1519]: time="2026-03-03T12:44:33.332986480Z" level=info msg="connecting to shim fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae" address="unix:///run/containerd/s/cf239178012c918482d5476a36755d79de40fd2254b8a6de3577dbd1d6fdc3bf" protocol=ttrpc version=3 
Mar 3 12:44:33.360370 systemd[1]: Started cri-containerd-fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae.scope - libcontainer container fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae. Mar 3 12:44:33.411012 containerd[1519]: time="2026-03-03T12:44:33.410943635Z" level=info msg="StartContainer for \"fee1da7dac2d3361b410356fca8b80a46b48958dbf26020f1856ee8b94bf7bae\" returns successfully" Mar 3 12:44:34.117288 systemd-networkd[1422]: cali43f4a5558fc: Gained IPv6LL Mar 3 12:44:34.294170 kubelet[2772]: I0303 12:44:34.293558 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 12:44:35.296276 kubelet[2772]: I0303 12:44:35.296219 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 12:44:36.853879 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1025168941.mount: Deactivated successfully. Mar 3 12:44:37.291844 containerd[1519]: time="2026-03-03T12:44:37.290708677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:37.292317 containerd[1519]: time="2026-03-03T12:44:37.292038188Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 3 12:44:37.293345 containerd[1519]: time="2026-03-03T12:44:37.292737965Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:37.295610 containerd[1519]: time="2026-03-03T12:44:37.295573552Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:37.296578 containerd[1519]: time="2026-03-03T12:44:37.296528335Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 4.005896565s" Mar 3 12:44:37.296697 containerd[1519]: time="2026-03-03T12:44:37.296680338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 3 12:44:37.298870 containerd[1519]: time="2026-03-03T12:44:37.298838430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 3 12:44:37.301916 containerd[1519]: time="2026-03-03T12:44:37.301883622Z" level=info msg="CreateContainer within sandbox \"9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 3 12:44:37.316505 containerd[1519]: time="2026-03-03T12:44:37.316425287Z" level=info msg="Container 0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:37.322714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2933734127.mount: Deactivated successfully. 
Mar 3 12:44:37.331491 containerd[1519]: time="2026-03-03T12:44:37.331167517Z" level=info msg="CreateContainer within sandbox \"9ad16ef4ec60ca9de3cd327dc1624f9841b6909f7c4ff25aa8266f086046c317\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23\"" Mar 3 12:44:37.332442 containerd[1519]: time="2026-03-03T12:44:37.332346425Z" level=info msg="StartContainer for \"0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23\"" Mar 3 12:44:37.334823 containerd[1519]: time="2026-03-03T12:44:37.334745202Z" level=info msg="connecting to shim 0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23" address="unix:///run/containerd/s/e5787cbaa399ff59b7f4a90cf4ad0458aaa471a38d9a9c3b7be7e079a04ccfa9" protocol=ttrpc version=3 Mar 3 12:44:37.363302 systemd[1]: Started cri-containerd-0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23.scope - libcontainer container 0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23. 
Mar 3 12:44:37.410176 containerd[1519]: time="2026-03-03T12:44:37.410121072Z" level=info msg="StartContainer for \"0e83680fe991c91719426a259b4b3ebbf0acadd018093784f0156724a8c17d23\" returns successfully" Mar 3 12:44:38.332569 kubelet[2772]: I0303 12:44:38.331031 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-ldzpx" podStartSLOduration=36.320574597 podStartE2EDuration="42.331016514s" podCreationTimestamp="2026-03-03 12:43:56 +0000 UTC" firstStartedPulling="2026-03-03 12:44:31.287277046 +0000 UTC m=+54.452210205" lastFinishedPulling="2026-03-03 12:44:37.297719003 +0000 UTC m=+60.462652122" observedRunningTime="2026-03-03 12:44:38.330326658 +0000 UTC m=+61.495259817" watchObservedRunningTime="2026-03-03 12:44:38.331016514 +0000 UTC m=+61.495949673" Mar 3 12:44:38.332569 kubelet[2772]: I0303 12:44:38.331918 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-f8cd6759c-wdvrv" podStartSLOduration=38.732772714 podStartE2EDuration="42.331905856s" podCreationTimestamp="2026-03-03 12:43:56 +0000 UTC" firstStartedPulling="2026-03-03 12:44:29.691340825 +0000 UTC m=+52.856273944" lastFinishedPulling="2026-03-03 12:44:33.290473927 +0000 UTC m=+56.455407086" observedRunningTime="2026-03-03 12:44:34.307620876 +0000 UTC m=+57.472554035" watchObservedRunningTime="2026-03-03 12:44:38.331905856 +0000 UTC m=+61.496839015" Mar 3 12:44:40.423137 containerd[1519]: time="2026-03-03T12:44:40.422202919Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:40.424456 containerd[1519]: time="2026-03-03T12:44:40.424425655Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 3 12:44:40.425203 containerd[1519]: time="2026-03-03T12:44:40.425179354Z" level=info msg="ImageCreate event 
name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:40.428676 containerd[1519]: time="2026-03-03T12:44:40.428648082Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 12:44:40.429233 containerd[1519]: time="2026-03-03T12:44:40.429180815Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.130208302s" Mar 3 12:44:40.429329 containerd[1519]: time="2026-03-03T12:44:40.429313419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 3 12:44:40.448813 containerd[1519]: time="2026-03-03T12:44:40.448778430Z" level=info msg="CreateContainer within sandbox \"f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 3 12:44:40.457471 containerd[1519]: time="2026-03-03T12:44:40.457317486Z" level=info msg="Container a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f: CDI devices from CRI Config.CDIDevices: []" Mar 3 12:44:40.468420 containerd[1519]: time="2026-03-03T12:44:40.468298964Z" level=info msg="CreateContainer within sandbox \"f09c2ac8eacc671e778686e5c1cb09c606345390685d42eddf7fead3cb896ddd\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f\"" 
Mar 3 12:44:40.469309 containerd[1519]: time="2026-03-03T12:44:40.469256428Z" level=info msg="StartContainer for \"a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f\"" Mar 3 12:44:40.470926 containerd[1519]: time="2026-03-03T12:44:40.470824547Z" level=info msg="connecting to shim a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f" address="unix:///run/containerd/s/f2420c969f91c13fac0f07d98ef8f56534b81a1c19f9cbbb0a390a1be6ceb08f" protocol=ttrpc version=3 Mar 3 12:44:40.496459 systemd[1]: Started cri-containerd-a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f.scope - libcontainer container a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f. Mar 3 12:44:40.540660 containerd[1519]: time="2026-03-03T12:44:40.540602791Z" level=info msg="StartContainer for \"a7ee5689b80bcf2adeef85848c258d89a1359722e20a0d6c26acbc1bb57f6c4f\" returns successfully" Mar 3 12:44:41.343814 kubelet[2772]: I0303 12:44:41.343526 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-569b58c775-xc222" podStartSLOduration=34.282119754 podStartE2EDuration="42.343486363s" podCreationTimestamp="2026-03-03 12:43:59 +0000 UTC" firstStartedPulling="2026-03-03 12:44:32.370520315 +0000 UTC m=+55.535453474" lastFinishedPulling="2026-03-03 12:44:40.431886924 +0000 UTC m=+63.596820083" observedRunningTime="2026-03-03 12:44:41.342044246 +0000 UTC m=+64.506977405" watchObservedRunningTime="2026-03-03 12:44:41.343486363 +0000 UTC m=+64.508419522" Mar 3 12:44:52.972364 kubelet[2772]: I0303 12:44:52.971747 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 12:44:55.485635 kubelet[2772]: I0303 12:44:55.485313 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 12:44:56.632308 systemd[1]: Started sshd@7-78.47.249.221:22-46.101.207.52:33286.service - OpenSSH per-connection server daemon (46.101.207.52:33286). 
Mar 3 12:44:56.718881 sshd[5452]: Connection closed by 46.101.207.52 port 33286 Mar 3 12:44:56.721877 systemd[1]: sshd@7-78.47.249.221:22-46.101.207.52:33286.service: Deactivated successfully. Mar 3 12:46:06.156378 systemd[1]: Started sshd@8-78.47.249.221:22-20.161.92.111:44676.service - OpenSSH per-connection server daemon (20.161.92.111:44676). Mar 3 12:46:06.690155 sshd[5726]: Accepted publickey for core from 20.161.92.111 port 44676 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:06.692031 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:06.698345 systemd-logind[1493]: New session 8 of user core. Mar 3 12:46:06.703802 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 3 12:46:07.069215 sshd[5729]: Connection closed by 20.161.92.111 port 44676 Mar 3 12:46:07.070271 sshd-session[5726]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:07.076974 systemd-logind[1493]: Session 8 logged out. Waiting for processes to exit. Mar 3 12:46:07.077548 systemd[1]: sshd@8-78.47.249.221:22-20.161.92.111:44676.service: Deactivated successfully. Mar 3 12:46:07.081482 systemd[1]: session-8.scope: Deactivated successfully. Mar 3 12:46:07.084570 systemd-logind[1493]: Removed session 8. Mar 3 12:46:12.180188 systemd[1]: Started sshd@9-78.47.249.221:22-20.161.92.111:60478.service - OpenSSH per-connection server daemon (20.161.92.111:60478). Mar 3 12:46:12.728423 sshd[5808]: Accepted publickey for core from 20.161.92.111 port 60478 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:12.731510 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:12.738599 systemd-logind[1493]: New session 9 of user core. Mar 3 12:46:12.745513 systemd[1]: Started session-9.scope - Session 9 of User core. 
Mar 3 12:46:13.111744 sshd[5811]: Connection closed by 20.161.92.111 port 60478 Mar 3 12:46:13.113399 sshd-session[5808]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:13.119013 systemd[1]: sshd@9-78.47.249.221:22-20.161.92.111:60478.service: Deactivated successfully. Mar 3 12:46:13.121377 systemd[1]: session-9.scope: Deactivated successfully. Mar 3 12:46:13.122370 systemd-logind[1493]: Session 9 logged out. Waiting for processes to exit. Mar 3 12:46:13.124286 systemd-logind[1493]: Removed session 9. Mar 3 12:46:18.218639 systemd[1]: Started sshd@10-78.47.249.221:22-20.161.92.111:60494.service - OpenSSH per-connection server daemon (20.161.92.111:60494). Mar 3 12:46:18.756224 sshd[5850]: Accepted publickey for core from 20.161.92.111 port 60494 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:18.758414 sshd-session[5850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:18.765176 systemd-logind[1493]: New session 10 of user core. Mar 3 12:46:18.767271 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 3 12:46:19.136375 sshd[5853]: Connection closed by 20.161.92.111 port 60494 Mar 3 12:46:19.137306 sshd-session[5850]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:19.143497 systemd-logind[1493]: Session 10 logged out. Waiting for processes to exit. Mar 3 12:46:19.144075 systemd[1]: sshd@10-78.47.249.221:22-20.161.92.111:60494.service: Deactivated successfully. Mar 3 12:46:19.147748 systemd[1]: session-10.scope: Deactivated successfully. Mar 3 12:46:19.150735 systemd-logind[1493]: Removed session 10. Mar 3 12:46:24.251997 systemd[1]: Started sshd@11-78.47.249.221:22-20.161.92.111:35328.service - OpenSSH per-connection server daemon (20.161.92.111:35328). 
Mar 3 12:46:24.790276 sshd[5866]: Accepted publickey for core from 20.161.92.111 port 35328 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:24.791573 sshd-session[5866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:24.807774 systemd-logind[1493]: New session 11 of user core. Mar 3 12:46:24.818552 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 3 12:46:25.166259 sshd[5869]: Connection closed by 20.161.92.111 port 35328 Mar 3 12:46:25.166775 sshd-session[5866]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:25.171844 systemd[1]: sshd@11-78.47.249.221:22-20.161.92.111:35328.service: Deactivated successfully. Mar 3 12:46:25.172574 systemd-logind[1493]: Session 11 logged out. Waiting for processes to exit. Mar 3 12:46:25.174898 systemd[1]: session-11.scope: Deactivated successfully. Mar 3 12:46:25.177396 systemd-logind[1493]: Removed session 11. Mar 3 12:46:30.275351 systemd[1]: Started sshd@12-78.47.249.221:22-20.161.92.111:50170.service - OpenSSH per-connection server daemon (20.161.92.111:50170). Mar 3 12:46:30.815185 sshd[5902]: Accepted publickey for core from 20.161.92.111 port 50170 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:30.817097 sshd-session[5902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:30.822344 systemd-logind[1493]: New session 12 of user core. Mar 3 12:46:30.829382 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 3 12:46:31.192837 sshd[5905]: Connection closed by 20.161.92.111 port 50170 Mar 3 12:46:31.194426 sshd-session[5902]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:31.199807 systemd[1]: sshd@12-78.47.249.221:22-20.161.92.111:50170.service: Deactivated successfully. Mar 3 12:46:31.202564 systemd[1]: session-12.scope: Deactivated successfully. Mar 3 12:46:31.205562 systemd-logind[1493]: Session 12 logged out. 
Waiting for processes to exit. Mar 3 12:46:31.208053 systemd-logind[1493]: Removed session 12. Mar 3 12:46:31.299011 systemd[1]: Started sshd@13-78.47.249.221:22-20.161.92.111:50178.service - OpenSSH per-connection server daemon (20.161.92.111:50178). Mar 3 12:46:31.833181 sshd[5942]: Accepted publickey for core from 20.161.92.111 port 50178 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:31.834894 sshd-session[5942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:31.840894 systemd-logind[1493]: New session 13 of user core. Mar 3 12:46:31.850486 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 3 12:46:32.248701 sshd[5945]: Connection closed by 20.161.92.111 port 50178 Mar 3 12:46:32.249199 sshd-session[5942]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:32.255161 systemd-logind[1493]: Session 13 logged out. Waiting for processes to exit. Mar 3 12:46:32.255772 systemd[1]: sshd@13-78.47.249.221:22-20.161.92.111:50178.service: Deactivated successfully. Mar 3 12:46:32.258767 systemd[1]: session-13.scope: Deactivated successfully. Mar 3 12:46:32.261391 systemd-logind[1493]: Removed session 13. Mar 3 12:46:32.365135 systemd[1]: Started sshd@14-78.47.249.221:22-20.161.92.111:50194.service - OpenSSH per-connection server daemon (20.161.92.111:50194). Mar 3 12:46:32.893332 sshd[5954]: Accepted publickey for core from 20.161.92.111 port 50194 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:32.895802 sshd-session[5954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:32.904714 systemd-logind[1493]: New session 14 of user core. Mar 3 12:46:32.910309 systemd[1]: Started session-14.scope - Session 14 of User core. 
Mar 3 12:46:33.253261 sshd[5957]: Connection closed by 20.161.92.111 port 50194 Mar 3 12:46:33.253559 sshd-session[5954]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:33.260737 systemd[1]: sshd@14-78.47.249.221:22-20.161.92.111:50194.service: Deactivated successfully. Mar 3 12:46:33.265384 systemd[1]: session-14.scope: Deactivated successfully. Mar 3 12:46:33.268087 systemd-logind[1493]: Session 14 logged out. Waiting for processes to exit. Mar 3 12:46:33.270133 systemd-logind[1493]: Removed session 14. Mar 3 12:46:38.361522 systemd[1]: Started sshd@15-78.47.249.221:22-20.161.92.111:50206.service - OpenSSH per-connection server daemon (20.161.92.111:50206). Mar 3 12:46:38.886266 sshd[5971]: Accepted publickey for core from 20.161.92.111 port 50206 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:38.888821 sshd-session[5971]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:38.897437 systemd-logind[1493]: New session 15 of user core. Mar 3 12:46:38.903335 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 3 12:46:39.254905 sshd[5974]: Connection closed by 20.161.92.111 port 50206 Mar 3 12:46:39.254668 sshd-session[5971]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:39.262582 systemd[1]: sshd@15-78.47.249.221:22-20.161.92.111:50206.service: Deactivated successfully. Mar 3 12:46:39.265743 systemd[1]: session-15.scope: Deactivated successfully. Mar 3 12:46:39.268872 systemd-logind[1493]: Session 15 logged out. Waiting for processes to exit. Mar 3 12:46:39.270417 systemd-logind[1493]: Removed session 15. Mar 3 12:46:39.358152 systemd[1]: Started sshd@16-78.47.249.221:22-20.161.92.111:50210.service - OpenSSH per-connection server daemon (20.161.92.111:50210). 
Mar 3 12:46:39.879218 sshd[6009]: Accepted publickey for core from 20.161.92.111 port 50210 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:39.880948 sshd-session[6009]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:39.886095 systemd-logind[1493]: New session 16 of user core. Mar 3 12:46:39.892396 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 3 12:46:40.427307 sshd[6012]: Connection closed by 20.161.92.111 port 50210 Mar 3 12:46:40.428944 sshd-session[6009]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:40.435208 systemd[1]: sshd@16-78.47.249.221:22-20.161.92.111:50210.service: Deactivated successfully. Mar 3 12:46:40.438471 systemd[1]: session-16.scope: Deactivated successfully. Mar 3 12:46:40.440297 systemd-logind[1493]: Session 16 logged out. Waiting for processes to exit. Mar 3 12:46:40.442363 systemd-logind[1493]: Removed session 16. Mar 3 12:46:40.540329 systemd[1]: Started sshd@17-78.47.249.221:22-20.161.92.111:37184.service - OpenSSH per-connection server daemon (20.161.92.111:37184). Mar 3 12:46:41.086166 sshd[6022]: Accepted publickey for core from 20.161.92.111 port 37184 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:41.088316 sshd-session[6022]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:41.096645 systemd-logind[1493]: New session 17 of user core. Mar 3 12:46:41.100507 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 3 12:46:42.082534 sshd[6025]: Connection closed by 20.161.92.111 port 37184 Mar 3 12:46:42.081969 sshd-session[6022]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:42.090067 systemd[1]: sshd@17-78.47.249.221:22-20.161.92.111:37184.service: Deactivated successfully. Mar 3 12:46:42.092851 systemd[1]: session-17.scope: Deactivated successfully. Mar 3 12:46:42.094089 systemd-logind[1493]: Session 17 logged out. 
Waiting for processes to exit. Mar 3 12:46:42.097424 systemd-logind[1493]: Removed session 17. Mar 3 12:46:42.195216 systemd[1]: Started sshd@18-78.47.249.221:22-20.161.92.111:37198.service - OpenSSH per-connection server daemon (20.161.92.111:37198). Mar 3 12:46:42.743165 sshd[6069]: Accepted publickey for core from 20.161.92.111 port 37198 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:42.744635 sshd-session[6069]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:42.749617 systemd-logind[1493]: New session 18 of user core. Mar 3 12:46:42.759389 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 3 12:46:43.255056 sshd[6072]: Connection closed by 20.161.92.111 port 37198 Mar 3 12:46:43.257543 sshd-session[6069]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:43.265511 systemd[1]: sshd@18-78.47.249.221:22-20.161.92.111:37198.service: Deactivated successfully. Mar 3 12:46:43.269816 systemd[1]: session-18.scope: Deactivated successfully. Mar 3 12:46:43.272185 systemd-logind[1493]: Session 18 logged out. Waiting for processes to exit. Mar 3 12:46:43.275484 systemd-logind[1493]: Removed session 18. Mar 3 12:46:43.369336 systemd[1]: Started sshd@19-78.47.249.221:22-20.161.92.111:37214.service - OpenSSH per-connection server daemon (20.161.92.111:37214). Mar 3 12:46:43.909446 sshd[6086]: Accepted publickey for core from 20.161.92.111 port 37214 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:43.911801 sshd-session[6086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:43.917143 systemd-logind[1493]: New session 19 of user core. Mar 3 12:46:43.924357 systemd[1]: Started session-19.scope - Session 19 of User core. 
Mar 3 12:46:44.282314 sshd[6089]: Connection closed by 20.161.92.111 port 37214 Mar 3 12:46:44.282749 sshd-session[6086]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:44.288417 systemd[1]: sshd@19-78.47.249.221:22-20.161.92.111:37214.service: Deactivated successfully. Mar 3 12:46:44.291719 systemd[1]: session-19.scope: Deactivated successfully. Mar 3 12:46:44.293680 systemd-logind[1493]: Session 19 logged out. Waiting for processes to exit. Mar 3 12:46:44.295324 systemd-logind[1493]: Removed session 19. Mar 3 12:46:49.391365 systemd[1]: Started sshd@20-78.47.249.221:22-20.161.92.111:37216.service - OpenSSH per-connection server daemon (20.161.92.111:37216). Mar 3 12:46:49.916374 sshd[6127]: Accepted publickey for core from 20.161.92.111 port 37216 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:49.918820 sshd-session[6127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:49.926044 systemd-logind[1493]: New session 20 of user core. Mar 3 12:46:49.938534 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 3 12:46:50.282480 sshd[6130]: Connection closed by 20.161.92.111 port 37216 Mar 3 12:46:50.283660 sshd-session[6127]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:50.290773 systemd[1]: sshd@20-78.47.249.221:22-20.161.92.111:37216.service: Deactivated successfully. Mar 3 12:46:50.295768 systemd[1]: session-20.scope: Deactivated successfully. Mar 3 12:46:50.297233 systemd-logind[1493]: Session 20 logged out. Waiting for processes to exit. Mar 3 12:46:50.299258 systemd-logind[1493]: Removed session 20. Mar 3 12:46:55.391998 systemd[1]: Started sshd@21-78.47.249.221:22-20.161.92.111:57334.service - OpenSSH per-connection server daemon (20.161.92.111:57334). 
Mar 3 12:46:55.918181 sshd[6142]: Accepted publickey for core from 20.161.92.111 port 57334 ssh2: RSA SHA256:mNzaZZaspe4Fxf+jS07F0XWJMvP4QDyeKLuiSZXmzwQ Mar 3 12:46:55.920286 sshd-session[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 12:46:55.929848 systemd-logind[1493]: New session 21 of user core. Mar 3 12:46:55.938469 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 3 12:46:56.277129 sshd[6145]: Connection closed by 20.161.92.111 port 57334 Mar 3 12:46:56.278189 sshd-session[6142]: pam_unix(sshd:session): session closed for user core Mar 3 12:46:56.284510 systemd[1]: sshd@21-78.47.249.221:22-20.161.92.111:57334.service: Deactivated successfully. Mar 3 12:46:56.287608 systemd[1]: session-21.scope: Deactivated successfully. Mar 3 12:46:56.289753 systemd-logind[1493]: Session 21 logged out. Waiting for processes to exit. Mar 3 12:46:56.291701 systemd-logind[1493]: Removed session 21. Mar 3 12:47:12.666551 systemd[1]: cri-containerd-7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a.scope: Deactivated successfully. Mar 3 12:47:12.666863 systemd[1]: cri-containerd-7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a.scope: Consumed 15.418s CPU time, 133.1M memory peak, 4.5M read from disk. Mar 3 12:47:12.675414 containerd[1519]: time="2026-03-03T12:47:12.669269261Z" level=info msg="received container exit event container_id:\"7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a\" id:\"7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a\" pid:3101 exit_status:1 exited_at:{seconds:1772542032 nanos:667990892}" Mar 3 12:47:12.697549 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a-rootfs.mount: Deactivated successfully. 
Mar 3 12:47:12.869759 kubelet[2772]: I0303 12:47:12.869715 2772 scope.go:117] "RemoveContainer" containerID="7eaee0916c58c80a2cef2c2d003ac6584876847125bbdb69ae8def63760c1f8a"
Mar 3 12:47:12.873094 containerd[1519]: time="2026-03-03T12:47:12.873058366Z" level=info msg="CreateContainer within sandbox \"c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 3 12:47:12.883872 containerd[1519]: time="2026-03-03T12:47:12.883200521Z" level=info msg="Container 02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e: CDI devices from CRI Config.CDIDevices: []"
Mar 3 12:47:12.892844 containerd[1519]: time="2026-03-03T12:47:12.892798432Z" level=info msg="CreateContainer within sandbox \"c3efc65b5c07e4599df4090789450b0ba2350d89d2172da1b6d23b075d8855c6\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e\""
Mar 3 12:47:12.893461 containerd[1519]: time="2026-03-03T12:47:12.893429636Z" level=info msg="StartContainer for \"02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e\""
Mar 3 12:47:12.894520 containerd[1519]: time="2026-03-03T12:47:12.894486324Z" level=info msg="connecting to shim 02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e" address="unix:///run/containerd/s/71587103546140843c8514e03a1a174366315456d97ad834502fe29c05a99de6" protocol=ttrpc version=3
Mar 3 12:47:12.917310 systemd[1]: Started cri-containerd-02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e.scope - libcontainer container 02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e.
Mar 3 12:47:12.939827 systemd[1]: cri-containerd-073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220.scope: Deactivated successfully.
Mar 3 12:47:12.940269 systemd[1]: cri-containerd-073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220.scope: Consumed 4.548s CPU time, 65M memory peak, 2.2M read from disk.
Mar 3 12:47:12.942722 containerd[1519]: time="2026-03-03T12:47:12.942677400Z" level=info msg="received container exit event container_id:\"073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220\" id:\"073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220\" pid:2606 exit_status:1 exited_at:{seconds:1772542032 nanos:942397158}"
Mar 3 12:47:12.994155 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220-rootfs.mount: Deactivated successfully.
Mar 3 12:47:13.000311 containerd[1519]: time="2026-03-03T12:47:13.000184505Z" level=info msg="StartContainer for \"02b784b987f1ba9f253a38d165a38210135f43c652f23db52de6f398b9fcbf4e\" returns successfully"
Mar 3 12:47:13.093475 kubelet[2772]: E0303 12:47:13.093094 2772 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38270->10.0.0.2:2379: read: connection timed out"
Mar 3 12:47:13.871219 kubelet[2772]: I0303 12:47:13.871177 2772 scope.go:117] "RemoveContainer" containerID="073b827cc78cd5c19130823288f5711380ad764178e58d665dcaea37865c4220"
Mar 3 12:47:13.875012 containerd[1519]: time="2026-03-03T12:47:13.874964674Z" level=info msg="CreateContainer within sandbox \"83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 3 12:47:13.887140 containerd[1519]: time="2026-03-03T12:47:13.887070168Z" level=info msg="Container 148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36: CDI devices from CRI Config.CDIDevices: []"
Mar 3 12:47:13.900035 containerd[1519]: time="2026-03-03T12:47:13.899847068Z" level=info msg="CreateContainer within sandbox \"83d329c0f982f51741ecc33e9fa5e3161e34beeb9054fe0c2bf166caab3112d7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36\""
Mar 3 12:47:13.900965 containerd[1519]: time="2026-03-03T12:47:13.900891596Z" level=info msg="StartContainer for \"148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36\""
Mar 3 12:47:13.902098 containerd[1519]: time="2026-03-03T12:47:13.902058685Z" level=info msg="connecting to shim 148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36" address="unix:///run/containerd/s/df42b206f8201e7ae8ca83c9347846ee090bfbf9b2ba02326c061ff64c901682" protocol=ttrpc version=3
Mar 3 12:47:13.927308 systemd[1]: Started cri-containerd-148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36.scope - libcontainer container 148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36.
Mar 3 12:47:13.975031 containerd[1519]: time="2026-03-03T12:47:13.974983653Z" level=info msg="StartContainer for \"148622d809f2ef4b941bc87191b8df4f9130a97877e32189b7cb501bfa3e8a36\" returns successfully"