Mar 12 23:41:49.798914 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 12 23:41:49.798937 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Mar 12 22:07:21 -00 2026
Mar 12 23:41:49.798948 kernel: KASLR enabled
Mar 12 23:41:49.798955 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 12 23:41:49.798960 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Mar 12 23:41:49.798966 kernel: random: crng init done
Mar 12 23:41:49.798972 kernel: secureboot: Secure boot disabled
Mar 12 23:41:49.798978 kernel: ACPI: Early table checksum verification disabled
Mar 12 23:41:49.798984 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Mar 12 23:41:49.798990 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Mar 12 23:41:49.798997 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799003 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799008 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799014 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799021 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799029 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799035 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799041 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799047 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:41:49.799053 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 12 23:41:49.799059 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 12 23:41:49.799065 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 12 23:41:49.799071 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Mar 12 23:41:49.799078 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Mar 12 23:41:49.799084 kernel: Zone ranges:
Mar 12 23:41:49.799089 kernel:   DMA      [mem 0x0000000040000000-0x00000000ffffffff]
Mar 12 23:41:49.799097 kernel:   DMA32    empty
Mar 12 23:41:49.799103 kernel:   Normal   [mem 0x0000000100000000-0x0000000139ffffff]
Mar 12 23:41:49.799109 kernel:   Device   empty
Mar 12 23:41:49.799115 kernel: Movable zone start for each node
Mar 12 23:41:49.799121 kernel: Early memory node ranges
Mar 12 23:41:49.799127 kernel:   node   0: [mem 0x0000000040000000-0x000000013666ffff]
Mar 12 23:41:49.799133 kernel:   node   0: [mem 0x0000000136670000-0x000000013667ffff]
Mar 12 23:41:49.799139 kernel:   node   0: [mem 0x0000000136680000-0x000000013676ffff]
Mar 12 23:41:49.799145 kernel:   node   0: [mem 0x0000000136770000-0x0000000136b3ffff]
Mar 12 23:41:49.799151 kernel:   node   0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Mar 12 23:41:49.799157 kernel:   node   0: [mem 0x0000000139e20000-0x0000000139eaffff]
Mar 12 23:41:49.799163 kernel:   node   0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Mar 12 23:41:49.799170 kernel:   node   0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Mar 12 23:41:49.799177 kernel:   node   0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Mar 12 23:41:49.799186 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Mar 12 23:41:49.799193 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 12 23:41:49.799199 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Mar 12 23:41:49.799207 kernel: psci: probing for conduit method from ACPI.
Mar 12 23:41:49.799213 kernel: psci: PSCIv1.1 detected in firmware.
Mar 12 23:41:49.799219 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 12 23:41:49.799227 kernel: psci: Trusted OS migration not required
Mar 12 23:41:49.799233 kernel: psci: SMC Calling Convention v1.1
Mar 12 23:41:49.799239 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 12 23:41:49.799246 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 12 23:41:49.799252 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 12 23:41:49.799259 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 12 23:41:49.799266 kernel: Detected PIPT I-cache on CPU0
Mar 12 23:41:49.799273 kernel: CPU features: detected: GIC system register CPU interface
Mar 12 23:41:49.799280 kernel: CPU features: detected: Spectre-v4
Mar 12 23:41:49.799287 kernel: CPU features: detected: Spectre-BHB
Mar 12 23:41:49.799293 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 12 23:41:49.799300 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 12 23:41:49.799306 kernel: CPU features: detected: ARM erratum 1418040
Mar 12 23:41:49.799313 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 12 23:41:49.799319 kernel: alternatives: applying boot alternatives
Mar 12 23:41:49.799327 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d
Mar 12 23:41:49.799334 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 12 23:41:49.799340 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 12 23:41:49.799347 kernel: Fallback order for Node 0: 0
Mar 12 23:41:49.799354 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 1024000
Mar 12 23:41:49.799361 kernel: Policy zone: Normal
Mar 12 23:41:49.799367 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 12 23:41:49.799374 kernel: software IO TLB: area num 2.
Mar 12 23:41:49.799380 kernel: software IO TLB: mapped [mem 0x00000000f5000000-0x00000000f9000000] (64MB)
Mar 12 23:41:49.799387 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 12 23:41:49.799394 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 12 23:41:49.799401 kernel: rcu: 	RCU event tracing is enabled.
Mar 12 23:41:49.799408 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 12 23:41:49.799415 kernel: 	Trampoline variant of Tasks RCU enabled.
Mar 12 23:41:49.799421 kernel: 	Tracing variant of Tasks RCU enabled.
Mar 12 23:41:49.799428 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 12 23:41:49.799436 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 12 23:41:49.799443 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 12 23:41:49.799449 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 12 23:41:49.799456 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 12 23:41:49.799462 kernel: GICv3: 256 SPIs implemented
Mar 12 23:41:49.799469 kernel: GICv3: 0 Extended SPIs implemented
Mar 12 23:41:49.799475 kernel: Root IRQ handler: gic_handle_irq
Mar 12 23:41:49.799482 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 12 23:41:49.799488 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Mar 12 23:41:49.799495 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 12 23:41:49.799501 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 12 23:41:49.799510 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Mar 12 23:41:49.799516 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Mar 12 23:41:49.799524 kernel: GICv3: using LPI property table @0x0000000100120000
Mar 12 23:41:49.799531 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Mar 12 23:41:49.799539 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 12 23:41:49.799545 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:41:49.799552 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 12 23:41:49.799558 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 12 23:41:49.799565 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 12 23:41:49.799572 kernel: Console: colour dummy device 80x25
Mar 12 23:41:49.799579 kernel: ACPI: Core revision 20240827
Mar 12 23:41:49.799587 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 12 23:41:49.799594 kernel: pid_max: default: 32768 minimum: 301
Mar 12 23:41:49.799601 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 12 23:41:49.799607 kernel: landlock: Up and running.
Mar 12 23:41:49.799650 kernel: SELinux: Initializing.
Mar 12 23:41:49.799658 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 12 23:41:49.799665 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 12 23:41:49.799672 kernel: rcu: Hierarchical SRCU implementation.
Mar 12 23:41:49.799679 kernel: rcu: 	Max phase no-delay instances is 400.
Mar 12 23:41:49.799689 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 12 23:41:49.799696 kernel: Remapping and enabling EFI services.
Mar 12 23:41:49.799705 kernel: smp: Bringing up secondary CPUs ...
Mar 12 23:41:49.799713 kernel: Detected PIPT I-cache on CPU1
Mar 12 23:41:49.799721 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 12 23:41:49.799728 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Mar 12 23:41:49.799736 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:41:49.799743 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 12 23:41:49.799750 kernel: smp: Brought up 1 node, 2 CPUs
Mar 12 23:41:49.799758 kernel: SMP: Total of 2 processors activated.
Mar 12 23:41:49.799769 kernel: CPU: All CPU(s) started at EL1
Mar 12 23:41:49.799777 kernel: CPU features: detected: 32-bit EL0 Support
Mar 12 23:41:49.799786 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 12 23:41:49.799793 kernel: CPU features: detected: Common not Private translations
Mar 12 23:41:49.799800 kernel: CPU features: detected: CRC32 instructions
Mar 12 23:41:49.799807 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 12 23:41:49.799827 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 12 23:41:49.799837 kernel: CPU features: detected: LSE atomic instructions
Mar 12 23:41:49.799844 kernel: CPU features: detected: Privileged Access Never
Mar 12 23:41:49.799851 kernel: CPU features: detected: RAS Extension Support
Mar 12 23:41:49.799858 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 12 23:41:49.799866 kernel: alternatives: applying system-wide alternatives
Mar 12 23:41:49.799873 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Mar 12 23:41:49.799880 kernel: Memory: 3858852K/4096000K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 215668K reserved, 16384K cma-reserved)
Mar 12 23:41:49.799887 kernel: devtmpfs: initialized
Mar 12 23:41:49.799894 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 12 23:41:49.799903 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 12 23:41:49.799910 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 12 23:41:49.799917 kernel: 0 pages in range for non-PLT usage
Mar 12 23:41:49.799923 kernel: 508400 pages in range for PLT usage
Mar 12 23:41:49.799930 kernel: pinctrl core: initialized pinctrl subsystem
Mar 12 23:41:49.799937 kernel: SMBIOS 3.0.0 present.
Mar 12 23:41:49.799944 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Mar 12 23:41:49.799951 kernel: DMI: Memory slots populated: 1/1
Mar 12 23:41:49.799958 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 12 23:41:49.799967 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 12 23:41:49.799974 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 12 23:41:49.799981 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 12 23:41:49.799988 kernel: audit: initializing netlink subsys (disabled)
Mar 12 23:41:49.799995 kernel: audit: type=2000 audit(0.016:1): state=initialized audit_enabled=0 res=1
Mar 12 23:41:49.800002 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 12 23:41:49.800009 kernel: cpuidle: using governor menu
Mar 12 23:41:49.800016 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 12 23:41:49.800023 kernel: ASID allocator initialised with 32768 entries
Mar 12 23:41:49.800032 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 12 23:41:49.800041 kernel: Serial: AMBA PL011 UART driver
Mar 12 23:41:49.800049 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 12 23:41:49.800057 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 12 23:41:49.800064 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 12 23:41:49.800072 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 12 23:41:49.800079 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 12 23:41:49.800086 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 12 23:41:49.800093 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 12 23:41:49.800101 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 12 23:41:49.800108 kernel: ACPI: Added _OSI(Module Device)
Mar 12 23:41:49.800115 kernel: ACPI: Added _OSI(Processor Device)
Mar 12 23:41:49.800122 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 12 23:41:49.800129 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 12 23:41:49.800136 kernel: ACPI: Interpreter enabled
Mar 12 23:41:49.800143 kernel: ACPI: Using GIC for interrupt routing
Mar 12 23:41:49.800150 kernel: ACPI: MCFG table detected, 1 entries
Mar 12 23:41:49.800157 kernel: ACPI: CPU0 has been hot-added
Mar 12 23:41:49.800165 kernel: ACPI: CPU1 has been hot-added
Mar 12 23:41:49.800172 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 12 23:41:49.800179 kernel: printk: legacy console [ttyAMA0] enabled
Mar 12 23:41:49.800186 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 12 23:41:49.800342 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 12 23:41:49.800414 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 12 23:41:49.800477 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 12 23:41:49.800538 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 12 23:41:49.800595 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 12 23:41:49.800604 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 12 23:41:49.800611 kernel: PCI host bridge to bus 0000:00
Mar 12 23:41:49.800695 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 12 23:41:49.800772 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 12 23:41:49.800868 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 12 23:41:49.800925 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 12 23:41:49.801011 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Mar 12 23:41:49.801092 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Mar 12 23:41:49.801169 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Mar 12 23:41:49.801233 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Mar 12 23:41:49.801300 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.801363 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Mar 12 23:41:49.801427 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 12 23:41:49.801490 kernel: pci 0000:00:02.0:   bridge window [mem 0x11000000-0x111fffff]
Mar 12 23:41:49.801550 kernel: pci 0000:00:02.0:   bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Mar 12 23:41:49.801634 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.801700 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Mar 12 23:41:49.801760 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 12 23:41:49.801835 kernel: pci 0000:00:02.1:   bridge window [mem 0x10e00000-0x10ffffff]
Mar 12 23:41:49.801925 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.801987 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Mar 12 23:41:49.802049 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 12 23:41:49.802106 kernel: pci 0000:00:02.2:   bridge window [mem 0x10c00000-0x10dfffff]
Mar 12 23:41:49.802165 kernel: pci 0000:00:02.2:   bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Mar 12 23:41:49.802230 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.802288 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Mar 12 23:41:49.802349 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 12 23:41:49.802407 kernel: pci 0000:00:02.3:   bridge window [mem 0x10a00000-0x10bfffff]
Mar 12 23:41:49.802465 kernel: pci 0000:00:02.3:   bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Mar 12 23:41:49.802536 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.802596 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Mar 12 23:41:49.802704 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 12 23:41:49.802767 kernel: pci 0000:00:02.4:   bridge window [mem 0x10800000-0x109fffff]
Mar 12 23:41:49.802856 kernel: pci 0000:00:02.4:   bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Mar 12 23:41:49.802965 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.803029 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Mar 12 23:41:49.803088 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 12 23:41:49.803145 kernel: pci 0000:00:02.5:   bridge window [mem 0x10600000-0x107fffff]
Mar 12 23:41:49.803203 kernel: pci 0000:00:02.5:   bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Mar 12 23:41:49.803267 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.803330 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Mar 12 23:41:49.803391 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 12 23:41:49.803450 kernel: pci 0000:00:02.6:   bridge window [mem 0x10400000-0x105fffff]
Mar 12 23:41:49.803508 kernel: pci 0000:00:02.6:   bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Mar 12 23:41:49.803572 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.803648 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Mar 12 23:41:49.803714 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 12 23:41:49.803772 kernel: pci 0000:00:02.7:   bridge window [mem 0x10200000-0x103fffff]
Mar 12 23:41:49.804779 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:41:49.804884 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Mar 12 23:41:49.804945 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 12 23:41:49.805004 kernel: pci 0000:00:03.0:   bridge window [mem 0x10000000-0x101fffff]
Mar 12 23:41:49.805071 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Mar 12 23:41:49.805391 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Mar 12 23:41:49.805476 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 12 23:41:49.806899 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Mar 12 23:41:49.806995 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 12 23:41:49.807058 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 12 23:41:49.807130 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 12 23:41:49.807192 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Mar 12 23:41:49.807271 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 12 23:41:49.807334 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Mar 12 23:41:49.807395 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 12 23:41:49.807464 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 12 23:41:49.807527 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 12 23:41:49.807597 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 12 23:41:49.807724 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]
Mar 12 23:41:49.807791 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 12 23:41:49.808960 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 12 23:41:49.809037 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Mar 12 23:41:49.809099 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 12 23:41:49.809169 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 12 23:41:49.809233 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Mar 12 23:41:49.809299 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Mar 12 23:41:49.809359 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 12 23:41:49.809423 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Mar 12 23:41:49.809485 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Mar 12 23:41:49.809544 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Mar 12 23:41:49.809635 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Mar 12 23:41:49.809702 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Mar 12 23:41:49.809766 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Mar 12 23:41:49.810882 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 12 23:41:49.810970 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Mar 12 23:41:49.811032 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Mar 12 23:41:49.811096 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 12 23:41:49.811156 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Mar 12 23:41:49.811220 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Mar 12 23:41:49.811283 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 12 23:41:49.811343 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Mar 12 23:41:49.811401 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Mar 12 23:41:49.811465 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 12 23:41:49.811525 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Mar 12 23:41:49.811584 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Mar 12 23:41:49.811673 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 12 23:41:49.811737 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Mar 12 23:41:49.811795 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Mar 12 23:41:49.813947 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 12 23:41:49.814028 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Mar 12 23:41:49.814089 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Mar 12 23:41:49.814154 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 12 23:41:49.814221 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Mar 12 23:41:49.814281 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Mar 12 23:41:49.814344 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Mar 12 23:41:49.814404 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Mar 12 23:41:49.814466 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Mar 12 23:41:49.814525 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Mar 12 23:41:49.814587 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Mar 12 23:41:49.814778 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Mar 12 23:41:49.814886 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Mar 12 23:41:49.814950 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Mar 12 23:41:49.815013 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Mar 12 23:41:49.815073 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Mar 12 23:41:49.815135 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Mar 12 23:41:49.815195 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Mar 12 23:41:49.815257 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Mar 12 23:41:49.815322 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Mar 12 23:41:49.815383 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Mar 12 23:41:49.815712 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Mar 12 23:41:49.815780 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Mar 12 23:41:49.815897 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Mar 12 23:41:49.815968 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Mar 12 23:41:49.816027 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Mar 12 23:41:49.816088 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Mar 12 23:41:49.816154 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Mar 12 23:41:49.816214 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Mar 12 23:41:49.816274 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Mar 12 23:41:49.816334 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Mar 12 23:41:49.816398 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Mar 12 23:41:49.816459 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Mar 12 23:41:49.816518 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Mar 12 23:41:49.816581 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Mar 12 23:41:49.816661 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Mar 12 23:41:49.816726 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Mar 12 23:41:49.816784 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Mar 12 23:41:49.818408 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Mar 12 23:41:49.818499 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Mar 12 23:41:49.818561 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Mar 12 23:41:49.818645 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Mar 12 23:41:49.818714 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Mar 12 23:41:49.818774 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Mar 12 23:41:49.820886 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Mar 12 23:41:49.821000 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Mar 12 23:41:49.821066 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Mar 12 23:41:49.821136 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Mar 12 23:41:49.821198 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 12 23:41:49.821258 kernel: pci 0000:00:02.0:   bridge window [io 0x1000-0x1fff]
Mar 12 23:41:49.821317 kernel: pci 0000:00:02.0:   bridge window [mem 0x10000000-0x101fffff]
Mar 12 23:41:49.821377 kernel: pci 0000:00:02.0:   bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 12 23:41:49.821444 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Mar 12 23:41:49.821507 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 12 23:41:49.821569 kernel: pci 0000:00:02.1:   bridge window [io 0x2000-0x2fff]
Mar 12 23:41:49.821650 kernel: pci 0000:00:02.1:   bridge window [mem 0x10200000-0x103fffff]
Mar 12 23:41:49.821716 kernel: pci 0000:00:02.1:   bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 12 23:41:49.821783 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Mar 12 23:41:49.821910 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Mar 12 23:41:49.821977 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 12 23:41:49.822037 kernel: pci 0000:00:02.2:   bridge window [io 0x3000-0x3fff]
Mar 12 23:41:49.822101 kernel: pci 0000:00:02.2:   bridge window [mem 0x10400000-0x105fffff]
Mar 12 23:41:49.822158 kernel: pci 0000:00:02.2:   bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 12 23:41:49.822226 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Mar 12 23:41:49.822287 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 12 23:41:49.822347 kernel: pci 0000:00:02.3:   bridge window [io 0x4000-0x4fff]
Mar 12 23:41:49.822406 kernel: pci 0000:00:02.3:   bridge window [mem 0x10600000-0x107fffff]
Mar 12 23:41:49.822466 kernel: pci 0000:00:02.3:   bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 12 23:41:49.822537 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Mar 12 23:41:49.822599 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Mar 12 23:41:49.822712 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 12 23:41:49.822777 kernel: pci 0000:00:02.4:   bridge window [io 0x5000-0x5fff]
Mar 12 23:41:49.823136 kernel: pci 0000:00:02.4:   bridge window [mem 0x10800000-0x109fffff]
Mar 12 23:41:49.823207 kernel: pci 0000:00:02.4:   bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 12 23:41:49.823275 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Mar 12 23:41:49.823344 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Mar 12 23:41:49.823407 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 12 23:41:49.823480 kernel: pci 0000:00:02.5:   bridge window [io 0x6000-0x6fff]
Mar 12 23:41:49.823539 kernel: pci 0000:00:02.5:   bridge window [mem 0x10a00000-0x10bfffff]
Mar 12 23:41:49.823597 kernel: pci 0000:00:02.5:   bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 12 23:41:49.823688 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Mar 12 23:41:49.823753 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Mar 12 23:41:49.825232 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Mar 12 23:41:49.825345 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 12 23:41:49.825527 kernel: pci 0000:00:02.6:   bridge window [io 0x7000-0x7fff]
Mar 12 23:41:49.825604 kernel: pci 0000:00:02.6:   bridge window [mem 0x10c00000-0x10dfffff]
Mar 12 23:41:49.825709 kernel: pci 0000:00:02.6:   bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 12 23:41:49.825774 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 12 23:41:49.825854 kernel: pci 0000:00:02.7:   bridge window [io 0x8000-0x8fff]
Mar 12 23:41:49.825914 kernel: pci 0000:00:02.7:   bridge window [mem 0x10e00000-0x10ffffff]
Mar 12 23:41:49.825974 kernel: pci 0000:00:02.7:   bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 12 23:41:49.826037 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 12 23:41:49.826097 kernel: pci 0000:00:03.0:   bridge window [io 0x9000-0x9fff]
Mar 12
23:41:49.826166 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Mar 12 23:41:49.826227 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Mar 12 23:41:49.826289 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 12 23:41:49.826343 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 12 23:41:49.826396 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 12 23:41:49.826468 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Mar 12 23:41:49.826525 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Mar 12 23:41:49.826584 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Mar 12 23:41:49.826673 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Mar 12 23:41:49.826735 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Mar 12 23:41:49.826791 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Mar 12 23:41:49.828965 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Mar 12 23:41:49.829045 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Mar 12 23:41:49.829108 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Mar 12 23:41:49.829174 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Mar 12 23:41:49.829233 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Mar 12 23:41:49.829289 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Mar 12 23:41:49.829358 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Mar 12 23:41:49.829415 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Mar 12 23:41:49.829471 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Mar 12 23:41:49.829538 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Mar 12 23:41:49.829596 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Mar 12 23:41:49.829702 kernel: 
pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 12 23:41:49.829771 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Mar 12 23:41:49.829848 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Mar 12 23:41:49.829906 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 12 23:41:49.829969 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Mar 12 23:41:49.830030 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Mar 12 23:41:49.830084 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 12 23:41:49.830149 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Mar 12 23:41:49.830204 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Mar 12 23:41:49.830259 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Mar 12 23:41:49.830268 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 12 23:41:49.830276 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 12 23:41:49.830286 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 12 23:41:49.830296 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 12 23:41:49.830304 kernel: iommu: Default domain type: Translated Mar 12 23:41:49.830311 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 12 23:41:49.830319 kernel: efivars: Registered efivars operations Mar 12 23:41:49.830326 kernel: vgaarb: loaded Mar 12 23:41:49.830333 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 12 23:41:49.830341 kernel: VFS: Disk quotas dquot_6.6.0 Mar 12 23:41:49.830348 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 12 23:41:49.830357 kernel: pnp: PnP ACPI init Mar 12 23:41:49.830428 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 12 23:41:49.830438 kernel: pnp: PnP ACPI: found 1 devices Mar 12 23:41:49.830446 kernel: NET: Registered PF_INET 
protocol family Mar 12 23:41:49.830453 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 12 23:41:49.830461 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 12 23:41:49.830468 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 12 23:41:49.830476 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 12 23:41:49.830485 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 12 23:41:49.830493 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 12 23:41:49.830501 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 12 23:41:49.830508 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 12 23:41:49.830516 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 12 23:41:49.830586 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Mar 12 23:41:49.830597 kernel: PCI: CLS 0 bytes, default 64 Mar 12 23:41:49.830604 kernel: kvm [1]: HYP mode not available Mar 12 23:41:49.830624 kernel: Initialise system trusted keyrings Mar 12 23:41:49.830636 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 12 23:41:49.830643 kernel: Key type asymmetric registered Mar 12 23:41:49.830651 kernel: Asymmetric key parser 'x509' registered Mar 12 23:41:49.830658 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 12 23:41:49.830666 kernel: io scheduler mq-deadline registered Mar 12 23:41:49.830673 kernel: io scheduler kyber registered Mar 12 23:41:49.830681 kernel: io scheduler bfq registered Mar 12 23:41:49.830688 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 12 23:41:49.830766 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Mar 12 23:41:49.832895 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Mar 12 23:41:49.832987 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.833055 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Mar 12 23:41:49.833118 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Mar 12 23:41:49.833180 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.833246 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Mar 12 23:41:49.833309 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Mar 12 23:41:49.833369 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.833440 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Mar 12 23:41:49.833503 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Mar 12 23:41:49.833562 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.833646 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Mar 12 23:41:49.833718 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Mar 12 23:41:49.833783 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.833867 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Mar 12 23:41:49.833933 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Mar 12 23:41:49.833993 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.834057 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Mar 12 23:41:49.834119 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Mar 12 23:41:49.834179 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.834243 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Mar 12 23:41:49.834303 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Mar 12 23:41:49.834362 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.834375 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Mar 12 23:41:49.834437 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Mar 12 23:41:49.834498 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Mar 12 23:41:49.834556 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:41:49.834566 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 12 23:41:49.834574 kernel: ACPI: button: Power Button [PWRB] Mar 12 23:41:49.834582 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 12 23:41:49.834693 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Mar 12 23:41:49.834771 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Mar 12 23:41:49.834782 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 12 23:41:49.834790 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 12 23:41:49.836326 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Mar 12 23:41:49.836347 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Mar 12 23:41:49.836355 kernel: thunder_xcv, ver 1.0 Mar 12 23:41:49.836363 kernel: thunder_bgx, ver 1.0 Mar 12 23:41:49.836371 kernel: nicpf, ver 1.0 Mar 12 23:41:49.836378 kernel: nicvf, ver 1.0 Mar 12 23:41:49.836472 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 12 23:41:49.836531 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-12T23:41:49 UTC (1773358909) Mar 12 23:41:49.836541 kernel: hid: raw HID events 
driver (C) Jiri Kosina Mar 12 23:41:49.836548 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Mar 12 23:41:49.836556 kernel: watchdog: NMI not fully supported Mar 12 23:41:49.836564 kernel: watchdog: Hard watchdog permanently disabled Mar 12 23:41:49.836572 kernel: NET: Registered PF_INET6 protocol family Mar 12 23:41:49.836579 kernel: Segment Routing with IPv6 Mar 12 23:41:49.836589 kernel: In-situ OAM (IOAM) with IPv6 Mar 12 23:41:49.836596 kernel: NET: Registered PF_PACKET protocol family Mar 12 23:41:49.836603 kernel: Key type dns_resolver registered Mar 12 23:41:49.836611 kernel: registered taskstats version 1 Mar 12 23:41:49.836658 kernel: Loading compiled-in X.509 certificates Mar 12 23:41:49.836667 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 653709f5ad64856a37b70c07139630123477ee1c' Mar 12 23:41:49.836674 kernel: Demotion targets for Node 0: null Mar 12 23:41:49.836682 kernel: Key type .fscrypt registered Mar 12 23:41:49.836689 kernel: Key type fscrypt-provisioning registered Mar 12 23:41:49.836699 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 12 23:41:49.836707 kernel: ima: Allocated hash algorithm: sha1 Mar 12 23:41:49.836714 kernel: ima: No architecture policies found Mar 12 23:41:49.836721 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 12 23:41:49.836729 kernel: clk: Disabling unused clocks Mar 12 23:41:49.836736 kernel: PM: genpd: Disabling unused power domains Mar 12 23:41:49.836743 kernel: Warning: unable to open an initial console. Mar 12 23:41:49.836751 kernel: Freeing unused kernel memory: 39552K Mar 12 23:41:49.836759 kernel: Run /init as init process Mar 12 23:41:49.836766 kernel: with arguments: Mar 12 23:41:49.836776 kernel: /init Mar 12 23:41:49.836783 kernel: with environment: Mar 12 23:41:49.836790 kernel: HOME=/ Mar 12 23:41:49.836797 kernel: TERM=linux Mar 12 23:41:49.836806 systemd[1]: Successfully made /usr/ read-only. 
Mar 12 23:41:49.836836 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 12 23:41:49.836845 systemd[1]: Detected virtualization kvm. Mar 12 23:41:49.836855 systemd[1]: Detected architecture arm64. Mar 12 23:41:49.836863 systemd[1]: Running in initrd. Mar 12 23:41:49.836871 systemd[1]: No hostname configured, using default hostname. Mar 12 23:41:49.836879 systemd[1]: Hostname set to . Mar 12 23:41:49.836887 systemd[1]: Initializing machine ID from VM UUID. Mar 12 23:41:49.836895 systemd[1]: Queued start job for default target initrd.target. Mar 12 23:41:49.836903 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:41:49.836911 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 23:41:49.836921 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 12 23:41:49.836929 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 23:41:49.836937 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 12 23:41:49.836947 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 12 23:41:49.836957 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 12 23:41:49.836965 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 12 23:41:49.836973 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Mar 12 23:41:49.836982 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:41:49.836990 systemd[1]: Reached target paths.target - Path Units. Mar 12 23:41:49.836998 systemd[1]: Reached target slices.target - Slice Units. Mar 12 23:41:49.837006 systemd[1]: Reached target swap.target - Swaps. Mar 12 23:41:49.837014 systemd[1]: Reached target timers.target - Timer Units. Mar 12 23:41:49.837021 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 23:41:49.837030 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 23:41:49.837037 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 12 23:41:49.837045 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 12 23:41:49.837055 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 12 23:41:49.837063 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 12 23:41:49.837071 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 23:41:49.837079 systemd[1]: Reached target sockets.target - Socket Units. Mar 12 23:41:49.837087 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 12 23:41:49.837094 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 12 23:41:49.837102 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 12 23:41:49.837110 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 12 23:41:49.837120 systemd[1]: Starting systemd-fsck-usr.service... Mar 12 23:41:49.837128 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 12 23:41:49.837136 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 12 23:41:49.837144 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:41:49.837152 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 12 23:41:49.837192 systemd-journald[245]: Collecting audit messages is disabled. Mar 12 23:41:49.837215 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:41:49.837224 systemd[1]: Finished systemd-fsck-usr.service. Mar 12 23:41:49.837234 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 12 23:41:49.837242 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:41:49.837250 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 12 23:41:49.837258 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 12 23:41:49.837267 systemd-journald[245]: Journal started Mar 12 23:41:49.837285 systemd-journald[245]: Runtime Journal (/run/log/journal/0fc5ca300ffb47658ab64707a5a8a4e0) is 8M, max 76.5M, 68.5M free. Mar 12 23:41:49.818881 systemd-modules-load[246]: Inserted module 'overlay' Mar 12 23:41:49.842364 systemd[1]: Started systemd-journald.service - Journal Service. Mar 12 23:41:49.842408 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 12 23:41:49.844670 systemd-modules-load[246]: Inserted module 'br_netfilter' Mar 12 23:41:49.845546 kernel: Bridge firewalling registered Mar 12 23:41:49.847119 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 12 23:41:49.849943 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 12 23:41:49.851135 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Mar 12 23:41:49.858773 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 12 23:41:49.874385 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 23:41:49.878663 systemd-tmpfiles[268]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 12 23:41:49.880751 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 12 23:41:49.882904 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 23:41:49.884721 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 23:41:49.888172 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 12 23:41:49.892600 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 12 23:41:49.925268 dracut-cmdline[286]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d Mar 12 23:41:49.937057 systemd-resolved[287]: Positive Trust Anchors: Mar 12 23:41:49.937073 systemd-resolved[287]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 12 23:41:49.937111 systemd-resolved[287]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 12 23:41:49.948960 systemd-resolved[287]: Defaulting to hostname 'linux'. Mar 12 23:41:49.950559 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 12 23:41:49.951966 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:41:50.018884 kernel: SCSI subsystem initialized Mar 12 23:41:50.022885 kernel: Loading iSCSI transport class v2.0-870. Mar 12 23:41:50.030985 kernel: iscsi: registered transport (tcp) Mar 12 23:41:50.044850 kernel: iscsi: registered transport (qla4xxx) Mar 12 23:41:50.044909 kernel: QLogic iSCSI HBA Driver Mar 12 23:41:50.068756 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 12 23:41:50.099481 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 12 23:41:50.106026 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 12 23:41:50.163175 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 12 23:41:50.165160 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Mar 12 23:41:50.234858 kernel: raid6: neonx8 gen() 15418 MB/s Mar 12 23:41:50.250890 kernel: raid6: neonx4 gen() 11979 MB/s Mar 12 23:41:50.267883 kernel: raid6: neonx2 gen() 13105 MB/s Mar 12 23:41:50.284885 kernel: raid6: neonx1 gen() 10428 MB/s Mar 12 23:41:50.301881 kernel: raid6: int64x8 gen() 6870 MB/s Mar 12 23:41:50.318878 kernel: raid6: int64x4 gen() 7322 MB/s Mar 12 23:41:50.335879 kernel: raid6: int64x2 gen() 6068 MB/s Mar 12 23:41:50.352880 kernel: raid6: int64x1 gen() 5036 MB/s Mar 12 23:41:50.352947 kernel: raid6: using algorithm neonx8 gen() 15418 MB/s Mar 12 23:41:50.369883 kernel: raid6: .... xor() 11737 MB/s, rmw enabled Mar 12 23:41:50.369941 kernel: raid6: using neon recovery algorithm Mar 12 23:41:50.375113 kernel: xor: measuring software checksum speed Mar 12 23:41:50.375170 kernel: 8regs : 21624 MB/sec Mar 12 23:41:50.375194 kernel: 32regs : 20634 MB/sec Mar 12 23:41:50.375216 kernel: arm64_neon : 28099 MB/sec Mar 12 23:41:50.375864 kernel: xor: using function: arm64_neon (28099 MB/sec) Mar 12 23:41:50.432886 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 12 23:41:50.441091 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 12 23:41:50.444195 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 23:41:50.480118 systemd-udevd[495]: Using default interface naming scheme 'v255'. Mar 12 23:41:50.485460 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 23:41:50.491216 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 12 23:41:50.522017 dracut-pre-trigger[505]: rd.md=0: removing MD RAID activation Mar 12 23:41:50.556111 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 23:41:50.558770 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 12 23:41:50.620398 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Mar 12 23:41:50.624084 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 12 23:41:50.707849 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Mar 12 23:41:50.714769 kernel: scsi host0: Virtio SCSI HBA Mar 12 23:41:50.715448 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 12 23:41:50.715482 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 12 23:41:50.730119 kernel: ACPI: bus type USB registered Mar 12 23:41:50.730236 kernel: usbcore: registered new interface driver usbfs Mar 12 23:41:50.730249 kernel: usbcore: registered new interface driver hub Mar 12 23:41:50.733848 kernel: usbcore: registered new device driver usb Mar 12 23:41:50.748886 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 12 23:41:50.749629 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:41:50.752798 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:41:50.755321 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:41:50.770986 kernel: sd 0:0:0:1: Power-on or device reset occurred Mar 12 23:41:50.771162 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Mar 12 23:41:50.773097 kernel: sd 0:0:0:1: [sda] Write Protect is off Mar 12 23:41:50.773195 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Mar 12 23:41:50.773284 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 12 23:41:50.782947 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 12 23:41:50.783004 kernel: GPT:17805311 != 80003071 Mar 12 23:41:50.783014 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 12 23:41:50.783027 kernel: GPT:17805311 != 80003071 Mar 12 23:41:50.783840 kernel: GPT: Use GNU Parted to correct GPT errors. 
Mar 12 23:41:50.783863 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 12 23:41:50.784862 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Mar 12 23:41:50.788185 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 12 23:41:50.788370 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 12 23:41:50.789287 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 12 23:41:50.791914 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 12 23:41:50.792072 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 12 23:41:50.792150 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 12 23:41:50.792994 kernel: hub 1-0:1.0: USB hub found Mar 12 23:41:50.793151 kernel: hub 1-0:1.0: 4 ports detected Mar 12 23:41:50.794114 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 12 23:41:50.794878 kernel: hub 2-0:1.0: USB hub found Mar 12 23:41:50.794997 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:41:50.797267 kernel: hub 2-0:1.0: 4 ports detected Mar 12 23:41:50.798051 kernel: sr 0:0:0:0: Power-on or device reset occurred Mar 12 23:41:50.798231 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Mar 12 23:41:50.798329 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 12 23:41:50.800885 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Mar 12 23:41:50.861782 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Mar 12 23:41:50.864537 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 12 23:41:50.873694 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 12 23:41:50.883207 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. 
Mar 12 23:41:50.891327 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 12 23:41:50.894121 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 12 23:41:50.908349 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 12 23:41:50.909279 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 23:41:50.910771 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:41:50.912085 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 23:41:50.916095 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 12 23:41:50.921299 disk-uuid[601]: Primary Header is updated. Mar 12 23:41:50.921299 disk-uuid[601]: Secondary Entries is updated. Mar 12 23:41:50.921299 disk-uuid[601]: Secondary Header is updated. Mar 12 23:41:50.934974 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 12 23:41:50.942021 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Mar 12 23:41:51.030844 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 12 23:41:51.161862 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Mar 12 23:41:51.163124 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 12 23:41:51.163300 kernel: usbcore: registered new interface driver usbhid Mar 12 23:41:51.163833 kernel: usbhid: USB HID core driver Mar 12 23:41:51.269859 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Mar 12 23:41:51.396850 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Mar 12 23:41:51.449872 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Mar 12 23:41:51.955897 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 12 23:41:51.956903 disk-uuid[603]: The operation has completed successfully. Mar 12 23:41:52.012979 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 12 23:41:52.014180 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 12 23:41:52.038593 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 12 23:41:52.067573 sh[627]: Success Mar 12 23:41:52.082838 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 12 23:41:52.084032 kernel: device-mapper: uevent: version 1.0.3 Mar 12 23:41:52.084078 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 12 23:41:52.093888 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 12 23:41:52.136365 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
Mar 12 23:41:52.139489 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 12 23:41:52.148175 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 12 23:41:52.158348 kernel: BTRFS: device fsid fcbb17b2-5053-44fc-82f0-b24e4919d6d8 devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (640) Mar 12 23:41:52.158406 kernel: BTRFS info (device dm-0): first mount of filesystem fcbb17b2-5053-44fc-82f0-b24e4919d6d8 Mar 12 23:41:52.158430 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:41:52.166216 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations Mar 12 23:41:52.166278 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 12 23:41:52.166288 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 12 23:41:52.168274 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 12 23:41:52.170101 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 12 23:41:52.170912 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 12 23:41:52.172131 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 12 23:41:52.176279 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 12 23:41:52.205873 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (677)
Mar 12 23:41:52.209676 kernel: BTRFS info (device sda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:41:52.209747 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:41:52.215860 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 12 23:41:52.215928 kernel: BTRFS info (device sda6): turning on async discard
Mar 12 23:41:52.216267 kernel: BTRFS info (device sda6): enabling free space tree
Mar 12 23:41:52.221872 kernel: BTRFS info (device sda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:41:52.223894 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 12 23:41:52.226564 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 12 23:41:52.313342 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 12 23:41:52.318931 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 23:41:52.363980 systemd-networkd[810]: lo: Link UP
Mar 12 23:41:52.364578 systemd-networkd[810]: lo: Gained carrier
Mar 12 23:41:52.366220 systemd-networkd[810]: Enumeration completed
Mar 12 23:41:52.366335 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 23:41:52.366954 systemd-networkd[810]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:52.366958 systemd-networkd[810]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 23:41:52.368529 systemd[1]: Reached target network.target - Network.
Mar 12 23:41:52.369880 systemd-networkd[810]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:52.369883 systemd-networkd[810]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 23:41:52.370209 systemd-networkd[810]: eth0: Link UP
Mar 12 23:41:52.370340 systemd-networkd[810]: eth1: Link UP
Mar 12 23:41:52.370454 systemd-networkd[810]: eth0: Gained carrier
Mar 12 23:41:52.370463 systemd-networkd[810]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:52.373262 systemd-networkd[810]: eth1: Gained carrier
Mar 12 23:41:52.373274 systemd-networkd[810]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:52.385086 ignition[730]: Ignition 2.22.0
Mar 12 23:41:52.385103 ignition[730]: Stage: fetch-offline
Mar 12 23:41:52.385136 ignition[730]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:52.385144 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:52.385233 ignition[730]: parsed url from cmdline: ""
Mar 12 23:41:52.388246 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 12 23:41:52.385237 ignition[730]: no config URL provided
Mar 12 23:41:52.385241 ignition[730]: reading system config file "/usr/lib/ignition/user.ign"
Mar 12 23:41:52.390389 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 12 23:41:52.385248 ignition[730]: no config at "/usr/lib/ignition/user.ign"
Mar 12 23:41:52.385253 ignition[730]: failed to fetch config: resource requires networking
Mar 12 23:41:52.385541 ignition[730]: Ignition finished successfully
Mar 12 23:41:52.409959 systemd-networkd[810]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 12 23:41:52.423911 systemd-networkd[810]: eth0: DHCPv4 address 49.13.116.83/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 12 23:41:52.425542 ignition[818]: Ignition 2.22.0
Mar 12 23:41:52.425549 ignition[818]: Stage: fetch
Mar 12 23:41:52.425752 ignition[818]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:52.425762 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:52.425862 ignition[818]: parsed url from cmdline: ""
Mar 12 23:41:52.425866 ignition[818]: no config URL provided
Mar 12 23:41:52.425870 ignition[818]: reading system config file "/usr/lib/ignition/user.ign"
Mar 12 23:41:52.425877 ignition[818]: no config at "/usr/lib/ignition/user.ign"
Mar 12 23:41:52.425916 ignition[818]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 12 23:41:52.433548 ignition[818]: GET result: OK
Mar 12 23:41:52.433804 ignition[818]: parsing config with SHA512: 7e4ddc5f21928ad1372ac7a242dce419a1575ca8373b7f2032d11831abf84ea167a569807b5e369c1e44d0f1408c1d8509485530063632360ebc4eb24166fc37
Mar 12 23:41:52.441027 unknown[818]: fetched base config from "system"
Mar 12 23:41:52.442349 unknown[818]: fetched base config from "system"
Mar 12 23:41:52.442376 unknown[818]: fetched user config from "hetzner"
Mar 12 23:41:52.442952 ignition[818]: fetch: fetch complete
Mar 12 23:41:52.445955 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 12 23:41:52.442961 ignition[818]: fetch: fetch passed
Mar 12 23:41:52.450216 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 12 23:41:52.443029 ignition[818]: Ignition finished successfully
Mar 12 23:41:52.485350 ignition[826]: Ignition 2.22.0
Mar 12 23:41:52.485371 ignition[826]: Stage: kargs
Mar 12 23:41:52.485527 ignition[826]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:52.485537 ignition[826]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:52.486598 ignition[826]: kargs: kargs passed
Mar 12 23:41:52.486714 ignition[826]: Ignition finished successfully
Mar 12 23:41:52.491768 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 12 23:41:52.495035 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 12 23:41:52.530745 ignition[833]: Ignition 2.22.0
Mar 12 23:41:52.531487 ignition[833]: Stage: disks
Mar 12 23:41:52.532230 ignition[833]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:52.532242 ignition[833]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:52.533103 ignition[833]: disks: disks passed
Mar 12 23:41:52.533180 ignition[833]: Ignition finished successfully
Mar 12 23:41:52.536642 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 12 23:41:52.538093 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 12 23:41:52.539536 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 12 23:41:52.541086 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 12 23:41:52.541632 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 12 23:41:52.543015 systemd[1]: Reached target basic.target - Basic System.
Mar 12 23:41:52.545219 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 12 23:41:52.590572 systemd-fsck[842]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 12 23:41:52.594717 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 12 23:41:52.600416 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 12 23:41:52.674896 kernel: EXT4-fs (sda9): mounted filesystem 4b09db19-3beb-48c2-8dcb-3eec5602206c r/w with ordered data mode. Quota mode: none.
Mar 12 23:41:52.676165 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 12 23:41:52.676807 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 12 23:41:52.679286 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 12 23:41:52.680773 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 12 23:41:52.684339 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 12 23:41:52.685090 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 12 23:41:52.685123 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 12 23:41:52.701415 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 12 23:41:52.704871 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (850)
Mar 12 23:41:52.709073 kernel: BTRFS info (device sda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:41:52.709135 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:41:52.708032 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 12 23:41:52.716584 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 12 23:41:52.716671 kernel: BTRFS info (device sda6): turning on async discard
Mar 12 23:41:52.716684 kernel: BTRFS info (device sda6): enabling free space tree
Mar 12 23:41:52.722777 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 12 23:41:52.750570 coreos-metadata[852]: Mar 12 23:41:52.750 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 12 23:41:52.752920 coreos-metadata[852]: Mar 12 23:41:52.752 INFO Fetch successful
Mar 12 23:41:52.755029 coreos-metadata[852]: Mar 12 23:41:52.754 INFO wrote hostname ci-4459-2-4-n-69ffcbf899 to /sysroot/etc/hostname
Mar 12 23:41:52.758008 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 12 23:41:52.773266 initrd-setup-root[878]: cut: /sysroot/etc/passwd: No such file or directory
Mar 12 23:41:52.779745 initrd-setup-root[885]: cut: /sysroot/etc/group: No such file or directory
Mar 12 23:41:52.785491 initrd-setup-root[892]: cut: /sysroot/etc/shadow: No such file or directory
Mar 12 23:41:52.790687 initrd-setup-root[899]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 12 23:41:52.887585 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 12 23:41:52.890951 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 12 23:41:52.892363 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 12 23:41:52.909854 kernel: BTRFS info (device sda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:41:52.927149 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 12 23:41:52.940667 ignition[969]: INFO : Ignition 2.22.0
Mar 12 23:41:52.941913 ignition[969]: INFO : Stage: mount
Mar 12 23:41:52.941913 ignition[969]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:52.941913 ignition[969]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:52.944419 ignition[969]: INFO : mount: mount passed
Mar 12 23:41:52.944419 ignition[969]: INFO : Ignition finished successfully
Mar 12 23:41:52.946678 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 12 23:41:52.948487 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 12 23:41:53.158094 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 12 23:41:53.160839 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 12 23:41:53.181445 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (980)
Mar 12 23:41:53.181516 kernel: BTRFS info (device sda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:41:53.181540 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:41:53.186134 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 12 23:41:53.186217 kernel: BTRFS info (device sda6): turning on async discard
Mar 12 23:41:53.186237 kernel: BTRFS info (device sda6): enabling free space tree
Mar 12 23:41:53.189223 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 12 23:41:53.224514 ignition[997]: INFO : Ignition 2.22.0
Mar 12 23:41:53.226010 ignition[997]: INFO : Stage: files
Mar 12 23:41:53.226010 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:53.226010 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:53.230311 ignition[997]: DEBUG : files: compiled without relabeling support, skipping
Mar 12 23:41:53.230311 ignition[997]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 12 23:41:53.230311 ignition[997]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 12 23:41:53.234347 ignition[997]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 12 23:41:53.234347 ignition[997]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 12 23:41:53.234347 ignition[997]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 12 23:41:53.234150 unknown[997]: wrote ssh authorized keys file for user: core
Mar 12 23:41:53.238147 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 12 23:41:53.239233 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Mar 12 23:41:53.326306 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 12 23:41:53.395934 systemd-networkd[810]: eth0: Gained IPv6LL
Mar 12 23:41:53.398942 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 12 23:41:53.400274 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 12 23:41:53.412206 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 12 23:41:53.412206 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 12 23:41:53.412206 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 12 23:41:53.412206 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 12 23:41:53.412206 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 12 23:41:53.412206 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1
Mar 12 23:41:53.730011 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 12 23:41:53.780938 systemd-networkd[810]: eth1: Gained IPv6LL
Mar 12 23:41:53.982444 ignition[997]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Mar 12 23:41:53.982444 ignition[997]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 12 23:41:53.986138 ignition[997]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 12 23:41:53.988669 ignition[997]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 12 23:41:53.988669 ignition[997]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 12 23:41:53.988669 ignition[997]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 12 23:41:53.994647 ignition[997]: INFO : files: files passed
Mar 12 23:41:53.994647 ignition[997]: INFO : Ignition finished successfully
Mar 12 23:41:53.992215 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 12 23:41:53.996733 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 12 23:41:54.002059 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 12 23:41:54.022078 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 12 23:41:54.022207 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 12 23:41:54.034192 initrd-setup-root-after-ignition[1030]: grep:
Mar 12 23:41:54.034897 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 12 23:41:54.036046 initrd-setup-root-after-ignition[1026]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 12 23:41:54.037235 initrd-setup-root-after-ignition[1030]: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 12 23:41:54.038944 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 12 23:41:54.040951 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 12 23:41:54.043230 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 12 23:41:54.099781 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 12 23:41:54.100735 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 12 23:41:54.103114 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 12 23:41:54.103750 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 12 23:41:54.105297 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 12 23:41:54.106317 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 12 23:41:54.147858 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 12 23:41:54.153296 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 12 23:41:54.176175 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 12 23:41:54.177290 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 12 23:41:54.178946 systemd[1]: Stopped target timers.target - Timer Units.
Mar 12 23:41:54.180811 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 12 23:41:54.180970 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 12 23:41:54.184045 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 12 23:41:54.184679 systemd[1]: Stopped target basic.target - Basic System.
Mar 12 23:41:54.185809 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 12 23:41:54.187066 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 12 23:41:54.188157 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 12 23:41:54.190629 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 12 23:41:54.191538 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 12 23:41:54.192686 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 12 23:41:54.193968 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 12 23:41:54.195805 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 12 23:41:54.197944 systemd[1]: Stopped target swap.target - Swaps.
Mar 12 23:41:54.199222 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 12 23:41:54.199391 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 12 23:41:54.200997 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 12 23:41:54.201947 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 12 23:41:54.203987 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 12 23:41:54.204392 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 12 23:41:54.206060 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 12 23:41:54.206276 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 12 23:41:54.208619 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 12 23:41:54.208860 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 12 23:41:54.210397 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 12 23:41:54.210507 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 12 23:41:54.211568 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 12 23:41:54.211695 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 12 23:41:54.213557 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 12 23:41:54.218142 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 12 23:41:54.218767 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 12 23:41:54.218947 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 23:41:54.223734 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 12 23:41:54.223915 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 12 23:41:54.229714 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 12 23:41:54.232900 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 12 23:41:54.249762 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 12 23:41:54.257400 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 12 23:41:54.258870 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 12 23:41:54.260957 ignition[1050]: INFO : Ignition 2.22.0
Mar 12 23:41:54.260957 ignition[1050]: INFO : Stage: umount
Mar 12 23:41:54.262166 ignition[1050]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 12 23:41:54.262166 ignition[1050]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 12 23:41:54.263491 ignition[1050]: INFO : umount: umount passed
Mar 12 23:41:54.263491 ignition[1050]: INFO : Ignition finished successfully
Mar 12 23:41:54.265234 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 12 23:41:54.265361 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 12 23:41:54.268247 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 12 23:41:54.268320 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 12 23:41:54.269049 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 12 23:41:54.269104 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 12 23:41:54.270470 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 12 23:41:54.270527 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 12 23:41:54.271654 systemd[1]: Stopped target network.target - Network.
Mar 12 23:41:54.272802 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 12 23:41:54.272914 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 12 23:41:54.274088 systemd[1]: Stopped target paths.target - Path Units.
Mar 12 23:41:54.275030 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 12 23:41:54.279894 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 23:41:54.283077 systemd[1]: Stopped target slices.target - Slice Units.
Mar 12 23:41:54.284438 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 12 23:41:54.286377 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 12 23:41:54.286429 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 12 23:41:54.287864 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 12 23:41:54.287903 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 12 23:41:54.289410 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 12 23:41:54.289476 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 12 23:41:54.290917 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 12 23:41:54.290962 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 12 23:41:54.291921 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 12 23:41:54.292027 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 12 23:41:54.293333 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 12 23:41:54.294513 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 12 23:41:54.303434 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 12 23:41:54.303723 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 12 23:41:54.309608 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 12 23:41:54.309915 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 12 23:41:54.310023 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 12 23:41:54.313570 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 12 23:41:54.314425 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 12 23:41:54.316979 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 12 23:41:54.317027 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 23:41:54.320498 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 12 23:41:54.321101 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 12 23:41:54.321164 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 12 23:41:54.322986 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 12 23:41:54.323038 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 12 23:41:54.326725 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 12 23:41:54.326790 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 12 23:41:54.327650 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 12 23:41:54.327701 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 23:41:54.330175 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 23:41:54.335927 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 12 23:41:54.336001 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:41:54.353265 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 12 23:41:54.354168 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 12 23:41:54.355998 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 12 23:41:54.357154 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 23:41:54.359223 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 12 23:41:54.359300 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 12 23:41:54.360459 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 12 23:41:54.360497 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 23:41:54.361278 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 12 23:41:54.361336 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 12 23:41:54.363002 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 12 23:41:54.363059 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 12 23:41:54.364916 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 12 23:41:54.364987 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 23:41:54.367768 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 12 23:41:54.368512 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 12 23:41:54.368576 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 23:41:54.372407 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 12 23:41:54.372494 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 23:41:54.375540 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 12 23:41:54.375656 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 12 23:41:54.378773 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 12 23:41:54.378901 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 23:41:54.381977 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 23:41:54.382080 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:41:54.387439 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 12 23:41:54.387524 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 12 23:41:54.387558 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 12 23:41:54.387618 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:41:54.391399 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 12 23:41:54.391502 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 12 23:41:54.392920 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 12 23:41:54.394542 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 12 23:41:54.417131 systemd[1]: Switching root.
Mar 12 23:41:54.464116 systemd-journald[245]: Journal stopped
Mar 12 23:41:55.440724 systemd-journald[245]: Received SIGTERM from PID 1 (systemd).
Mar 12 23:41:55.440804 kernel: SELinux: policy capability network_peer_controls=1
Mar 12 23:41:55.440836 kernel: SELinux: policy capability open_perms=1
Mar 12 23:41:55.440847 kernel: SELinux: policy capability extended_socket_class=1
Mar 12 23:41:55.440861 kernel: SELinux: policy capability always_check_network=0
Mar 12 23:41:55.440870 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 12 23:41:55.440883 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 12 23:41:55.440892 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 12 23:41:55.440901 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 12 23:41:55.440914 kernel: SELinux: policy capability userspace_initial_context=0
Mar 12 23:41:55.440924 kernel: audit: type=1403 audit(1773358914.639:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 12 23:41:55.440934 systemd[1]: Successfully loaded SELinux policy in 63.880ms.
Mar 12 23:41:55.440955 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.055ms.
Mar 12 23:41:55.440967 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 12 23:41:55.440980 systemd[1]: Detected virtualization kvm.
Mar 12 23:41:55.440990 systemd[1]: Detected architecture arm64.
Mar 12 23:41:55.440999 systemd[1]: Detected first boot.
Mar 12 23:41:55.441009 systemd[1]: Hostname set to .
Mar 12 23:41:55.441018 systemd[1]: Initializing machine ID from VM UUID.
Mar 12 23:41:55.441029 zram_generator::config[1094]: No configuration found.
Mar 12 23:41:55.441039 kernel: NET: Registered PF_VSOCK protocol family
Mar 12 23:41:55.441053 systemd[1]: Populated /etc with preset unit settings.
Mar 12 23:41:55.441077 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 12 23:41:55.441149 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 12 23:41:55.441161 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 12 23:41:55.441171 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 12 23:41:55.441181 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 12 23:41:55.441191 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 12 23:41:55.441203 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 12 23:41:55.441213 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 12 23:41:55.441223 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 12 23:41:55.441237 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 12 23:41:55.441247 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 12 23:41:55.441257 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 12 23:41:55.441267 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 12 23:41:55.441278 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 23:41:55.441287 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 12 23:41:55.441298 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 12 23:41:55.441309 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 12 23:41:55.441319 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 12 23:41:55.441329 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 12 23:41:55.441338 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 12 23:41:55.441348 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 23:41:55.441360 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 12 23:41:55.441370 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 12 23:41:55.441379 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 12 23:41:55.441389 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 12 23:41:55.441398 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 12 23:41:55.441409 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 12 23:41:55.441419 systemd[1]: Reached target slices.target - Slice Units.
Mar 12 23:41:55.441430 systemd[1]: Reached target swap.target - Swaps.
Mar 12 23:41:55.441441 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 12 23:41:55.441452 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 12 23:41:55.441463 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 12 23:41:55.441473 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 23:41:55.441483 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 12 23:41:55.441493 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 23:41:55.441502 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 12 23:41:55.441512 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 12 23:41:55.441522 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 12 23:41:55.441531 systemd[1]: Mounting media.mount - External Media Directory...
Mar 12 23:41:55.441542 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 12 23:41:55.441552 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 12 23:41:55.441562 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 12 23:41:55.441572 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 12 23:41:55.441582 systemd[1]: Reached target machines.target - Containers.
Mar 12 23:41:55.441604 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 12 23:41:55.441624 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:41:55.441634 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 23:41:55.441646 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 12 23:41:55.441656 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:41:55.441666 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 23:41:55.441676 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:41:55.441687 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 12 23:41:55.441699 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:41:55.441710 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 12 23:41:55.441720 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 12 23:41:55.441731 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 12 23:41:55.441741 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 12 23:41:55.441751 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 12 23:41:55.441762 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:41:55.441772 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 23:41:55.441782 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 23:41:55.441792 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 12 23:41:55.447838 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 12 23:41:55.447887 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 12 23:41:55.447899 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 23:41:55.447917 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 12 23:41:55.447928 systemd[1]: Stopped verity-setup.service.
Mar 12 23:41:55.447938 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 12 23:41:55.447948 kernel: loop: module loaded
Mar 12 23:41:55.447959 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 12 23:41:55.447968 systemd[1]: Mounted media.mount - External Media Directory.
Mar 12 23:41:55.447978 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 12 23:41:55.447989 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 12 23:41:55.447999 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 12 23:41:55.448010 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 23:41:55.448021 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 12 23:41:55.448030 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 12 23:41:55.448041 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:41:55.448051 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:41:55.448060 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:41:55.448071 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:41:55.448081 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:41:55.448092 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:41:55.448102 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 23:41:55.448112 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 12 23:41:55.448122 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 23:41:55.448132 kernel: fuse: init (API version 7.41)
Mar 12 23:41:55.448170 systemd-journald[1162]: Collecting audit messages is disabled.
Mar 12 23:41:55.448201 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 23:41:55.448212 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 12 23:41:55.448224 systemd-journald[1162]: Journal started
Mar 12 23:41:55.448245 systemd-journald[1162]: Runtime Journal (/run/log/journal/0fc5ca300ffb47658ab64707a5a8a4e0) is 8M, max 76.5M, 68.5M free.
Mar 12 23:41:55.172471 systemd[1]: Queued start job for default target multi-user.target.
Mar 12 23:41:55.180042 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 12 23:41:55.180644 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 12 23:41:55.451019 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 23:41:55.451312 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 12 23:41:55.457990 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 12 23:41:55.459009 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 23:41:55.460021 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 12 23:41:55.460861 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 12 23:41:55.479382 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 12 23:41:55.482775 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 12 23:41:55.485051 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 12 23:41:55.485092 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 12 23:41:55.490959 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 12 23:41:55.494801 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 12 23:41:55.495645 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:41:55.504001 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 12 23:41:55.506206 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 12 23:41:55.506985 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 23:41:55.510082 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 12 23:41:55.511848 kernel: ACPI: bus type drm_connector registered
Mar 12 23:41:55.516051 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 12 23:41:55.518897 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 12 23:41:55.520050 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 23:41:55.527952 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 23:41:55.529204 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 12 23:41:55.531074 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 23:41:55.534643 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 12 23:41:55.536698 systemd-tmpfiles[1183]: ACLs are not supported, ignoring.
Mar 12 23:41:55.536711 systemd-tmpfiles[1183]: ACLs are not supported, ignoring.
Mar 12 23:41:55.546899 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 12 23:41:55.557116 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 12 23:41:55.566503 systemd-journald[1162]: Time spent on flushing to /var/log/journal/0fc5ca300ffb47658ab64707a5a8a4e0 is 38.606ms for 1179 entries.
Mar 12 23:41:55.566503 systemd-journald[1162]: System Journal (/var/log/journal/0fc5ca300ffb47658ab64707a5a8a4e0) is 8M, max 584.8M, 576.8M free.
Mar 12 23:41:55.620018 systemd-journald[1162]: Received client request to flush runtime journal.
Mar 12 23:41:55.620068 kernel: loop0: detected capacity change from 0 to 197488
Mar 12 23:41:55.620082 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 12 23:41:55.574299 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 12 23:41:55.576404 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 12 23:41:55.588981 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 12 23:41:55.624028 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 23:41:55.628669 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 12 23:41:55.641253 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 12 23:41:55.646163 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 23:41:55.649976 kernel: loop1: detected capacity change from 0 to 119840
Mar 12 23:41:55.653440 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 12 23:41:55.683927 systemd-tmpfiles[1234]: ACLs are not supported, ignoring.
Mar 12 23:41:55.683944 systemd-tmpfiles[1234]: ACLs are not supported, ignoring.
Mar 12 23:41:55.690847 kernel: loop2: detected capacity change from 0 to 8
Mar 12 23:41:55.692889 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 23:41:55.713049 kernel: loop3: detected capacity change from 0 to 100632
Mar 12 23:41:55.768858 kernel: loop4: detected capacity change from 0 to 197488
Mar 12 23:41:55.788707 kernel: loop5: detected capacity change from 0 to 119840
Mar 12 23:41:55.807891 kernel: loop6: detected capacity change from 0 to 8
Mar 12 23:41:55.809859 kernel: loop7: detected capacity change from 0 to 100632
Mar 12 23:41:55.833771 (sd-merge)[1245]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 12 23:41:55.835999 (sd-merge)[1245]: Merged extensions into '/usr'.
Mar 12 23:41:55.843962 systemd[1]: Reload requested from client PID 1216 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 12 23:41:55.843986 systemd[1]: Reloading...
Mar 12 23:41:55.943032 zram_generator::config[1271]: No configuration found.
Mar 12 23:41:56.075972 ldconfig[1209]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 12 23:41:56.136767 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 12 23:41:56.137268 systemd[1]: Reloading finished in 292 ms.
Mar 12 23:41:56.179093 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 12 23:41:56.180129 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 12 23:41:56.189128 systemd[1]: Starting ensure-sysext.service...
Mar 12 23:41:56.196078 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 23:41:56.210410 systemd[1]: Reload requested from client PID 1308 ('systemctl') (unit ensure-sysext.service)...
Mar 12 23:41:56.210433 systemd[1]: Reloading...
Mar 12 23:41:56.236212 systemd-tmpfiles[1309]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 12 23:41:56.236917 systemd-tmpfiles[1309]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 12 23:41:56.237873 systemd-tmpfiles[1309]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 12 23:41:56.238324 systemd-tmpfiles[1309]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 12 23:41:56.239167 systemd-tmpfiles[1309]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 12 23:41:56.239475 systemd-tmpfiles[1309]: ACLs are not supported, ignoring.
Mar 12 23:41:56.239583 systemd-tmpfiles[1309]: ACLs are not supported, ignoring.
Mar 12 23:41:56.242570 systemd-tmpfiles[1309]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 23:41:56.244854 systemd-tmpfiles[1309]: Skipping /boot
Mar 12 23:41:56.250940 systemd-tmpfiles[1309]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 23:41:56.250955 systemd-tmpfiles[1309]: Skipping /boot
Mar 12 23:41:56.289852 zram_generator::config[1338]: No configuration found.
Mar 12 23:41:56.442873 systemd[1]: Reloading finished in 232 ms.
Mar 12 23:41:56.465750 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 12 23:41:56.471700 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 23:41:56.478120 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 12 23:41:56.492016 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 12 23:41:56.497053 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 12 23:41:56.500839 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 23:41:56.504182 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 23:41:56.515069 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 12 23:41:56.522580 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:41:56.524153 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:41:56.527096 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:41:56.532246 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:41:56.533038 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:41:56.533200 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:41:56.535636 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:41:56.536854 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:41:56.536950 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:41:56.539519 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 12 23:41:56.545282 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:41:56.548306 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 23:41:56.549993 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:41:56.550120 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:41:56.558255 systemd[1]: Finished ensure-sysext.service.
Mar 12 23:41:56.582678 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 12 23:41:56.585502 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:41:56.585789 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:41:56.594692 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:41:56.600032 systemd-udevd[1384]: Using default interface naming scheme 'v255'.
Mar 12 23:41:56.603022 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:41:56.605434 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 23:41:56.608886 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 12 23:41:56.610897 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 12 23:41:56.616687 augenrules[1409]: No rules
Mar 12 23:41:56.613091 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 12 23:41:56.616143 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 12 23:41:56.616334 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 12 23:41:56.617339 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:41:56.617572 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:41:56.619269 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 23:41:56.619413 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 23:41:56.626012 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 23:41:56.628076 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 12 23:41:56.629977 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 12 23:41:56.640217 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 23:41:56.650980 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 23:41:56.672114 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 12 23:41:56.687556 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 12 23:41:56.792037 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 12 23:41:56.871863 kernel: mousedev: PS/2 mouse device common for all mice
Mar 12 23:41:56.956580 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 12 23:41:56.959993 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 12 23:41:56.995227 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 12 23:41:56.996283 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 23:41:56.998937 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 23:41:57.003733 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 23:41:57.006895 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 23:41:57.007774 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 23:41:57.008968 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 12 23:41:57.009008 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 12 23:41:57.027336 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 12 23:41:57.032752 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 23:41:57.033038 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 23:41:57.036080 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 23:41:57.036579 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 23:41:57.041647 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 23:41:57.073983 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 23:41:57.076941 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 23:41:57.078194 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 23:41:57.085954 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Mar 12 23:41:57.086881 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 12 23:41:57.086980 kernel: [drm] features: -context_init
Mar 12 23:41:57.087991 kernel: [drm] number of scanouts: 1
Mar 12 23:41:57.088091 kernel: [drm] number of cap sets: 0
Mar 12 23:41:57.091085 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Mar 12 23:41:57.099334 kernel: Console: switching to colour frame buffer device 160x50
Mar 12 23:41:57.110881 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 12 23:41:57.129958 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:41:57.233580 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 23:41:57.236764 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:41:57.241044 systemd-networkd[1432]: lo: Link UP
Mar 12 23:41:57.241059 systemd-networkd[1432]: lo: Gained carrier
Mar 12 23:41:57.244221 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 12 23:41:57.249143 systemd-networkd[1432]: Enumeration completed
Mar 12 23:41:57.249607 systemd-networkd[1432]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:57.249610 systemd-networkd[1432]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 23:41:57.250395 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:41:57.251332 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 23:41:57.257034 systemd-networkd[1432]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:57.257045 systemd-networkd[1432]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 23:41:57.257447 systemd-networkd[1432]: eth0: Link UP
Mar 12 23:41:57.257545 systemd-networkd[1432]: eth0: Gained carrier
Mar 12 23:41:57.257561 systemd-networkd[1432]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:57.261320 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 12 23:41:57.266142 systemd-networkd[1432]: eth1: Link UP
Mar 12 23:41:57.267129 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 12 23:41:57.270262 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 12 23:41:57.271335 systemd[1]: Reached target time-set.target - System Time Set.
Mar 12 23:41:57.272227 systemd-networkd[1432]: eth1: Gained carrier
Mar 12 23:41:57.272261 systemd-networkd[1432]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:41:57.306954 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 12 23:41:57.311967 systemd-networkd[1432]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 12 23:41:57.312515 systemd-timesyncd[1397]: Network configuration changed, trying to establish connection.
Mar 12 23:41:57.316957 systemd-networkd[1432]: eth0: DHCPv4 address 49.13.116.83/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 12 23:41:57.326471 systemd-resolved[1382]: Positive Trust Anchors:
Mar 12 23:41:57.326492 systemd-resolved[1382]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 23:41:57.326524 systemd-resolved[1382]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 23:41:57.333439 systemd-resolved[1382]: Using system hostname 'ci-4459-2-4-n-69ffcbf899'.
Mar 12 23:41:57.336191 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 23:41:57.336943 systemd[1]: Reached target network.target - Network.
Mar 12 23:41:57.337420 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 23:41:57.359407 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:41:57.361140 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 12 23:41:57.361866 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 12 23:41:57.362533 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 12 23:41:57.364203 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 12 23:41:57.364919 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 12 23:41:57.365609 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 12 23:41:57.366379 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 12 23:41:57.366409 systemd[1]: Reached target paths.target - Path Units. Mar 12 23:41:57.366959 systemd[1]: Reached target timers.target - Timer Units. Mar 12 23:41:57.368779 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 12 23:41:57.372245 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 12 23:41:57.376153 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 12 23:41:57.378287 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 12 23:41:57.379045 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 12 23:41:57.389151 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 12 23:41:57.391318 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 12 23:41:57.394328 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 12 23:41:57.395931 systemd[1]: Reached target sockets.target - Socket Units. Mar 12 23:41:57.396505 systemd[1]: Reached target basic.target - Basic System. Mar 12 23:41:57.397160 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 12 23:41:57.397196 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 12 23:41:57.398493 systemd[1]: Starting containerd.service - containerd container runtime... Mar 12 23:41:57.401983 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 12 23:41:57.404064 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 12 23:41:57.409034 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 12 23:41:57.410731 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 12 23:41:57.415177 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
Mar 12 23:41:57.415791 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 12 23:41:57.417905 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 12 23:41:57.423288 systemd-timesyncd[1397]: Contacted time server 194.59.205.229:123 (1.flatcar.pool.ntp.org). Mar 12 23:41:57.423452 systemd-timesyncd[1397]: Initial clock synchronization to Thu 2026-03-12 23:41:57.782702 UTC. Mar 12 23:41:57.428028 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 12 23:41:57.431359 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 12 23:41:57.433774 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 12 23:41:57.440132 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 12 23:41:57.448055 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 12 23:41:57.449733 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 12 23:41:57.455492 extend-filesystems[1524]: Found /dev/sda6 Mar 12 23:41:57.459199 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 12 23:41:57.461664 jq[1523]: false Mar 12 23:41:57.464204 systemd[1]: Starting update-engine.service - Update Engine... Mar 12 23:41:57.467044 extend-filesystems[1524]: Found /dev/sda9 Mar 12 23:41:57.470161 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
Mar 12 23:41:57.472076 coreos-metadata[1520]: Mar 12 23:41:57.471 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 12 23:41:57.473180 coreos-metadata[1520]: Mar 12 23:41:57.472 INFO Fetch successful Mar 12 23:41:57.473180 coreos-metadata[1520]: Mar 12 23:41:57.473 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Mar 12 23:41:57.474118 coreos-metadata[1520]: Mar 12 23:41:57.473 INFO Fetch successful Mar 12 23:41:57.484855 extend-filesystems[1524]: Checking size of /dev/sda9 Mar 12 23:41:57.488991 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 12 23:41:57.491213 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 12 23:41:57.492010 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 12 23:41:57.493204 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 12 23:41:57.493368 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 12 23:41:57.518026 jq[1539]: true Mar 12 23:41:57.526776 tar[1551]: linux-arm64/LICENSE Mar 12 23:41:57.526776 tar[1551]: linux-arm64/helm Mar 12 23:41:57.527103 extend-filesystems[1524]: Resized partition /dev/sda9 Mar 12 23:41:57.534927 extend-filesystems[1567]: resize2fs 1.47.3 (8-Jul-2025) Mar 12 23:41:57.536831 systemd[1]: motdgen.service: Deactivated successfully. Mar 12 23:41:57.538288 (ntainerd)[1553]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 12 23:41:57.540911 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 12 23:41:57.545857 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Mar 12 23:41:57.574849 dbus-daemon[1521]: [system] SELinux support is enabled Mar 12 23:41:57.575062 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Mar 12 23:41:57.579238 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 12 23:41:57.579268 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 12 23:41:57.583062 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 12 23:41:57.583088 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 12 23:41:57.592399 jq[1568]: true Mar 12 23:41:57.611896 update_engine[1538]: I20260312 23:41:57.608166 1538 main.cc:92] Flatcar Update Engine starting Mar 12 23:41:57.616369 systemd[1]: Started update-engine.service - Update Engine. Mar 12 23:41:57.622277 update_engine[1538]: I20260312 23:41:57.622110 1538 update_check_scheduler.cc:74] Next update check in 6m30s Mar 12 23:41:57.629075 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 12 23:41:57.667854 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Mar 12 23:41:57.678020 extend-filesystems[1567]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 12 23:41:57.678020 extend-filesystems[1567]: old_desc_blocks = 1, new_desc_blocks = 5 Mar 12 23:41:57.678020 extend-filesystems[1567]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Mar 12 23:41:57.688290 extend-filesystems[1524]: Resized filesystem in /dev/sda9 Mar 12 23:41:57.679032 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 12 23:41:57.680001 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 12 23:41:57.700888 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. 
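The extend-filesystems/resize2fs lines above report the root filesystem growing from 1,617,920 to 9,393,147 blocks with the 4 KiB block size stated in the EXT4-fs kernel message. A quick arithmetic check of what those block counts mean in GiB:

```python
# Block counts taken from the resize2fs output above; 4 KiB block size
# from the "EXT4-fs (sda9): ... (4k) blocks" kernel message.
BLOCK_SIZE = 4096

old_blocks, new_blocks = 1617920, 9393147
old_gib = old_blocks * BLOCK_SIZE / 2**30
new_gib = new_blocks * BLOCK_SIZE / 2**30

print(f"{old_gib:.1f} GiB -> {new_gib:.1f} GiB")  # roughly 6.2 GiB -> 35.8 GiB
```

This matches the usual Flatcar first-boot behavior of growing the root partition's filesystem online to fill the provisioned disk.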
Mar 12 23:41:57.702370 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 12 23:41:57.733850 bash[1597]: Updated "/home/core/.ssh/authorized_keys" Mar 12 23:41:57.735881 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 12 23:41:57.743234 systemd[1]: Starting sshkeys.service... Mar 12 23:41:57.757731 systemd-logind[1535]: New seat seat0. Mar 12 23:41:57.764024 systemd-logind[1535]: Watching system buttons on /dev/input/event0 (Power Button) Mar 12 23:41:57.764052 systemd-logind[1535]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Mar 12 23:41:57.764268 systemd[1]: Started systemd-logind.service - User Login Management. Mar 12 23:41:57.808448 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 12 23:41:57.812882 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Mar 12 23:41:57.878112 containerd[1553]: time="2026-03-12T23:41:57Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 12 23:41:57.881468 containerd[1553]: time="2026-03-12T23:41:57.881425600Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.909917520Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.04µs" Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.909957000Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.909977480Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910130640Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910145520Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910171240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910237480Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910248720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 12 
23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910469560Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910485720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910502120Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912466 containerd[1553]: time="2026-03-12T23:41:57.910510440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912805 containerd[1553]: time="2026-03-12T23:41:57.910576520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912805 containerd[1553]: time="2026-03-12T23:41:57.910773800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 12 23:41:57.912805 containerd[1553]: time="2026-03-12T23:41:57.910801120Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 12 23:41:57.926572 containerd[1553]: time="2026-03-12T23:41:57.910812200Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 12 23:41:57.926572 containerd[1553]: time="2026-03-12T23:41:57.926563600Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 12 23:41:57.927181 
containerd[1553]: time="2026-03-12T23:41:57.926951840Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 12 23:41:57.927181 containerd[1553]: time="2026-03-12T23:41:57.927070080Z" level=info msg="metadata content store policy set" policy=shared Mar 12 23:41:57.942367 coreos-metadata[1604]: Mar 12 23:41:57.942 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Mar 12 23:41:57.947342 containerd[1553]: time="2026-03-12T23:41:57.947289680Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 12 23:41:57.947432 containerd[1553]: time="2026-03-12T23:41:57.947368240Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 12 23:41:57.947432 containerd[1553]: time="2026-03-12T23:41:57.947384280Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 12 23:41:57.947432 containerd[1553]: time="2026-03-12T23:41:57.947396480Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 12 23:41:57.947432 containerd[1553]: time="2026-03-12T23:41:57.947410000Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 12 23:41:57.947432 containerd[1553]: time="2026-03-12T23:41:57.947423280Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 12 23:41:57.947535 containerd[1553]: time="2026-03-12T23:41:57.947439080Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 12 23:41:57.947535 containerd[1553]: time="2026-03-12T23:41:57.947451200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 12 23:41:57.947535 containerd[1553]: time="2026-03-12T23:41:57.947471640Z" level=info msg="loading 
plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 12 23:41:57.947535 containerd[1553]: time="2026-03-12T23:41:57.947485480Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 12 23:41:57.947535 containerd[1553]: time="2026-03-12T23:41:57.947495440Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 12 23:41:57.947535 containerd[1553]: time="2026-03-12T23:41:57.947509920Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 12 23:41:57.947848 containerd[1553]: time="2026-03-12T23:41:57.947785600Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947906040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947931160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947943800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947955920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947966960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947979280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.947990280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 
Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.948002560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.948016320Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.948036000Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.948234800Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.948255080Z" level=info msg="Start snapshots syncer" Mar 12 23:41:57.948860 containerd[1553]: time="2026-03-12T23:41:57.948278920Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 12 23:41:57.949157 containerd[1553]: time="2026-03-12T23:41:57.948565520Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 12 23:41:57.949157 containerd[1553]: time="2026-03-12T23:41:57.948661080Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 12 23:41:57.949252 containerd[1553]: time="2026-03-12T23:41:57.948709360Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 12 23:41:57.949439 containerd[1553]: time="2026-03-12T23:41:57.949308040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 12 23:41:57.949439 containerd[1553]: time="2026-03-12T23:41:57.949349440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 12 23:41:57.949439 containerd[1553]: time="2026-03-12T23:41:57.949385760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 12 23:41:57.949439 containerd[1553]: time="2026-03-12T23:41:57.949395920Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 12 23:41:57.949439 containerd[1553]: time="2026-03-12T23:41:57.949407000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 12 23:41:57.949439 containerd[1553]: time="2026-03-12T23:41:57.949417360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 12 23:41:57.949639 containerd[1553]: time="2026-03-12T23:41:57.949427800Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 12 23:41:57.949639 containerd[1553]: time="2026-03-12T23:41:57.949603280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 12 23:41:57.949639 containerd[1553]: time="2026-03-12T23:41:57.949621680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 12 23:41:57.949793 containerd[1553]: time="2026-03-12T23:41:57.949727720Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 12 23:41:57.949917 coreos-metadata[1604]: Mar 12 23:41:57.949 INFO Fetch successful Mar 12 23:41:57.950232 
containerd[1553]: time="2026-03-12T23:41:57.949779840Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.949995320Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950011080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950020920Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950030440Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950060080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950073800Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950160640Z" level=info msg="runtime interface created" Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950165920Z" level=info msg="created NRI interface" Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950173720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950187640Z" level=info msg="Connect containerd service" Mar 12 23:41:57.950232 containerd[1553]: time="2026-03-12T23:41:57.950214480Z" level=info msg="using experimental 
NRI integration - disable nri plugin to prevent this" Mar 12 23:41:57.956124 containerd[1553]: time="2026-03-12T23:41:57.956081600Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 12 23:41:57.956729 unknown[1604]: wrote ssh authorized keys file for user: core Mar 12 23:41:58.007199 update-ssh-keys[1619]: Updated "/home/core/.ssh/authorized_keys" Mar 12 23:41:58.009158 locksmithd[1578]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 12 23:41:58.012429 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 12 23:41:58.016425 systemd[1]: Finished sshkeys.service. Mar 12 23:41:58.115225 containerd[1553]: time="2026-03-12T23:41:58.115010586Z" level=info msg="Start subscribing containerd event" Mar 12 23:41:58.115414 containerd[1553]: time="2026-03-12T23:41:58.115398300Z" level=info msg="Start recovering state" Mar 12 23:41:58.115613 containerd[1553]: time="2026-03-12T23:41:58.115597926Z" level=info msg="Start event monitor" Mar 12 23:41:58.115744 containerd[1553]: time="2026-03-12T23:41:58.115728627Z" level=info msg="Start cni network conf syncer for default" Mar 12 23:41:58.116633 containerd[1553]: time="2026-03-12T23:41:58.115810926Z" level=info msg="Start streaming server" Mar 12 23:41:58.116633 containerd[1553]: time="2026-03-12T23:41:58.116404452Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 12 23:41:58.116633 containerd[1553]: time="2026-03-12T23:41:58.116414693Z" level=info msg="runtime interface starting up..." Mar 12 23:41:58.116633 containerd[1553]: time="2026-03-12T23:41:58.116421381Z" level=info msg="starting plugins..." 
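The `config="{...}"` line emitted by the cri plugin at 23:41:57.948565 is one long escaped JSON document. A sketch of how to make such a dump readable, using a small excerpt that reproduces a few fields actually present in the logged blob (`defaultRuntimeName`, `SystemdCgroup`, the CNI paths); the full dump can be fed through the same call:

```python
import json

# Excerpt of the cri plugin config dump from the log above; the real dump
# is a single line with escaped quotes, which json.loads handles directly.
excerpt = (
    '{"containerd":{"defaultRuntimeName":"runc","runtimes":{"runc":'
    '{"runtimeType":"io.containerd.runc.v2","options":{"SystemdCgroup":true}}}},'
    '"cni":{"binDir":"/opt/cni/bin","confDir":"/etc/cni/net.d"}}'
)

config = json.loads(excerpt)
print(json.dumps(config, indent=2, sort_keys=True))

# e.g. confirm the runc runtime is using the systemd cgroup driver,
# as the logged config states:
print(config["containerd"]["runtimes"]["runc"]["options"]["SystemdCgroup"])
```

The `SystemdCgroup: true` setting here is also why the "failed to load cni during init" error a few lines later is expected at this stage: the CNI config directory `/etc/cni/net.d` is only populated once a network plugin is installed.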
Mar 12 23:41:58.116633 containerd[1553]: time="2026-03-12T23:41:58.116444118Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 12 23:41:58.117003 containerd[1553]: time="2026-03-12T23:41:58.116969765Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 12 23:41:58.117055 containerd[1553]: time="2026-03-12T23:41:58.117023308Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 12 23:41:58.117911 containerd[1553]: time="2026-03-12T23:41:58.117087676Z" level=info msg="containerd successfully booted in 0.239382s" Mar 12 23:41:58.117200 systemd[1]: Started containerd.service - containerd container runtime. Mar 12 23:41:58.263140 tar[1551]: linux-arm64/README.md Mar 12 23:41:58.283318 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 12 23:41:58.324078 systemd-networkd[1432]: eth0: Gained IPv6LL Mar 12 23:41:58.329119 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 12 23:41:58.331217 systemd[1]: Reached target network-online.target - Network is Online. Mar 12 23:41:58.337101 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:41:58.340146 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 12 23:41:58.385331 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 12 23:41:58.389625 systemd-networkd[1432]: eth1: Gained IPv6LL Mar 12 23:41:59.101029 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 12 23:41:59.110349 (kubelet)[1651]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:41:59.567112 kubelet[1651]: E0312 23:41:59.566990 1651 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:41:59.569907 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:41:59.570222 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:41:59.571960 systemd[1]: kubelet.service: Consumed 775ms CPU time, 247.4M memory peak. Mar 12 23:41:59.696611 sshd_keygen[1559]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 12 23:41:59.725766 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 12 23:41:59.730970 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 12 23:41:59.760255 systemd[1]: issuegen.service: Deactivated successfully. Mar 12 23:41:59.760987 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 12 23:41:59.765910 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 12 23:41:59.783057 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 12 23:41:59.789410 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 12 23:41:59.794070 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 12 23:41:59.795309 systemd[1]: Reached target getty.target - Login Prompts. Mar 12 23:41:59.797189 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 12 23:41:59.799371 systemd[1]: Startup finished in 2.400s (kernel) + 5.020s (initrd) + 5.222s (userspace) = 12.643s. 
Mar 12 23:42:09.720046 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 12 23:42:09.723631 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:42:09.887725 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:09.900787 (kubelet)[1687]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:42:09.949986 kubelet[1687]: E0312 23:42:09.949922 1687 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:42:09.953206 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:42:09.953338 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:42:09.953986 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.8M memory peak. Mar 12 23:42:19.970168 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 12 23:42:19.974066 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:42:20.135849 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 12 23:42:20.144530 (kubelet)[1702]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:42:20.194889 kubelet[1702]: E0312 23:42:20.194804 1702 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:42:20.197560 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:42:20.197768 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:42:20.199965 systemd[1]: kubelet.service: Consumed 164ms CPU time, 104.6M memory peak. Mar 12 23:42:30.220050 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 12 23:42:30.225067 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:42:30.389983 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:30.405670 (kubelet)[1718]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:42:30.447533 kubelet[1718]: E0312 23:42:30.447465 1718 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:42:30.449725 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:42:30.449889 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:42:30.450442 systemd[1]: kubelet.service: Consumed 165ms CPU time, 106.8M memory peak. 
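The kubelet crash loop above (restart counter 1 through 3, one attempt roughly every ten seconds) is caused by the missing `/var/lib/kubelet/config.yaml` named in each error. On a real node that file is normally written by `kubeadm join` or an equivalent provisioning step, after which the restarts succeed. A minimal illustrative sketch of such a file, using the real `kubelet.config.k8s.io/v1beta1` schema; the specific values below are assumptions, not taken from this boot log:

```yaml
# /var/lib/kubelet/config.yaml -- illustrative minimal KubeletConfiguration.
# Normally generated by `kubeadm join`; values here are example assumptions.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd        # matches SystemdCgroup=true in the containerd config dump above
clusterDomain: cluster.local
clusterDNS:
  - 10.96.0.10               # assumed cluster DNS service IP
```

Until the node is joined, systemd simply keeps rescheduling the unit, which is the pattern visible in the log.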
Mar 12 23:42:36.630879 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 12 23:42:36.632707 systemd[1]: Started sshd@0-49.13.116.83:22-20.161.92.111:54358.service - OpenSSH per-connection server daemon (20.161.92.111:54358). Mar 12 23:42:37.190297 sshd[1725]: Accepted publickey for core from 20.161.92.111 port 54358 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:37.193603 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:37.212241 systemd-logind[1535]: New session 1 of user core. Mar 12 23:42:37.213349 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 12 23:42:37.214795 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 12 23:42:37.253687 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 12 23:42:37.257746 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 12 23:42:37.278303 (systemd)[1730]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 12 23:42:37.282561 systemd-logind[1535]: New session c1 of user core. Mar 12 23:42:37.431322 systemd[1730]: Queued start job for default target default.target. Mar 12 23:42:37.440506 systemd[1730]: Created slice app.slice - User Application Slice. Mar 12 23:42:37.440569 systemd[1730]: Reached target paths.target - Paths. Mar 12 23:42:37.440642 systemd[1730]: Reached target timers.target - Timers. Mar 12 23:42:37.443126 systemd[1730]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 12 23:42:37.476741 systemd[1730]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 12 23:42:37.477021 systemd[1730]: Reached target sockets.target - Sockets. Mar 12 23:42:37.477127 systemd[1730]: Reached target basic.target - Basic System. Mar 12 23:42:37.477163 systemd[1730]: Reached target default.target - Main User Target. 
Mar 12 23:42:37.477197 systemd[1730]: Startup finished in 185ms. Mar 12 23:42:37.477598 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 12 23:42:37.486205 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 12 23:42:37.792052 systemd[1]: Started sshd@1-49.13.116.83:22-20.161.92.111:54366.service - OpenSSH per-connection server daemon (20.161.92.111:54366). Mar 12 23:42:38.323297 sshd[1741]: Accepted publickey for core from 20.161.92.111 port 54366 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:38.326372 sshd-session[1741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:38.332984 systemd-logind[1535]: New session 2 of user core. Mar 12 23:42:38.345222 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 12 23:42:38.613133 sshd[1744]: Connection closed by 20.161.92.111 port 54366 Mar 12 23:42:38.614404 sshd-session[1741]: pam_unix(sshd:session): session closed for user core Mar 12 23:42:38.621716 systemd-logind[1535]: Session 2 logged out. Waiting for processes to exit. Mar 12 23:42:38.622503 systemd[1]: sshd@1-49.13.116.83:22-20.161.92.111:54366.service: Deactivated successfully. Mar 12 23:42:38.626124 systemd[1]: session-2.scope: Deactivated successfully. Mar 12 23:42:38.627996 systemd-logind[1535]: Removed session 2. Mar 12 23:42:38.719895 systemd[1]: Started sshd@2-49.13.116.83:22-20.161.92.111:54378.service - OpenSSH per-connection server daemon (20.161.92.111:54378). Mar 12 23:42:39.250671 sshd[1750]: Accepted publickey for core from 20.161.92.111 port 54378 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:39.253559 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:39.258884 systemd-logind[1535]: New session 3 of user core. Mar 12 23:42:39.268157 systemd[1]: Started session-3.scope - Session 3 of User core. 
Mar 12 23:42:39.534612 sshd[1753]: Connection closed by 20.161.92.111 port 54378 Mar 12 23:42:39.533448 sshd-session[1750]: pam_unix(sshd:session): session closed for user core Mar 12 23:42:39.539155 systemd[1]: sshd@2-49.13.116.83:22-20.161.92.111:54378.service: Deactivated successfully. Mar 12 23:42:39.541587 systemd[1]: session-3.scope: Deactivated successfully. Mar 12 23:42:39.542859 systemd-logind[1535]: Session 3 logged out. Waiting for processes to exit. Mar 12 23:42:39.545021 systemd-logind[1535]: Removed session 3. Mar 12 23:42:39.639606 systemd[1]: Started sshd@3-49.13.116.83:22-20.161.92.111:54388.service - OpenSSH per-connection server daemon (20.161.92.111:54388). Mar 12 23:42:40.172931 sshd[1759]: Accepted publickey for core from 20.161.92.111 port 54388 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:40.175402 sshd-session[1759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:40.181899 systemd-logind[1535]: New session 4 of user core. Mar 12 23:42:40.192152 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 12 23:42:40.462074 sshd[1762]: Connection closed by 20.161.92.111 port 54388 Mar 12 23:42:40.463318 sshd-session[1759]: pam_unix(sshd:session): session closed for user core Mar 12 23:42:40.468836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 12 23:42:40.469695 systemd[1]: sshd@3-49.13.116.83:22-20.161.92.111:54388.service: Deactivated successfully. Mar 12 23:42:40.473948 systemd[1]: session-4.scope: Deactivated successfully. Mar 12 23:42:40.475206 systemd-logind[1535]: Session 4 logged out. Waiting for processes to exit. Mar 12 23:42:40.480139 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:42:40.481804 systemd-logind[1535]: Removed session 4. 
Mar 12 23:42:40.576112 systemd[1]: Started sshd@4-49.13.116.83:22-20.161.92.111:47392.service - OpenSSH per-connection server daemon (20.161.92.111:47392). Mar 12 23:42:40.640939 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:40.656386 (kubelet)[1779]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:42:40.694642 kubelet[1779]: E0312 23:42:40.694589 1779 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:42:40.697205 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:42:40.697472 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:42:40.698070 systemd[1]: kubelet.service: Consumed 163ms CPU time, 107.2M memory peak. Mar 12 23:42:41.109854 sshd[1771]: Accepted publickey for core from 20.161.92.111 port 47392 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:41.112304 sshd-session[1771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:41.117972 systemd-logind[1535]: New session 5 of user core. Mar 12 23:42:41.126449 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 12 23:42:41.322043 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 12 23:42:41.322331 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:42:41.335705 sudo[1786]: pam_unix(sudo:session): session closed for user root Mar 12 23:42:41.433654 sshd[1785]: Connection closed by 20.161.92.111 port 47392 Mar 12 23:42:41.435148 sshd-session[1771]: pam_unix(sshd:session): session closed for user core Mar 12 23:42:41.440215 systemd[1]: sshd@4-49.13.116.83:22-20.161.92.111:47392.service: Deactivated successfully. Mar 12 23:42:41.443641 systemd[1]: session-5.scope: Deactivated successfully. Mar 12 23:42:41.444863 systemd-logind[1535]: Session 5 logged out. Waiting for processes to exit. Mar 12 23:42:41.446745 systemd-logind[1535]: Removed session 5. Mar 12 23:42:41.542080 systemd[1]: Started sshd@5-49.13.116.83:22-20.161.92.111:47406.service - OpenSSH per-connection server daemon (20.161.92.111:47406). Mar 12 23:42:42.062918 sshd[1792]: Accepted publickey for core from 20.161.92.111 port 47406 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:42.065045 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:42.071912 systemd-logind[1535]: New session 6 of user core. Mar 12 23:42:42.077107 systemd[1]: Started session-6.scope - Session 6 of User core. 
Mar 12 23:42:42.256128 sudo[1797]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 12 23:42:42.256369 sudo[1797]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:42:42.262890 sudo[1797]: pam_unix(sudo:session): session closed for user root Mar 12 23:42:42.270252 sudo[1796]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 12 23:42:42.270795 sudo[1796]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:42:42.285985 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 12 23:42:42.339884 augenrules[1819]: No rules Mar 12 23:42:42.341810 systemd[1]: audit-rules.service: Deactivated successfully. Mar 12 23:42:42.342051 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 12 23:42:42.345548 sudo[1796]: pam_unix(sudo:session): session closed for user root Mar 12 23:42:42.440808 sshd[1795]: Connection closed by 20.161.92.111 port 47406 Mar 12 23:42:42.439899 sshd-session[1792]: pam_unix(sshd:session): session closed for user core Mar 12 23:42:42.445842 systemd[1]: sshd@5-49.13.116.83:22-20.161.92.111:47406.service: Deactivated successfully. Mar 12 23:42:42.448388 systemd[1]: session-6.scope: Deactivated successfully. Mar 12 23:42:42.449662 systemd-logind[1535]: Session 6 logged out. Waiting for processes to exit. Mar 12 23:42:42.452411 systemd-logind[1535]: Removed session 6. Mar 12 23:42:42.549322 systemd[1]: Started sshd@6-49.13.116.83:22-20.161.92.111:47420.service - OpenSSH per-connection server daemon (20.161.92.111:47420). 
Mar 12 23:42:43.071498 sshd[1828]: Accepted publickey for core from 20.161.92.111 port 47420 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:42:43.073845 sshd-session[1828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:42:43.079891 systemd-logind[1535]: New session 7 of user core. Mar 12 23:42:43.095567 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 12 23:42:43.156790 update_engine[1538]: I20260312 23:42:43.155929 1538 update_attempter.cc:509] Updating boot flags... Mar 12 23:42:43.276246 sudo[1849]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 12 23:42:43.276621 sudo[1849]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:42:43.684619 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 12 23:42:43.696541 (dockerd)[1874]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 12 23:42:43.940853 dockerd[1874]: time="2026-03-12T23:42:43.938380053Z" level=info msg="Starting up" Mar 12 23:42:43.942249 dockerd[1874]: time="2026-03-12T23:42:43.942219822Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 12 23:42:43.955562 dockerd[1874]: time="2026-03-12T23:42:43.955516588Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 12 23:42:43.976062 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1233689202-merged.mount: Deactivated successfully. Mar 12 23:42:43.997186 dockerd[1874]: time="2026-03-12T23:42:43.997114111Z" level=info msg="Loading containers: start." 
Mar 12 23:42:44.008981 kernel: Initializing XFRM netlink socket Mar 12 23:42:44.273934 systemd-networkd[1432]: docker0: Link UP Mar 12 23:42:44.278767 dockerd[1874]: time="2026-03-12T23:42:44.278701949Z" level=info msg="Loading containers: done." Mar 12 23:42:44.298196 dockerd[1874]: time="2026-03-12T23:42:44.298152875Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 12 23:42:44.298421 dockerd[1874]: time="2026-03-12T23:42:44.298246876Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 12 23:42:44.298421 dockerd[1874]: time="2026-03-12T23:42:44.298334795Z" level=info msg="Initializing buildkit" Mar 12 23:42:44.322005 dockerd[1874]: time="2026-03-12T23:42:44.321944784Z" level=info msg="Completed buildkit initialization" Mar 12 23:42:44.329163 dockerd[1874]: time="2026-03-12T23:42:44.329097960Z" level=info msg="Daemon has completed initialization" Mar 12 23:42:44.329882 dockerd[1874]: time="2026-03-12T23:42:44.329378523Z" level=info msg="API listen on /run/docker.sock" Mar 12 23:42:44.334011 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 12 23:42:44.771942 containerd[1553]: time="2026-03-12T23:42:44.771388040Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\"" Mar 12 23:42:45.388032 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4109481590.mount: Deactivated successfully. 
Mar 12 23:42:46.234710 containerd[1553]: time="2026-03-12T23:42:46.234628442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:46.237638 containerd[1553]: time="2026-03-12T23:42:46.236983298Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=24701894" Mar 12 23:42:46.238634 containerd[1553]: time="2026-03-12T23:42:46.238589337Z" level=info msg="ImageCreate event name:\"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:46.243527 containerd[1553]: time="2026-03-12T23:42:46.243464477Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:46.245106 containerd[1553]: time="2026-03-12T23:42:46.245066154Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"24698395\" in 1.473617208s" Mar 12 23:42:46.245234 containerd[1553]: time="2026-03-12T23:42:46.245216414Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:713a7d5fc5ed8383c9ffe550e487150c9818d05f0c4c012688fbb27885fcc7bf\"" Mar 12 23:42:46.246210 containerd[1553]: time="2026-03-12T23:42:46.246133618Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\"" Mar 12 23:42:47.224219 containerd[1553]: time="2026-03-12T23:42:47.224144847Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:47.225686 containerd[1553]: time="2026-03-12T23:42:47.225640175Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=19063059" Mar 12 23:42:47.226922 containerd[1553]: time="2026-03-12T23:42:47.226891730Z" level=info msg="ImageCreate event name:\"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:47.230480 containerd[1553]: time="2026-03-12T23:42:47.230430552Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:47.231803 containerd[1553]: time="2026-03-12T23:42:47.231679386Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"20675140\" in 985.487985ms" Mar 12 23:42:47.231803 containerd[1553]: time="2026-03-12T23:42:47.231715280Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:6137f51959af5f0a4da7fb6c0bd868f615a534c02d42e303ad6fb31345ee4854\"" Mar 12 23:42:47.233599 containerd[1553]: time="2026-03-12T23:42:47.233384833Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\"" Mar 12 23:42:48.114865 containerd[1553]: time="2026-03-12T23:42:48.114042130Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:48.117087 containerd[1553]: time="2026-03-12T23:42:48.117042736Z" 
level=info msg="ImageCreate event name:\"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:48.117256 containerd[1553]: time="2026-03-12T23:42:48.117236967Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=13797921" Mar 12 23:42:48.122342 containerd[1553]: time="2026-03-12T23:42:48.122305362Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:48.123645 containerd[1553]: time="2026-03-12T23:42:48.123272112Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"15410020\" in 889.849505ms" Mar 12 23:42:48.123645 containerd[1553]: time="2026-03-12T23:42:48.123310726Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:6ad431b09accba3ccc8ac6df4b239aa11c7adf8ee0a477b9f0b54cf9f083f8c6\"" Mar 12 23:42:48.124036 containerd[1553]: time="2026-03-12T23:42:48.124009259Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\"" Mar 12 23:42:49.073716 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount58089818.mount: Deactivated successfully. 
Mar 12 23:42:49.308502 containerd[1553]: time="2026-03-12T23:42:49.308389984Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:49.309897 containerd[1553]: time="2026-03-12T23:42:49.309795111Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=22329609" Mar 12 23:42:49.310625 containerd[1553]: time="2026-03-12T23:42:49.310538448Z" level=info msg="ImageCreate event name:\"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:49.313079 containerd[1553]: time="2026-03-12T23:42:49.313031390Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:49.314419 containerd[1553]: time="2026-03-12T23:42:49.314296828Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"22328602\" in 1.190190894s" Mar 12 23:42:49.314419 containerd[1553]: time="2026-03-12T23:42:49.314328959Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:df7dcaf93e84e5dfbe96b2f86588b38a8959748d9c84b2e0532e2b5ae1bc5884\"" Mar 12 23:42:49.315091 containerd[1553]: time="2026-03-12T23:42:49.315010915Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\"" Mar 12 23:42:49.882983 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1522755458.mount: Deactivated successfully. 
Mar 12 23:42:50.608029 containerd[1553]: time="2026-03-12T23:42:50.607924212Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:50.610037 containerd[1553]: time="2026-03-12T23:42:50.609972530Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=21172309" Mar 12 23:42:50.610970 containerd[1553]: time="2026-03-12T23:42:50.610864305Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:50.614484 containerd[1553]: time="2026-03-12T23:42:50.614412158Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:50.616502 containerd[1553]: time="2026-03-12T23:42:50.616366085Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.301089798s" Mar 12 23:42:50.616502 containerd[1553]: time="2026-03-12T23:42:50.616403097Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\"" Mar 12 23:42:50.617431 containerd[1553]: time="2026-03-12T23:42:50.617291711Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 12 23:42:50.719803 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 12 23:42:50.723464 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 12 23:42:50.877370 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:50.890873 (kubelet)[2222]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:42:50.935675 kubelet[2222]: E0312 23:42:50.935594 2222 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:42:50.938741 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:42:50.938919 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:42:50.939517 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.9M memory peak. Mar 12 23:42:51.179047 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1338538872.mount: Deactivated successfully. 
Mar 12 23:42:51.185333 containerd[1553]: time="2026-03-12T23:42:51.185269933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:51.186171 containerd[1553]: time="2026-03-12T23:42:51.186005526Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729" Mar 12 23:42:51.187129 containerd[1553]: time="2026-03-12T23:42:51.187084227Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:51.189761 containerd[1553]: time="2026-03-12T23:42:51.189668605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:51.190643 containerd[1553]: time="2026-03-12T23:42:51.190360104Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 573.018898ms" Mar 12 23:42:51.190643 containerd[1553]: time="2026-03-12T23:42:51.190396916Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Mar 12 23:42:51.191856 containerd[1553]: time="2026-03-12T23:42:51.191483540Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\"" Mar 12 23:42:51.744435 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3625916100.mount: Deactivated successfully. 
Mar 12 23:42:52.453120 containerd[1553]: time="2026-03-12T23:42:52.453049832Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:52.454605 containerd[1553]: time="2026-03-12T23:42:52.454307893Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21738239" Mar 12 23:42:52.455497 containerd[1553]: time="2026-03-12T23:42:52.455459282Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:52.458450 containerd[1553]: time="2026-03-12T23:42:52.458414739Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:52.459923 containerd[1553]: time="2026-03-12T23:42:52.459678722Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 1.268155209s" Mar 12 23:42:52.459923 containerd[1553]: time="2026-03-12T23:42:52.459729217Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\"" Mar 12 23:42:55.620458 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:55.621838 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.9M memory peak. Mar 12 23:42:55.626846 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:42:55.659082 systemd[1]: Reload requested from client PID 2320 ('systemctl') (unit session-7.scope)... 
Mar 12 23:42:55.659099 systemd[1]: Reloading... Mar 12 23:42:55.785865 zram_generator::config[2363]: No configuration found. Mar 12 23:42:55.975720 systemd[1]: Reloading finished in 316 ms. Mar 12 23:42:56.030223 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 12 23:42:56.030305 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 12 23:42:56.030538 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:56.030625 systemd[1]: kubelet.service: Consumed 109ms CPU time, 94.9M memory peak. Mar 12 23:42:56.032858 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:42:56.182721 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:42:56.197716 (kubelet)[2412]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 12 23:42:56.238997 kubelet[2412]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 23:42:57.035312 kubelet[2412]: I0312 23:42:57.035201 2412 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 12 23:42:57.035312 kubelet[2412]: I0312 23:42:57.035280 2412 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 23:42:57.035312 kubelet[2412]: I0312 23:42:57.035312 2412 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 12 23:42:57.035312 kubelet[2412]: I0312 23:42:57.035321 2412 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 12 23:42:57.035850 kubelet[2412]: I0312 23:42:57.035789 2412 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 12 23:42:57.047123 kubelet[2412]: E0312 23:42:57.047052 2412 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://49.13.116.83:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 49.13.116.83:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 23:42:57.048257 kubelet[2412]: I0312 23:42:57.048217 2412 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 23:42:57.052916 kubelet[2412]: I0312 23:42:57.052873 2412 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 23:42:57.055607 kubelet[2412]: I0312 23:42:57.055568 2412 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 12 23:42:57.056696 kubelet[2412]: I0312 23:42:57.056612 2412 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 23:42:57.056894 kubelet[2412]: I0312 23:42:57.056688 2412 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-69ffcbf899","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 23:42:57.056894 kubelet[2412]: I0312 23:42:57.056897 2412 topology_manager.go:143] "Creating topology manager with none policy"
Mar 12 23:42:57.057172 kubelet[2412]: I0312 23:42:57.056907 2412 container_manager_linux.go:308] "Creating device plugin manager"
Mar 12 23:42:57.057172 kubelet[2412]: I0312 23:42:57.057008 2412 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 12 23:42:57.059456 kubelet[2412]: I0312 23:42:57.059406 2412 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 12 23:42:57.059688 kubelet[2412]: I0312 23:42:57.059677 2412 kubelet.go:482] "Attempting to sync node with API server"
Mar 12 23:42:57.059717 kubelet[2412]: I0312 23:42:57.059699 2412 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 23:42:57.059748 kubelet[2412]: I0312 23:42:57.059717 2412 kubelet.go:394] "Adding apiserver pod source"
Mar 12 23:42:57.059748 kubelet[2412]: I0312 23:42:57.059727 2412 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 23:42:57.064301 kubelet[2412]: I0312 23:42:57.064276 2412 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 12 23:42:57.065406 kubelet[2412]: I0312 23:42:57.065363 2412 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 23:42:57.065406 kubelet[2412]: I0312 23:42:57.065406 2412 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 12 23:42:57.065526 kubelet[2412]: W0312 23:42:57.065450 2412 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 12 23:42:57.068582 kubelet[2412]: I0312 23:42:57.068023 2412 server.go:1257] "Started kubelet"
Mar 12 23:42:57.071125 kubelet[2412]: I0312 23:42:57.071086 2412 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 12 23:42:57.074000 kubelet[2412]: I0312 23:42:57.073925 2412 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 23:42:57.082422 kubelet[2412]: I0312 23:42:57.074803 2412 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 23:42:57.082687 kubelet[2412]: I0312 23:42:57.082665 2412 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 12 23:42:57.082786 kubelet[2412]: I0312 23:42:57.082762 2412 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 12 23:42:57.082949 kubelet[2412]: E0312 23:42:57.082928 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:42:57.083100 kubelet[2412]: I0312 23:42:57.083083 2412 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 23:42:57.083220 kubelet[2412]: I0312 23:42:57.079086 2412 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 23:42:57.083317 kubelet[2412]: I0312 23:42:57.083286 2412 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 12 23:42:57.083381 kubelet[2412]: I0312 23:42:57.083364 2412 reconciler.go:29] "Reconciler: start to sync state"
Mar 12 23:42:57.084757 kubelet[2412]: I0312 23:42:57.084680 2412 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 23:42:57.085008 kubelet[2412]: E0312 23:42:57.084875 2412 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.116.83:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-69ffcbf899?timeout=10s\": dial tcp 49.13.116.83:6443: connect: connection refused" interval="200ms"
Mar 12 23:42:57.087889 kubelet[2412]: E0312 23:42:57.086360 2412 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://49.13.116.83:6443/api/v1/namespaces/default/events\": dial tcp 49.13.116.83:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-69ffcbf899.189c3c980f632348 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-69ffcbf899,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-69ffcbf899,},FirstTimestamp:2026-03-12 23:42:57.067983688 +0000 UTC m=+0.864253896,LastTimestamp:2026-03-12 23:42:57.067983688 +0000 UTC m=+0.864253896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-69ffcbf899,}"
Mar 12 23:42:57.091309 kubelet[2412]: E0312 23:42:57.091265 2412 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 23:42:57.091758 kubelet[2412]: I0312 23:42:57.091739 2412 factory.go:223] Registration of the containerd container factory successfully
Mar 12 23:42:57.091927 kubelet[2412]: I0312 23:42:57.091915 2412 factory.go:223] Registration of the systemd container factory successfully
Mar 12 23:42:57.092080 kubelet[2412]: I0312 23:42:57.092060 2412 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 23:42:57.107370 kubelet[2412]: I0312 23:42:57.107312 2412 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 12 23:42:57.108762 kubelet[2412]: I0312 23:42:57.108697 2412 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 12 23:42:57.108762 kubelet[2412]: I0312 23:42:57.108738 2412 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 12 23:42:57.108762 kubelet[2412]: I0312 23:42:57.108768 2412 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 12 23:42:57.108941 kubelet[2412]: E0312 23:42:57.108895 2412 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 23:42:57.127093 kubelet[2412]: I0312 23:42:57.126758 2412 cpu_manager.go:225] "Starting" policy="none"
Mar 12 23:42:57.127093 kubelet[2412]: I0312 23:42:57.126777 2412 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 12 23:42:57.127093 kubelet[2412]: I0312 23:42:57.126797 2412 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 12 23:42:57.131525 kubelet[2412]: I0312 23:42:57.131491 2412 policy_none.go:50] "Start"
Mar 12 23:42:57.131525 kubelet[2412]: I0312 23:42:57.131529 2412 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 12 23:42:57.131525 kubelet[2412]: I0312 23:42:57.131544 2412 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 12 23:42:57.132957 kubelet[2412]: I0312 23:42:57.132932 2412 policy_none.go:44] "Start"
Mar 12 23:42:57.140831 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 12 23:42:57.153563 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 12 23:42:57.158111 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 12 23:42:57.168860 kubelet[2412]: E0312 23:42:57.168661 2412 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 23:42:57.169879 kubelet[2412]: I0312 23:42:57.169415 2412 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 12 23:42:57.169879 kubelet[2412]: I0312 23:42:57.169452 2412 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 23:42:57.172539 kubelet[2412]: I0312 23:42:57.172484 2412 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 12 23:42:57.173910 kubelet[2412]: E0312 23:42:57.173792 2412 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 12 23:42:57.173992 kubelet[2412]: E0312 23:42:57.173964 2412 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:42:57.228578 systemd[1]: Created slice kubepods-burstable-pod3d3aff213b3453c9d4eaa5e96e8ec1be.slice - libcontainer container kubepods-burstable-pod3d3aff213b3453c9d4eaa5e96e8ec1be.slice.
Mar 12 23:42:57.250801 kubelet[2412]: E0312 23:42:57.250426 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.253652 systemd[1]: Created slice kubepods-burstable-pod27377fa7a365616cb833098711d9876a.slice - libcontainer container kubepods-burstable-pod27377fa7a365616cb833098711d9876a.slice.
Mar 12 23:42:57.263448 kubelet[2412]: E0312 23:42:57.263335 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.268265 systemd[1]: Created slice kubepods-burstable-pod3cc864ed7f58b447b1668cf08361ffec.slice - libcontainer container kubepods-burstable-pod3cc864ed7f58b447b1668cf08361ffec.slice.
Mar 12 23:42:57.273996 kubelet[2412]: E0312 23:42:57.273939 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.275831 kubelet[2412]: I0312 23:42:57.275797 2412 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.276479 kubelet[2412]: E0312 23:42:57.276433 2412 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://49.13.116.83:6443/api/v1/nodes\": dial tcp 49.13.116.83:6443: connect: connection refused" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.286939 kubelet[2412]: I0312 23:42:57.285287 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3d3aff213b3453c9d4eaa5e96e8ec1be-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" (UID: \"3d3aff213b3453c9d4eaa5e96e8ec1be\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.286939 kubelet[2412]: I0312 23:42:57.285351 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.286939 kubelet[2412]: I0312 23:42:57.285380 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.286939 kubelet[2412]: I0312 23:42:57.285402 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.286939 kubelet[2412]: I0312 23:42:57.285423 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3d3aff213b3453c9d4eaa5e96e8ec1be-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" (UID: \"3d3aff213b3453c9d4eaa5e96e8ec1be\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.287150 kubelet[2412]: I0312 23:42:57.285446 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3d3aff213b3453c9d4eaa5e96e8ec1be-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" (UID: \"3d3aff213b3453c9d4eaa5e96e8ec1be\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.287150 kubelet[2412]: I0312 23:42:57.285467 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.287150 kubelet[2412]: I0312 23:42:57.285487 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.287150 kubelet[2412]: I0312 23:42:57.285508 2412 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3cc864ed7f58b447b1668cf08361ffec-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-69ffcbf899\" (UID: \"3cc864ed7f58b447b1668cf08361ffec\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.287150 kubelet[2412]: E0312 23:42:57.285784 2412 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.116.83:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-69ffcbf899?timeout=10s\": dial tcp 49.13.116.83:6443: connect: connection refused" interval="400ms"
Mar 12 23:42:57.481014 kubelet[2412]: I0312 23:42:57.480908 2412 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.482029 kubelet[2412]: E0312 23:42:57.481916 2412 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://49.13.116.83:6443/api/v1/nodes\": dial tcp 49.13.116.83:6443: connect: connection refused" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.556921 containerd[1553]: time="2026-03-12T23:42:57.556765470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-69ffcbf899,Uid:3d3aff213b3453c9d4eaa5e96e8ec1be,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:57.566084 containerd[1553]: time="2026-03-12T23:42:57.566033330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-69ffcbf899,Uid:27377fa7a365616cb833098711d9876a,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:57.578064 containerd[1553]: time="2026-03-12T23:42:57.578001620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-69ffcbf899,Uid:3cc864ed7f58b447b1668cf08361ffec,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:57.687437 kubelet[2412]: E0312 23:42:57.687371 2412 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.116.83:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-69ffcbf899?timeout=10s\": dial tcp 49.13.116.83:6443: connect: connection refused" interval="800ms"
Mar 12 23:42:57.885104 kubelet[2412]: I0312 23:42:57.884694 2412 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:57.885982 kubelet[2412]: E0312 23:42:57.885940 2412 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://49.13.116.83:6443/api/v1/nodes\": dial tcp 49.13.116.83:6443: connect: connection refused" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:58.104217 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3814449974.mount: Deactivated successfully.
Mar 12 23:42:58.112797 containerd[1553]: time="2026-03-12T23:42:58.111991921Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:58.114951 containerd[1553]: time="2026-03-12T23:42:58.114917500Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Mar 12 23:42:58.116602 containerd[1553]: time="2026-03-12T23:42:58.116554572Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:58.118939 containerd[1553]: time="2026-03-12T23:42:58.118909095Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:58.120293 containerd[1553]: time="2026-03-12T23:42:58.120268300Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:58.122425 containerd[1553]: time="2026-03-12T23:42:58.122397729Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 12 23:42:58.123156 containerd[1553]: time="2026-03-12T23:42:58.123128904Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 12 23:42:58.124985 containerd[1553]: time="2026-03-12T23:42:58.124932295Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:58.125762 containerd[1553]: time="2026-03-12T23:42:58.125683955Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 557.740111ms"
Mar 12 23:42:58.126779 containerd[1553]: time="2026-03-12T23:42:58.126596013Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 567.309157ms"
Mar 12 23:42:58.133553 containerd[1553]: time="2026-03-12T23:42:58.133510386Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 553.842953ms"
Mar 12 23:42:58.164242 containerd[1553]: time="2026-03-12T23:42:58.163448705Z" level=info msg="connecting to shim ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181" address="unix:///run/containerd/s/1c8fc9f385024105c1d2cf6b1e62dacf799660b09a2f0e0045180c9062be58eb" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:58.185098 containerd[1553]: time="2026-03-12T23:42:58.185045189Z" level=info msg="connecting to shim ff9ac48b3bc9f7db88b960834697a7943a6f71d657e0432c3aac61f51d07df75" address="unix:///run/containerd/s/bfb324e3106fd53b31d4e660353724edc697731ea1fe2ade6208b2b9d78840ed" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:58.192667 containerd[1553]: time="2026-03-12T23:42:58.192529859Z" level=info msg="connecting to shim 95b194ce972fc756a4c874b364798ecbc95bdc657f7cc9a600e9aabfe73992fc" address="unix:///run/containerd/s/1e2f1af86840970fb70903ad564095fec802ed1a608e10ef06c7bd8552cf91e6" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:58.204035 systemd[1]: Started cri-containerd-ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181.scope - libcontainer container ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181.
Mar 12 23:42:58.223000 systemd[1]: Started cri-containerd-ff9ac48b3bc9f7db88b960834697a7943a6f71d657e0432c3aac61f51d07df75.scope - libcontainer container ff9ac48b3bc9f7db88b960834697a7943a6f71d657e0432c3aac61f51d07df75.
Mar 12 23:42:58.231947 systemd[1]: Started cri-containerd-95b194ce972fc756a4c874b364798ecbc95bdc657f7cc9a600e9aabfe73992fc.scope - libcontainer container 95b194ce972fc756a4c874b364798ecbc95bdc657f7cc9a600e9aabfe73992fc.
Mar 12 23:42:58.272610 containerd[1553]: time="2026-03-12T23:42:58.272549154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-69ffcbf899,Uid:27377fa7a365616cb833098711d9876a,Namespace:kube-system,Attempt:0,} returns sandbox id \"ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181\""
Mar 12 23:42:58.289473 containerd[1553]: time="2026-03-12T23:42:58.288585028Z" level=info msg="CreateContainer within sandbox \"ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 12 23:42:58.300432 containerd[1553]: time="2026-03-12T23:42:58.300387170Z" level=info msg="Container ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:58.312205 containerd[1553]: time="2026-03-12T23:42:58.312160426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-69ffcbf899,Uid:3d3aff213b3453c9d4eaa5e96e8ec1be,Namespace:kube-system,Attempt:0,} returns sandbox id \"95b194ce972fc756a4c874b364798ecbc95bdc657f7cc9a600e9aabfe73992fc\""
Mar 12 23:42:58.314903 containerd[1553]: time="2026-03-12T23:42:58.314845428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-69ffcbf899,Uid:3cc864ed7f58b447b1668cf08361ffec,Namespace:kube-system,Attempt:0,} returns sandbox id \"ff9ac48b3bc9f7db88b960834697a7943a6f71d657e0432c3aac61f51d07df75\""
Mar 12 23:42:58.318770 containerd[1553]: time="2026-03-12T23:42:58.318672943Z" level=info msg="CreateContainer within sandbox \"ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9\""
Mar 12 23:42:58.319840 containerd[1553]: time="2026-03-12T23:42:58.319753761Z" level=info msg="CreateContainer within sandbox \"95b194ce972fc756a4c874b364798ecbc95bdc657f7cc9a600e9aabfe73992fc\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 12 23:42:58.319942 containerd[1553]: time="2026-03-12T23:42:58.319795051Z" level=info msg="StartContainer for \"ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9\""
Mar 12 23:42:58.320902 containerd[1553]: time="2026-03-12T23:42:58.320842102Z" level=info msg="CreateContainer within sandbox \"ff9ac48b3bc9f7db88b960834697a7943a6f71d657e0432c3aac61f51d07df75\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 12 23:42:58.324559 containerd[1553]: time="2026-03-12T23:42:58.324298128Z" level=info msg="connecting to shim ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9" address="unix:///run/containerd/s/1c8fc9f385024105c1d2cf6b1e62dacf799660b09a2f0e0045180c9062be58eb" protocol=ttrpc version=3
Mar 12 23:42:58.333263 containerd[1553]: time="2026-03-12T23:42:58.333227023Z" level=info msg="Container 5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:58.341091 containerd[1553]: time="2026-03-12T23:42:58.341040492Z" level=info msg="Container 33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:58.350231 containerd[1553]: time="2026-03-12T23:42:58.349980069Z" level=info msg="CreateContainer within sandbox \"95b194ce972fc756a4c874b364798ecbc95bdc657f7cc9a600e9aabfe73992fc\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346\""
Mar 12 23:42:58.350752 containerd[1553]: time="2026-03-12T23:42:58.350716085Z" level=info msg="StartContainer for \"5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346\""
Mar 12 23:42:58.351871 containerd[1553]: time="2026-03-12T23:42:58.351802025Z" level=info msg="connecting to shim 5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346" address="unix:///run/containerd/s/1e2f1af86840970fb70903ad564095fec802ed1a608e10ef06c7bd8552cf91e6" protocol=ttrpc version=3
Mar 12 23:42:58.352486 systemd[1]: Started cri-containerd-ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9.scope - libcontainer container ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9.
Mar 12 23:42:58.355049 containerd[1553]: time="2026-03-12T23:42:58.354999190Z" level=info msg="CreateContainer within sandbox \"ff9ac48b3bc9f7db88b960834697a7943a6f71d657e0432c3aac61f51d07df75\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088\""
Mar 12 23:42:58.356497 containerd[1553]: time="2026-03-12T23:42:58.356050721Z" level=info msg="StartContainer for \"33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088\""
Mar 12 23:42:58.359878 containerd[1553]: time="2026-03-12T23:42:58.359833586Z" level=info msg="connecting to shim 33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088" address="unix:///run/containerd/s/bfb324e3106fd53b31d4e660353724edc697731ea1fe2ade6208b2b9d78840ed" protocol=ttrpc version=3
Mar 12 23:42:58.384101 systemd[1]: Started cri-containerd-5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346.scope - libcontainer container 5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346.
Mar 12 23:42:58.394285 systemd[1]: Started cri-containerd-33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088.scope - libcontainer container 33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088.
Mar 12 23:42:58.433970 containerd[1553]: time="2026-03-12T23:42:58.433614108Z" level=info msg="StartContainer for \"ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9\" returns successfully"
Mar 12 23:42:58.462086 containerd[1553]: time="2026-03-12T23:42:58.462009858Z" level=info msg="StartContainer for \"5ac80be102ae6b9c184635ce807bd4d19d3abef839605fd8700286b88d222346\" returns successfully"
Mar 12 23:42:58.481056 containerd[1553]: time="2026-03-12T23:42:58.481009562Z" level=info msg="StartContainer for \"33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088\" returns successfully"
Mar 12 23:42:58.487700 kubelet[2412]: E0312 23:42:58.487648 2412 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.116.83:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-69ffcbf899?timeout=10s\": dial tcp 49.13.116.83:6443: connect: connection refused" interval="1.6s"
Mar 12 23:42:58.689906 kubelet[2412]: I0312 23:42:58.688643 2412 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:59.133924 kubelet[2412]: E0312 23:42:59.133726 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:59.134599 kubelet[2412]: E0312 23:42:59.134428 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:42:59.138686 kubelet[2412]: E0312 23:42:59.138660 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.129236 kubelet[2412]: E0312 23:43:00.129175 2412 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.141459 kubelet[2412]: E0312 23:43:00.141258 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.141957 kubelet[2412]: E0312 23:43:00.141936 2412 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.158876 kubelet[2412]: I0312 23:43:00.156807 2412 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.159162 kubelet[2412]: E0312 23:43:00.159035 2412 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4459-2-4-n-69ffcbf899\": node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.189847 kubelet[2412]: E0312 23:43:00.187543 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.290653 kubelet[2412]: E0312 23:43:00.290534 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.392174 kubelet[2412]: E0312 23:43:00.391579 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.492288 kubelet[2412]: E0312 23:43:00.492226 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.592958 kubelet[2412]: E0312 23:43:00.592893 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.694251 kubelet[2412]: E0312 23:43:00.693675 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.794773 kubelet[2412]: E0312 23:43:00.794657 2412 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found"
Mar 12 23:43:00.884696 kubelet[2412]: I0312 23:43:00.883896 2412 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.893002 kubelet[2412]: E0312 23:43:00.892958 2412 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.893002 kubelet[2412]: I0312 23:43:00.892997 2412 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.897407 kubelet[2412]: E0312 23:43:00.897242 2412 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.897407 kubelet[2412]: I0312 23:43:00.897270 2412 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:00.900429 kubelet[2412]: E0312 23:43:00.900381 2412 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-69ffcbf899\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:01.064265 kubelet[2412]: I0312 23:43:01.064190 2412 apiserver.go:52] "Watching apiserver"
Mar 12 23:43:01.084172 kubelet[2412]: I0312 23:43:01.084125 2412 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 12 23:43:01.140772 kubelet[2412]: I0312 23:43:01.140719 2412 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:02.462021 kubelet[2412]: I0312 23:43:02.461978 2412 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899"
Mar 12 23:43:02.648938 systemd[1]: Reload requested from client PID 2700 ('systemctl') (unit session-7.scope)...
Mar 12 23:43:02.648963 systemd[1]: Reloading...
Mar 12 23:43:02.766968 zram_generator::config[2744]: No configuration found.
Mar 12 23:43:02.979973 systemd[1]: Reloading finished in 330 ms.
Mar 12 23:43:03.010453 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:43:03.025745 systemd[1]: kubelet.service: Deactivated successfully.
Mar 12 23:43:03.026342 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:43:03.026439 systemd[1]: kubelet.service: Consumed 1.315s CPU time, 119.9M memory peak.
Mar 12 23:43:03.028611 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:43:03.189638 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:43:03.203422 (kubelet)[2789]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 23:43:03.263837 kubelet[2789]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:43:03.276953 kubelet[2789]: I0312 23:43:03.275477 2789 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 12 23:43:03.276953 kubelet[2789]: I0312 23:43:03.275529 2789 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 23:43:03.276953 kubelet[2789]: I0312 23:43:03.275551 2789 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 12 23:43:03.276953 kubelet[2789]: I0312 23:43:03.275556 2789 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 12 23:43:03.276953 kubelet[2789]: I0312 23:43:03.275836 2789 server.go:951] "Client rotation is on, will bootstrap in background" Mar 12 23:43:03.277971 kubelet[2789]: I0312 23:43:03.277951 2789 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 12 23:43:03.280495 kubelet[2789]: I0312 23:43:03.280467 2789 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 12 23:43:03.285154 kubelet[2789]: I0312 23:43:03.285134 2789 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 12 23:43:03.287703 kubelet[2789]: I0312 23:43:03.287676 2789 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 12 23:43:03.288042 kubelet[2789]: I0312 23:43:03.288004 2789 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 23:43:03.288299 kubelet[2789]: I0312 23:43:03.288105 2789 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-69ffcbf899","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 23:43:03.288425 kubelet[2789]: I0312 23:43:03.288412 2789 topology_manager.go:143] "Creating topology manager with none policy" Mar 12 
23:43:03.288482 kubelet[2789]: I0312 23:43:03.288474 2789 container_manager_linux.go:308] "Creating device plugin manager" Mar 12 23:43:03.288547 kubelet[2789]: I0312 23:43:03.288539 2789 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 12 23:43:03.288802 kubelet[2789]: I0312 23:43:03.288784 2789 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 12 23:43:03.289087 kubelet[2789]: I0312 23:43:03.289070 2789 kubelet.go:482] "Attempting to sync node with API server" Mar 12 23:43:03.289155 kubelet[2789]: I0312 23:43:03.289145 2789 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 23:43:03.289237 kubelet[2789]: I0312 23:43:03.289215 2789 kubelet.go:394] "Adding apiserver pod source" Mar 12 23:43:03.289300 kubelet[2789]: I0312 23:43:03.289291 2789 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 23:43:03.291941 kubelet[2789]: I0312 23:43:03.291904 2789 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 12 23:43:03.293186 kubelet[2789]: I0312 23:43:03.293168 2789 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 12 23:43:03.293347 kubelet[2789]: I0312 23:43:03.293334 2789 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 12 23:43:03.307764 kubelet[2789]: I0312 23:43:03.307477 2789 server.go:1257] "Started kubelet" Mar 12 23:43:03.310365 kubelet[2789]: I0312 23:43:03.310288 2789 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 23:43:03.318216 kubelet[2789]: I0312 23:43:03.318028 2789 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 12 23:43:03.322841 kubelet[2789]: I0312 23:43:03.318510 2789 server.go:317] "Adding debug handlers 
to kubelet server" Mar 12 23:43:03.323790 kubelet[2789]: I0312 23:43:03.318539 2789 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 23:43:03.328842 kubelet[2789]: I0312 23:43:03.328453 2789 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 12 23:43:03.328842 kubelet[2789]: I0312 23:43:03.328633 2789 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 23:43:03.328842 kubelet[2789]: E0312 23:43:03.324888 2789 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-69ffcbf899\" not found" Mar 12 23:43:03.328842 kubelet[2789]: I0312 23:43:03.326150 2789 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 12 23:43:03.328842 kubelet[2789]: I0312 23:43:03.323970 2789 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 12 23:43:03.329960 kubelet[2789]: I0312 23:43:03.323992 2789 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 12 23:43:03.330692 kubelet[2789]: I0312 23:43:03.330671 2789 reconciler.go:29] "Reconciler: start to sync state" Mar 12 23:43:03.335282 kubelet[2789]: I0312 23:43:03.335261 2789 factory.go:223] Registration of the containerd container factory successfully Mar 12 23:43:03.335712 kubelet[2789]: I0312 23:43:03.335694 2789 factory.go:223] Registration of the systemd container factory successfully Mar 12 23:43:03.336624 kubelet[2789]: I0312 23:43:03.336048 2789 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 12 23:43:03.369872 kubelet[2789]: I0312 23:43:03.368608 2789 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Mar 12 23:43:03.373369 kubelet[2789]: I0312 23:43:03.373324 2789 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 12 23:43:03.373502 kubelet[2789]: I0312 23:43:03.373377 2789 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 12 23:43:03.373502 kubelet[2789]: I0312 23:43:03.373410 2789 kubelet.go:2501] "Starting kubelet main sync loop" Mar 12 23:43:03.373541 kubelet[2789]: E0312 23:43:03.373480 2789 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 23:43:03.404162 kubelet[2789]: I0312 23:43:03.404128 2789 cpu_manager.go:225] "Starting" policy="none" Mar 12 23:43:03.404162 kubelet[2789]: I0312 23:43:03.404193 2789 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 12 23:43:03.404162 kubelet[2789]: I0312 23:43:03.404225 2789 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 12 23:43:03.404162 kubelet[2789]: I0312 23:43:03.404379 2789 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet="" Mar 12 23:43:03.404162 kubelet[2789]: I0312 23:43:03.404390 2789 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={} Mar 12 23:43:03.404690 kubelet[2789]: I0312 23:43:03.404680 2789 policy_none.go:50] "Start" Mar 12 23:43:03.404762 kubelet[2789]: I0312 23:43:03.404748 2789 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 12 23:43:03.404847 kubelet[2789]: I0312 23:43:03.404837 2789 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 12 23:43:03.405086 kubelet[2789]: I0312 23:43:03.405060 2789 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 12 23:43:03.405205 kubelet[2789]: I0312 23:43:03.405195 2789 
policy_none.go:44] "Start" Mar 12 23:43:03.411274 kubelet[2789]: E0312 23:43:03.410621 2789 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 12 23:43:03.412505 kubelet[2789]: I0312 23:43:03.412453 2789 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 12 23:43:03.412505 kubelet[2789]: I0312 23:43:03.412468 2789 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 23:43:03.412966 kubelet[2789]: I0312 23:43:03.412758 2789 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 12 23:43:03.417836 kubelet[2789]: E0312 23:43:03.417742 2789 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 12 23:43:03.475860 kubelet[2789]: I0312 23:43:03.474476 2789 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.475860 kubelet[2789]: I0312 23:43:03.474644 2789 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.475860 kubelet[2789]: I0312 23:43:03.474493 2789 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.485981 kubelet[2789]: E0312 23:43:03.485933 2789 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-69ffcbf899\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.486600 kubelet[2789]: E0312 23:43:03.486542 2789 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.516041 kubelet[2789]: I0312 23:43:03.515889 2789 kubelet_node_status.go:74] "Attempting to 
register node" node="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.523973 kubelet[2789]: I0312 23:43:03.523926 2789 kubelet_node_status.go:123] "Node was previously registered" node="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.524090 kubelet[2789]: I0312 23:43:03.524028 2789 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535462 kubelet[2789]: I0312 23:43:03.534752 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535462 kubelet[2789]: I0312 23:43:03.534811 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3cc864ed7f58b447b1668cf08361ffec-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-69ffcbf899\" (UID: \"3cc864ed7f58b447b1668cf08361ffec\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535462 kubelet[2789]: I0312 23:43:03.534848 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3d3aff213b3453c9d4eaa5e96e8ec1be-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" (UID: \"3d3aff213b3453c9d4eaa5e96e8ec1be\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535462 kubelet[2789]: I0312 23:43:03.534869 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: 
\"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535462 kubelet[2789]: I0312 23:43:03.534887 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535662 kubelet[2789]: I0312 23:43:03.534903 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535662 kubelet[2789]: I0312 23:43:03.534918 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3d3aff213b3453c9d4eaa5e96e8ec1be-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" (UID: \"3d3aff213b3453c9d4eaa5e96e8ec1be\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535662 kubelet[2789]: I0312 23:43:03.534932 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3d3aff213b3453c9d4eaa5e96e8ec1be-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-69ffcbf899\" (UID: \"3d3aff213b3453c9d4eaa5e96e8ec1be\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:03.535662 kubelet[2789]: I0312 23:43:03.534954 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" 
(UniqueName: \"kubernetes.io/host-path/27377fa7a365616cb833098711d9876a-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" (UID: \"27377fa7a365616cb833098711d9876a\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:04.290794 kubelet[2789]: I0312 23:43:04.289994 2789 apiserver.go:52] "Watching apiserver" Mar 12 23:43:04.330756 kubelet[2789]: I0312 23:43:04.330708 2789 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 23:43:04.398708 kubelet[2789]: I0312 23:43:04.398675 2789 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:04.409924 kubelet[2789]: E0312 23:43:04.409797 2789 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-69ffcbf899\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:04.434863 kubelet[2789]: I0312 23:43:04.434693 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-69ffcbf899" podStartSLOduration=2.434667902 podStartE2EDuration="2.434667902s" podCreationTimestamp="2026-03-12 23:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:04.424409136 +0000 UTC m=+1.214211897" watchObservedRunningTime="2026-03-12 23:43:04.434667902 +0000 UTC m=+1.224470663" Mar 12 23:43:04.435924 kubelet[2789]: I0312 23:43:04.435725 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-69ffcbf899" podStartSLOduration=3.435711226 podStartE2EDuration="3.435711226s" podCreationTimestamp="2026-03-12 23:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
23:43:04.434581966 +0000 UTC m=+1.224384687" watchObservedRunningTime="2026-03-12 23:43:04.435711226 +0000 UTC m=+1.225514027" Mar 12 23:43:04.447488 kubelet[2789]: I0312 23:43:04.447412 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-69ffcbf899" podStartSLOduration=1.447395711 podStartE2EDuration="1.447395711s" podCreationTimestamp="2026-03-12 23:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:04.446767748 +0000 UTC m=+1.236570509" watchObservedRunningTime="2026-03-12 23:43:04.447395711 +0000 UTC m=+1.237198512" Mar 12 23:43:08.341071 kubelet[2789]: I0312 23:43:08.341033 2789 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 12 23:43:08.342803 containerd[1553]: time="2026-03-12T23:43:08.342481656Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 12 23:43:08.343316 kubelet[2789]: I0312 23:43:08.342877 2789 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 12 23:43:09.540654 systemd[1]: Created slice kubepods-besteffort-pod1f1382d0_294e_4ac8_99c8_649782060d9b.slice - libcontainer container kubepods-besteffort-pod1f1382d0_294e_4ac8_99c8_649782060d9b.slice. 
Mar 12 23:43:09.573209 kubelet[2789]: I0312 23:43:09.573087 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f1382d0-294e-4ac8-99c8-649782060d9b-lib-modules\") pod \"kube-proxy-llxb6\" (UID: \"1f1382d0-294e-4ac8-99c8-649782060d9b\") " pod="kube-system/kube-proxy-llxb6" Mar 12 23:43:09.574279 kubelet[2789]: I0312 23:43:09.574070 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgx2\" (UniqueName: \"kubernetes.io/projected/1f1382d0-294e-4ac8-99c8-649782060d9b-kube-api-access-wdgx2\") pod \"kube-proxy-llxb6\" (UID: \"1f1382d0-294e-4ac8-99c8-649782060d9b\") " pod="kube-system/kube-proxy-llxb6" Mar 12 23:43:09.574279 kubelet[2789]: I0312 23:43:09.574153 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1f1382d0-294e-4ac8-99c8-649782060d9b-kube-proxy\") pod \"kube-proxy-llxb6\" (UID: \"1f1382d0-294e-4ac8-99c8-649782060d9b\") " pod="kube-system/kube-proxy-llxb6" Mar 12 23:43:09.574279 kubelet[2789]: I0312 23:43:09.574185 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1f1382d0-294e-4ac8-99c8-649782060d9b-xtables-lock\") pod \"kube-proxy-llxb6\" (UID: \"1f1382d0-294e-4ac8-99c8-649782060d9b\") " pod="kube-system/kube-proxy-llxb6" Mar 12 23:43:09.673144 systemd[1]: Created slice kubepods-besteffort-pod1f3a1d64_5234_4924_9b1c_88f956a664b4.slice - libcontainer container kubepods-besteffort-pod1f3a1d64_5234_4924_9b1c_88f956a664b4.slice. 
Mar 12 23:43:09.775801 kubelet[2789]: I0312 23:43:09.775744 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvcr\" (UniqueName: \"kubernetes.io/projected/1f3a1d64-5234-4924-9b1c-88f956a664b4-kube-api-access-9mvcr\") pod \"tigera-operator-6cf4cccc57-6444c\" (UID: \"1f3a1d64-5234-4924-9b1c-88f956a664b4\") " pod="tigera-operator/tigera-operator-6cf4cccc57-6444c" Mar 12 23:43:09.776108 kubelet[2789]: I0312 23:43:09.776080 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1f3a1d64-5234-4924-9b1c-88f956a664b4-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-6444c\" (UID: \"1f3a1d64-5234-4924-9b1c-88f956a664b4\") " pod="tigera-operator/tigera-operator-6cf4cccc57-6444c" Mar 12 23:43:09.854910 containerd[1553]: time="2026-03-12T23:43:09.854728237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-llxb6,Uid:1f1382d0-294e-4ac8-99c8-649782060d9b,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:09.878560 containerd[1553]: time="2026-03-12T23:43:09.878184668Z" level=info msg="connecting to shim 8a5310505fcfcb4fa2110885f25a592c37537e8465eb79fd7daec36ef7553845" address="unix:///run/containerd/s/2e04ee50acadd7685fc541584ae3463d70b362b01468caf4412c00673d8d8885" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:09.914094 systemd[1]: Started cri-containerd-8a5310505fcfcb4fa2110885f25a592c37537e8465eb79fd7daec36ef7553845.scope - libcontainer container 8a5310505fcfcb4fa2110885f25a592c37537e8465eb79fd7daec36ef7553845. 
Mar 12 23:43:09.947903 containerd[1553]: time="2026-03-12T23:43:09.947769666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-llxb6,Uid:1f1382d0-294e-4ac8-99c8-649782060d9b,Namespace:kube-system,Attempt:0,} returns sandbox id \"8a5310505fcfcb4fa2110885f25a592c37537e8465eb79fd7daec36ef7553845\"" Mar 12 23:43:09.954446 containerd[1553]: time="2026-03-12T23:43:09.954320621Z" level=info msg="CreateContainer within sandbox \"8a5310505fcfcb4fa2110885f25a592c37537e8465eb79fd7daec36ef7553845\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 12 23:43:09.968856 containerd[1553]: time="2026-03-12T23:43:09.968093764Z" level=info msg="Container 5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:09.977191 containerd[1553]: time="2026-03-12T23:43:09.977134422Z" level=info msg="CreateContainer within sandbox \"8a5310505fcfcb4fa2110885f25a592c37537e8465eb79fd7daec36ef7553845\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d\"" Mar 12 23:43:09.978892 containerd[1553]: time="2026-03-12T23:43:09.978414240Z" level=info msg="StartContainer for \"5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d\"" Mar 12 23:43:09.980376 containerd[1553]: time="2026-03-12T23:43:09.980312163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-6444c,Uid:1f3a1d64-5234-4924-9b1c-88f956a664b4,Namespace:tigera-operator,Attempt:0,}" Mar 12 23:43:09.982132 containerd[1553]: time="2026-03-12T23:43:09.982012852Z" level=info msg="connecting to shim 5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d" address="unix:///run/containerd/s/2e04ee50acadd7685fc541584ae3463d70b362b01468caf4412c00673d8d8885" protocol=ttrpc version=3 Mar 12 23:43:10.005013 systemd[1]: Started cri-containerd-5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d.scope - 
libcontainer container 5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d. Mar 12 23:43:10.007210 containerd[1553]: time="2026-03-12T23:43:10.007164425Z" level=info msg="connecting to shim dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f" address="unix:///run/containerd/s/cbfc2524916db6894ae7824a647d00be56d4e1e437112d6d1a0fd6019c651e35" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:10.035368 systemd[1]: Started cri-containerd-dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f.scope - libcontainer container dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f. Mar 12 23:43:10.096149 containerd[1553]: time="2026-03-12T23:43:10.096111667Z" level=info msg="StartContainer for \"5dad33eb763acad176cc80b408734881e0246b485c6996c32a03233d831ca42d\" returns successfully" Mar 12 23:43:10.102791 containerd[1553]: time="2026-03-12T23:43:10.102692039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-6444c,Uid:1f3a1d64-5234-4924-9b1c-88f956a664b4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f\"" Mar 12 23:43:10.107795 containerd[1553]: time="2026-03-12T23:43:10.107298764Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 12 23:43:10.435218 kubelet[2789]: I0312 23:43:10.435012 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-llxb6" podStartSLOduration=1.434994189 podStartE2EDuration="1.434994189s" podCreationTimestamp="2026-03-12 23:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:10.432139756 +0000 UTC m=+7.221942557" watchObservedRunningTime="2026-03-12 23:43:10.434994189 +0000 UTC m=+7.224796990" Mar 12 23:43:11.626637 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1025956027.mount: Deactivated successfully. 
Mar 12 23:43:12.098866 containerd[1553]: time="2026-03-12T23:43:12.098801667Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:12.100166 containerd[1553]: time="2026-03-12T23:43:12.100115715Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 12 23:43:12.100166 containerd[1553]: time="2026-03-12T23:43:12.100135438Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:12.102812 containerd[1553]: time="2026-03-12T23:43:12.102750453Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:12.103886 containerd[1553]: time="2026-03-12T23:43:12.103845746Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 1.996119111s" Mar 12 23:43:12.103886 containerd[1553]: time="2026-03-12T23:43:12.103884672Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 12 23:43:12.111122 containerd[1553]: time="2026-03-12T23:43:12.111063889Z" level=info msg="CreateContainer within sandbox \"dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 12 23:43:12.121662 containerd[1553]: time="2026-03-12T23:43:12.121500942Z" level=info msg="Container 
3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:12.123360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4075167862.mount: Deactivated successfully. Mar 12 23:43:12.134596 containerd[1553]: time="2026-03-12T23:43:12.134513003Z" level=info msg="CreateContainer within sandbox \"dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\"" Mar 12 23:43:12.135961 containerd[1553]: time="2026-03-12T23:43:12.135516562Z" level=info msg="StartContainer for \"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\"" Mar 12 23:43:12.138568 containerd[1553]: time="2026-03-12T23:43:12.138384617Z" level=info msg="connecting to shim 3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4" address="unix:///run/containerd/s/cbfc2524916db6894ae7824a647d00be56d4e1e437112d6d1a0fd6019c651e35" protocol=ttrpc version=3 Mar 12 23:43:12.170437 systemd[1]: Started cri-containerd-3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4.scope - libcontainer container 3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4. 
Mar 12 23:43:12.208003 containerd[1553]: time="2026-03-12T23:43:12.207949035Z" level=info msg="StartContainer for \"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\" returns successfully" Mar 12 23:43:12.440375 kubelet[2789]: I0312 23:43:12.440157 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-6444c" podStartSLOduration=1.4402510849999999 podStartE2EDuration="3.44013713s" podCreationTimestamp="2026-03-12 23:43:09 +0000 UTC" firstStartedPulling="2026-03-12 23:43:10.105535151 +0000 UTC m=+6.895337872" lastFinishedPulling="2026-03-12 23:43:12.105421156 +0000 UTC m=+8.895223917" observedRunningTime="2026-03-12 23:43:12.439143613 +0000 UTC m=+9.228946374" watchObservedRunningTime="2026-03-12 23:43:12.44013713 +0000 UTC m=+9.229939891" Mar 12 23:43:18.468907 sudo[1849]: pam_unix(sudo:session): session closed for user root Mar 12 23:43:18.565838 sshd[1831]: Connection closed by 20.161.92.111 port 47420 Mar 12 23:43:18.566289 sshd-session[1828]: pam_unix(sshd:session): session closed for user core Mar 12 23:43:18.573404 systemd[1]: sshd@6-49.13.116.83:22-20.161.92.111:47420.service: Deactivated successfully. Mar 12 23:43:18.578534 systemd[1]: session-7.scope: Deactivated successfully. Mar 12 23:43:18.580371 systemd[1]: session-7.scope: Consumed 5.426s CPU time, 216.3M memory peak. Mar 12 23:43:18.583909 systemd-logind[1535]: Session 7 logged out. Waiting for processes to exit. Mar 12 23:43:18.586688 systemd-logind[1535]: Removed session 7. Mar 12 23:43:24.190119 systemd[1]: Created slice kubepods-besteffort-poda27c819e_a19f_48e3_893c_6481d6408731.slice - libcontainer container kubepods-besteffort-poda27c819e_a19f_48e3_893c_6481d6408731.slice. 
Mar 12 23:43:24.275668 kubelet[2789]: I0312 23:43:24.275586 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a27c819e-a19f-48e3-893c-6481d6408731-tigera-ca-bundle\") pod \"calico-typha-64c7ff778b-z79wz\" (UID: \"a27c819e-a19f-48e3-893c-6481d6408731\") " pod="calico-system/calico-typha-64c7ff778b-z79wz" Mar 12 23:43:24.275668 kubelet[2789]: I0312 23:43:24.275627 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a27c819e-a19f-48e3-893c-6481d6408731-typha-certs\") pod \"calico-typha-64c7ff778b-z79wz\" (UID: \"a27c819e-a19f-48e3-893c-6481d6408731\") " pod="calico-system/calico-typha-64c7ff778b-z79wz" Mar 12 23:43:24.275668 kubelet[2789]: I0312 23:43:24.275649 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c525\" (UniqueName: \"kubernetes.io/projected/a27c819e-a19f-48e3-893c-6481d6408731-kube-api-access-5c525\") pod \"calico-typha-64c7ff778b-z79wz\" (UID: \"a27c819e-a19f-48e3-893c-6481d6408731\") " pod="calico-system/calico-typha-64c7ff778b-z79wz" Mar 12 23:43:24.279798 systemd[1]: Created slice kubepods-besteffort-podb7dd7a6f_a1a7_450e_99f8_35d7fc223bfc.slice - libcontainer container kubepods-besteffort-podb7dd7a6f_a1a7_450e_99f8_35d7fc223bfc.slice. 
Mar 12 23:43:24.376370 kubelet[2789]: I0312 23:43:24.376312 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-cni-bin-dir\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376370 kubelet[2789]: I0312 23:43:24.376369 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-cni-log-dir\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376546 kubelet[2789]: I0312 23:43:24.376399 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-flexvol-driver-host\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376546 kubelet[2789]: I0312 23:43:24.376417 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-nodeproc\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376546 kubelet[2789]: I0312 23:43:24.376432 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-sys-fs\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376546 kubelet[2789]: I0312 23:43:24.376447 2789 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-cni-net-dir\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376546 kubelet[2789]: I0312 23:43:24.376459 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-tigera-ca-bundle\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376697 kubelet[2789]: I0312 23:43:24.376495 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-var-lib-calico\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376697 kubelet[2789]: I0312 23:43:24.376524 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-policysync\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376697 kubelet[2789]: I0312 23:43:24.376538 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-var-run-calico\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376697 kubelet[2789]: I0312 23:43:24.376601 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bpffs\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-bpffs\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376697 kubelet[2789]: I0312 23:43:24.376617 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-lib-modules\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376791 kubelet[2789]: I0312 23:43:24.376630 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-node-certs\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376791 kubelet[2789]: I0312 23:43:24.376645 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-xtables-lock\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.376791 kubelet[2789]: I0312 23:43:24.376660 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvrl\" (UniqueName: \"kubernetes.io/projected/b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc-kube-api-access-8rvrl\") pod \"calico-node-8kgl2\" (UID: \"b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc\") " pod="calico-system/calico-node-8kgl2" Mar 12 23:43:24.391917 kubelet[2789]: E0312 23:43:24.390177 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:24.477610 kubelet[2789]: I0312 23:43:24.477566 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f06e482b-f74f-4302-8898-fd2753a17184-socket-dir\") pod \"csi-node-driver-szgbn\" (UID: \"f06e482b-f74f-4302-8898-fd2753a17184\") " pod="calico-system/csi-node-driver-szgbn" Mar 12 23:43:24.477750 kubelet[2789]: I0312 23:43:24.477627 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f06e482b-f74f-4302-8898-fd2753a17184-kubelet-dir\") pod \"csi-node-driver-szgbn\" (UID: \"f06e482b-f74f-4302-8898-fd2753a17184\") " pod="calico-system/csi-node-driver-szgbn" Mar 12 23:43:24.477750 kubelet[2789]: I0312 23:43:24.477692 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f06e482b-f74f-4302-8898-fd2753a17184-registration-dir\") pod \"csi-node-driver-szgbn\" (UID: \"f06e482b-f74f-4302-8898-fd2753a17184\") " pod="calico-system/csi-node-driver-szgbn" Mar 12 23:43:24.477750 kubelet[2789]: I0312 23:43:24.477732 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f06e482b-f74f-4302-8898-fd2753a17184-varrun\") pod \"csi-node-driver-szgbn\" (UID: \"f06e482b-f74f-4302-8898-fd2753a17184\") " pod="calico-system/csi-node-driver-szgbn" Mar 12 23:43:24.477837 kubelet[2789]: I0312 23:43:24.477751 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsph8\" (UniqueName: \"kubernetes.io/projected/f06e482b-f74f-4302-8898-fd2753a17184-kube-api-access-xsph8\") pod 
\"csi-node-driver-szgbn\" (UID: \"f06e482b-f74f-4302-8898-fd2753a17184\") " pod="calico-system/csi-node-driver-szgbn" Mar 12 23:43:24.479293 kubelet[2789]: E0312 23:43:24.479262 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.479293 kubelet[2789]: W0312 23:43:24.479284 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.479293 kubelet[2789]: E0312 23:43:24.479303 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.479902 kubelet[2789]: E0312 23:43:24.479796 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.479902 kubelet[2789]: W0312 23:43:24.479811 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.479902 kubelet[2789]: E0312 23:43:24.479844 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480081 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481168 kubelet[2789]: W0312 23:43:24.480093 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480107 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480372 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481168 kubelet[2789]: W0312 23:43:24.480385 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480399 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480696 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481168 kubelet[2789]: W0312 23:43:24.480710 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480724 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.481168 kubelet[2789]: E0312 23:43:24.480945 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481774 kubelet[2789]: W0312 23:43:24.480953 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.481774 kubelet[2789]: E0312 23:43:24.480961 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.481774 kubelet[2789]: E0312 23:43:24.481138 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481774 kubelet[2789]: W0312 23:43:24.481147 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.481774 kubelet[2789]: E0312 23:43:24.481155 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.481774 kubelet[2789]: E0312 23:43:24.481316 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481774 kubelet[2789]: W0312 23:43:24.481324 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.481774 kubelet[2789]: E0312 23:43:24.481331 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.481774 kubelet[2789]: E0312 23:43:24.481465 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.481774 kubelet[2789]: W0312 23:43:24.481472 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481492 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481648 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.482209 kubelet[2789]: W0312 23:43:24.481656 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481664 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481798 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.482209 kubelet[2789]: W0312 23:43:24.481807 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481827 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481965 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.482209 kubelet[2789]: W0312 23:43:24.481972 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482209 kubelet[2789]: E0312 23:43:24.481980 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482107 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.482719 kubelet[2789]: W0312 23:43:24.482114 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482121 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482261 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.482719 kubelet[2789]: W0312 23:43:24.482268 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482275 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482418 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.482719 kubelet[2789]: W0312 23:43:24.482425 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482436 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.482719 kubelet[2789]: E0312 23:43:24.482605 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.484029 kubelet[2789]: W0312 23:43:24.482614 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.484029 kubelet[2789]: E0312 23:43:24.482621 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.484029 kubelet[2789]: E0312 23:43:24.482783 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.484029 kubelet[2789]: W0312 23:43:24.482791 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.484029 kubelet[2789]: E0312 23:43:24.482799 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.484029 kubelet[2789]: E0312 23:43:24.483177 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.484208 kubelet[2789]: W0312 23:43:24.483191 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.484320 kubelet[2789]: E0312 23:43:24.484306 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.484632 kubelet[2789]: E0312 23:43:24.484618 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.484705 kubelet[2789]: W0312 23:43:24.484694 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.484776 kubelet[2789]: E0312 23:43:24.484753 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.485791 kubelet[2789]: E0312 23:43:24.485774 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.486003 kubelet[2789]: W0312 23:43:24.485984 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.486552 kubelet[2789]: E0312 23:43:24.486530 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.486927 kubelet[2789]: E0312 23:43:24.486887 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.486927 kubelet[2789]: W0312 23:43:24.486901 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.486927 kubelet[2789]: E0312 23:43:24.486913 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.487410 kubelet[2789]: E0312 23:43:24.487265 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.487521 kubelet[2789]: W0312 23:43:24.487502 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.487596 kubelet[2789]: E0312 23:43:24.487581 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.488103 kubelet[2789]: E0312 23:43:24.488064 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.488103 kubelet[2789]: W0312 23:43:24.488078 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.488103 kubelet[2789]: E0312 23:43:24.488090 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.488831 kubelet[2789]: E0312 23:43:24.488773 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.488831 kubelet[2789]: W0312 23:43:24.488787 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.488831 kubelet[2789]: E0312 23:43:24.488798 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.489278 kubelet[2789]: E0312 23:43:24.489188 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.489278 kubelet[2789]: W0312 23:43:24.489201 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.489278 kubelet[2789]: E0312 23:43:24.489212 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.489607 kubelet[2789]: E0312 23:43:24.489458 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.489607 kubelet[2789]: W0312 23:43:24.489468 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.489607 kubelet[2789]: E0312 23:43:24.489528 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.489803 kubelet[2789]: E0312 23:43:24.489791 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.489925 kubelet[2789]: W0312 23:43:24.489860 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.489925 kubelet[2789]: E0312 23:43:24.489873 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.490287 kubelet[2789]: E0312 23:43:24.490243 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.490287 kubelet[2789]: W0312 23:43:24.490258 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.490287 kubelet[2789]: E0312 23:43:24.490270 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.490731 kubelet[2789]: E0312 23:43:24.490697 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.490731 kubelet[2789]: W0312 23:43:24.490709 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.490731 kubelet[2789]: E0312 23:43:24.490719 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.491050 kubelet[2789]: E0312 23:43:24.491037 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.491217 kubelet[2789]: W0312 23:43:24.491117 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.491217 kubelet[2789]: E0312 23:43:24.491134 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.491431 kubelet[2789]: E0312 23:43:24.491418 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.491800 kubelet[2789]: W0312 23:43:24.491503 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.491800 kubelet[2789]: E0312 23:43:24.491519 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.498753 kubelet[2789]: E0312 23:43:24.498732 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.500245 containerd[1553]: time="2026-03-12T23:43:24.499876804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64c7ff778b-z79wz,Uid:a27c819e-a19f-48e3-893c-6481d6408731,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:24.501099 kubelet[2789]: W0312 23:43:24.500576 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.501099 kubelet[2789]: E0312 23:43:24.500641 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.506412 kubelet[2789]: E0312 23:43:24.506393 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.506787 kubelet[2789]: W0312 23:43:24.506494 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.506787 kubelet[2789]: E0312 23:43:24.506528 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.528990 containerd[1553]: time="2026-03-12T23:43:24.528329388Z" level=info msg="connecting to shim 442833f08b525e051d433b6b1e4c37272a5dfad3f8e7bb1d757f1368c32898dc" address="unix:///run/containerd/s/b30388e269d6cc6c5205659ec2526796aeb6e9408bccad57d9a2260350e02b99" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:24.556064 systemd[1]: Started cri-containerd-442833f08b525e051d433b6b1e4c37272a5dfad3f8e7bb1d757f1368c32898dc.scope - libcontainer container 442833f08b525e051d433b6b1e4c37272a5dfad3f8e7bb1d757f1368c32898dc. Mar 12 23:43:24.579611 kubelet[2789]: E0312 23:43:24.579483 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.579611 kubelet[2789]: W0312 23:43:24.579551 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.579611 kubelet[2789]: E0312 23:43:24.579571 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.580389 kubelet[2789]: E0312 23:43:24.580357 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.580389 kubelet[2789]: W0312 23:43:24.580378 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.580389 kubelet[2789]: E0312 23:43:24.580391 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.580958 kubelet[2789]: E0312 23:43:24.580933 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.580958 kubelet[2789]: W0312 23:43:24.580947 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.581198 kubelet[2789]: E0312 23:43:24.580974 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.581372 kubelet[2789]: E0312 23:43:24.581242 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.581372 kubelet[2789]: W0312 23:43:24.581251 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.581372 kubelet[2789]: E0312 23:43:24.581260 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.582152 kubelet[2789]: E0312 23:43:24.581992 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.582152 kubelet[2789]: W0312 23:43:24.582010 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.582152 kubelet[2789]: E0312 23:43:24.582026 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.582314 kubelet[2789]: E0312 23:43:24.582300 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.582314 kubelet[2789]: W0312 23:43:24.582309 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.582733 kubelet[2789]: E0312 23:43:24.582322 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.583944 kubelet[2789]: E0312 23:43:24.583814 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.583944 kubelet[2789]: W0312 23:43:24.583924 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.583944 kubelet[2789]: E0312 23:43:24.583936 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.584367 kubelet[2789]: E0312 23:43:24.584326 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.584367 kubelet[2789]: W0312 23:43:24.584337 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.584367 kubelet[2789]: E0312 23:43:24.584348 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.584700 kubelet[2789]: E0312 23:43:24.584672 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.584700 kubelet[2789]: W0312 23:43:24.584697 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.585134 kubelet[2789]: E0312 23:43:24.584709 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.585134 kubelet[2789]: E0312 23:43:24.584920 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.585134 kubelet[2789]: W0312 23:43:24.584942 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.585134 kubelet[2789]: E0312 23:43:24.584951 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.585134 kubelet[2789]: E0312 23:43:24.585130 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.585239 kubelet[2789]: W0312 23:43:24.585138 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.585239 kubelet[2789]: E0312 23:43:24.585146 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.585851 kubelet[2789]: E0312 23:43:24.585359 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.585851 kubelet[2789]: W0312 23:43:24.585373 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.585851 kubelet[2789]: E0312 23:43:24.585383 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.586231 containerd[1553]: time="2026-03-12T23:43:24.586198638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8kgl2,Uid:b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:24.586742 kubelet[2789]: E0312 23:43:24.586682 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.586742 kubelet[2789]: W0312 23:43:24.586695 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.586742 kubelet[2789]: E0312 23:43:24.586705 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.587419 kubelet[2789]: E0312 23:43:24.587356 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.587419 kubelet[2789]: W0312 23:43:24.587371 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.587419 kubelet[2789]: E0312 23:43:24.587382 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.587925 kubelet[2789]: E0312 23:43:24.587910 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.588562 kubelet[2789]: W0312 23:43:24.588406 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.588562 kubelet[2789]: E0312 23:43:24.588436 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.588897 kubelet[2789]: E0312 23:43:24.588770 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.588897 kubelet[2789]: W0312 23:43:24.588783 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.588897 kubelet[2789]: E0312 23:43:24.588793 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.589111 kubelet[2789]: E0312 23:43:24.589099 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.589228 kubelet[2789]: W0312 23:43:24.589190 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.589228 kubelet[2789]: E0312 23:43:24.589208 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.589606 kubelet[2789]: E0312 23:43:24.589564 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.589697 kubelet[2789]: W0312 23:43:24.589685 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.589773 kubelet[2789]: E0312 23:43:24.589742 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.590208 kubelet[2789]: E0312 23:43:24.590194 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.590418 kubelet[2789]: W0312 23:43:24.590277 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.590418 kubelet[2789]: E0312 23:43:24.590291 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.590670 kubelet[2789]: E0312 23:43:24.590657 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.590808 kubelet[2789]: W0312 23:43:24.590738 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.590808 kubelet[2789]: E0312 23:43:24.590753 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.591030 kubelet[2789]: E0312 23:43:24.591017 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.591179 kubelet[2789]: W0312 23:43:24.591075 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.591179 kubelet[2789]: E0312 23:43:24.591088 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.591421 kubelet[2789]: E0312 23:43:24.591379 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.591421 kubelet[2789]: W0312 23:43:24.591389 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.591421 kubelet[2789]: E0312 23:43:24.591399 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.591723 kubelet[2789]: E0312 23:43:24.591709 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.591852 kubelet[2789]: W0312 23:43:24.591773 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.591852 kubelet[2789]: E0312 23:43:24.591786 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.592184 kubelet[2789]: E0312 23:43:24.592170 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.592987 kubelet[2789]: W0312 23:43:24.592852 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.592987 kubelet[2789]: E0312 23:43:24.592871 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.593782 kubelet[2789]: E0312 23:43:24.593114 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.593782 kubelet[2789]: W0312 23:43:24.593125 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.593782 kubelet[2789]: E0312 23:43:24.593134 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:24.608888 containerd[1553]: time="2026-03-12T23:43:24.608732459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64c7ff778b-z79wz,Uid:a27c819e-a19f-48e3-893c-6481d6408731,Namespace:calico-system,Attempt:0,} returns sandbox id \"442833f08b525e051d433b6b1e4c37272a5dfad3f8e7bb1d757f1368c32898dc\"" Mar 12 23:43:24.609263 kubelet[2789]: E0312 23:43:24.609250 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:24.609336 kubelet[2789]: W0312 23:43:24.609324 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:24.609397 kubelet[2789]: E0312 23:43:24.609385 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:24.611325 containerd[1553]: time="2026-03-12T23:43:24.611295949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 12 23:43:24.614070 containerd[1553]: time="2026-03-12T23:43:24.613273164Z" level=info msg="connecting to shim 2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc" address="unix:///run/containerd/s/75b657ec884074c3c980df5a3e814b79b7bf4de7a35498e2e5906d91ca6cade2" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:24.651064 systemd[1]: Started cri-containerd-2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc.scope - libcontainer container 2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc. 
Mar 12 23:43:24.685975 containerd[1553]: time="2026-03-12T23:43:24.685889793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8kgl2,Uid:b7dd7a6f-a1a7-450e-99f8-35d7fc223bfc,Namespace:calico-system,Attempt:0,} returns sandbox id \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\"" Mar 12 23:43:25.849848 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2738931795.mount: Deactivated successfully. Mar 12 23:43:26.374799 kubelet[2789]: E0312 23:43:26.374755 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:26.435533 containerd[1553]: time="2026-03-12T23:43:26.435445399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:26.437517 containerd[1553]: time="2026-03-12T23:43:26.437456332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 12 23:43:26.438538 containerd[1553]: time="2026-03-12T23:43:26.438489102Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:26.442354 containerd[1553]: time="2026-03-12T23:43:26.442291379Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:26.444240 containerd[1553]: time="2026-03-12T23:43:26.444131891Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.832780895s" Mar 12 23:43:26.444389 containerd[1553]: time="2026-03-12T23:43:26.444217541Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 12 23:43:26.446094 containerd[1553]: time="2026-03-12T23:43:26.446065854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 12 23:43:26.473185 containerd[1553]: time="2026-03-12T23:43:26.473132575Z" level=info msg="CreateContainer within sandbox \"442833f08b525e051d433b6b1e4c37272a5dfad3f8e7bb1d757f1368c32898dc\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 12 23:43:26.485978 containerd[1553]: time="2026-03-12T23:43:26.485023110Z" level=info msg="Container 33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:26.494682 containerd[1553]: time="2026-03-12T23:43:26.494599353Z" level=info msg="CreateContainer within sandbox \"442833f08b525e051d433b6b1e4c37272a5dfad3f8e7bb1d757f1368c32898dc\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61\"" Mar 12 23:43:26.497291 containerd[1553]: time="2026-03-12T23:43:26.497190399Z" level=info msg="StartContainer for \"33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61\"" Mar 12 23:43:26.500241 containerd[1553]: time="2026-03-12T23:43:26.499503009Z" level=info msg="connecting to shim 33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61" address="unix:///run/containerd/s/b30388e269d6cc6c5205659ec2526796aeb6e9408bccad57d9a2260350e02b99" protocol=ttrpc version=3 Mar 12 
23:43:26.531019 systemd[1]: Started cri-containerd-33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61.scope - libcontainer container 33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61. Mar 12 23:43:26.580183 containerd[1553]: time="2026-03-12T23:43:26.580135303Z" level=info msg="StartContainer for \"33d3bde5ae5d9f7f62e9fd8c59a45f36f3d034c52bbf1f0ef4b60a5fb9725a61\" returns successfully" Mar 12 23:43:27.573164 kubelet[2789]: E0312 23:43:27.573058 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.573164 kubelet[2789]: W0312 23:43:27.573099 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573179 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573386 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.575153 kubelet[2789]: W0312 23:43:27.573396 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573407 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573566 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.575153 kubelet[2789]: W0312 23:43:27.573574 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573584 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573724 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.575153 kubelet[2789]: W0312 23:43:27.573731 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575153 kubelet[2789]: E0312 23:43:27.573741 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.573951 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.575989 kubelet[2789]: W0312 23:43:27.573961 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.573972 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.574121 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.575989 kubelet[2789]: W0312 23:43:27.574129 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.574139 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.574282 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.575989 kubelet[2789]: W0312 23:43:27.574291 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.574300 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.575989 kubelet[2789]: E0312 23:43:27.574444 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576558 kubelet[2789]: W0312 23:43:27.574453 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576558 kubelet[2789]: E0312 23:43:27.574462 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.576558 kubelet[2789]: E0312 23:43:27.574616 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576558 kubelet[2789]: W0312 23:43:27.574625 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576558 kubelet[2789]: E0312 23:43:27.574635 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.576558 kubelet[2789]: E0312 23:43:27.574771 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576558 kubelet[2789]: W0312 23:43:27.574780 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576558 kubelet[2789]: E0312 23:43:27.574789 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.576558 kubelet[2789]: E0312 23:43:27.574965 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576558 kubelet[2789]: W0312 23:43:27.574974 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.574986 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.575343 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576949 kubelet[2789]: W0312 23:43:27.575353 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.575364 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.575553 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576949 kubelet[2789]: W0312 23:43:27.575562 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.575573 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.575710 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.576949 kubelet[2789]: W0312 23:43:27.575718 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.576949 kubelet[2789]: E0312 23:43:27.575726 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.577170 kubelet[2789]: E0312 23:43:27.575956 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.577170 kubelet[2789]: W0312 23:43:27.575967 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.577170 kubelet[2789]: E0312 23:43:27.575979 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.604768 kubelet[2789]: E0312 23:43:27.604573 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.604768 kubelet[2789]: W0312 23:43:27.604601 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.604768 kubelet[2789]: E0312 23:43:27.604627 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.605264 kubelet[2789]: E0312 23:43:27.605243 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.605648 kubelet[2789]: W0312 23:43:27.605400 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.605648 kubelet[2789]: E0312 23:43:27.605427 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.606333 kubelet[2789]: E0312 23:43:27.606209 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.606333 kubelet[2789]: W0312 23:43:27.606230 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.606333 kubelet[2789]: E0312 23:43:27.606249 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.606978 kubelet[2789]: E0312 23:43:27.606616 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.606978 kubelet[2789]: W0312 23:43:27.606652 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.606978 kubelet[2789]: E0312 23:43:27.606678 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.607169 kubelet[2789]: E0312 23:43:27.607012 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.607169 kubelet[2789]: W0312 23:43:27.607028 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.607169 kubelet[2789]: E0312 23:43:27.607047 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.607385 kubelet[2789]: E0312 23:43:27.607355 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.607385 kubelet[2789]: W0312 23:43:27.607379 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.607477 kubelet[2789]: E0312 23:43:27.607398 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.607775 kubelet[2789]: E0312 23:43:27.607740 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.607775 kubelet[2789]: W0312 23:43:27.607761 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.607775 kubelet[2789]: E0312 23:43:27.607780 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.608355 kubelet[2789]: E0312 23:43:27.608128 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.608355 kubelet[2789]: W0312 23:43:27.608154 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.608355 kubelet[2789]: E0312 23:43:27.608174 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.608643 kubelet[2789]: E0312 23:43:27.608622 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.608707 kubelet[2789]: W0312 23:43:27.608642 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.608707 kubelet[2789]: E0312 23:43:27.608660 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.609525 kubelet[2789]: E0312 23:43:27.609465 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.609525 kubelet[2789]: W0312 23:43:27.609482 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.609525 kubelet[2789]: E0312 23:43:27.609494 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.609771 kubelet[2789]: E0312 23:43:27.609629 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.609771 kubelet[2789]: W0312 23:43:27.609637 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.609771 kubelet[2789]: E0312 23:43:27.609645 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.609771 kubelet[2789]: E0312 23:43:27.609769 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.609771 kubelet[2789]: W0312 23:43:27.609775 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.610035 kubelet[2789]: E0312 23:43:27.609782 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610119 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.611485 kubelet[2789]: W0312 23:43:27.610146 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610157 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610288 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.611485 kubelet[2789]: W0312 23:43:27.610296 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610304 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610394 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.611485 kubelet[2789]: W0312 23:43:27.610400 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610409 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.611485 kubelet[2789]: E0312 23:43:27.610518 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.611775 kubelet[2789]: W0312 23:43:27.610524 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.611775 kubelet[2789]: E0312 23:43:27.610530 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.611775 kubelet[2789]: E0312 23:43:27.611170 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.611775 kubelet[2789]: W0312 23:43:27.611184 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.611775 kubelet[2789]: E0312 23:43:27.611199 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:43:27.613012 kubelet[2789]: E0312 23:43:27.612667 2789 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:43:27.613012 kubelet[2789]: W0312 23:43:27.612685 2789 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:43:27.613012 kubelet[2789]: E0312 23:43:27.612708 2789 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:43:27.711756 containerd[1553]: time="2026-03-12T23:43:27.711708603Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:27.713385 containerd[1553]: time="2026-03-12T23:43:27.713146502Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Mar 12 23:43:27.714447 containerd[1553]: time="2026-03-12T23:43:27.714371054Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:27.717546 containerd[1553]: time="2026-03-12T23:43:27.717486401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:27.718384 containerd[1553]: time="2026-03-12T23:43:27.718114839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.271884725s" Mar 12 23:43:27.718384 containerd[1553]: time="2026-03-12T23:43:27.718150444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 12 23:43:27.725353 containerd[1553]: time="2026-03-12T23:43:27.725312694Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 12 23:43:27.739152 containerd[1553]: time="2026-03-12T23:43:27.738063518Z" level=info msg="Container ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:27.751107 containerd[1553]: time="2026-03-12T23:43:27.751062734Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c\"" Mar 12 23:43:27.753334 containerd[1553]: time="2026-03-12T23:43:27.751812067Z" level=info msg="StartContainer for \"ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c\"" Mar 12 23:43:27.755191 containerd[1553]: time="2026-03-12T23:43:27.755162563Z" level=info msg="connecting to shim ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c" address="unix:///run/containerd/s/75b657ec884074c3c980df5a3e814b79b7bf4de7a35498e2e5906d91ca6cade2" protocol=ttrpc version=3 Mar 12 23:43:27.778010 systemd[1]: Started cri-containerd-ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c.scope - libcontainer container 
ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c. Mar 12 23:43:27.849666 containerd[1553]: time="2026-03-12T23:43:27.849054193Z" level=info msg="StartContainer for \"ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c\" returns successfully" Mar 12 23:43:27.863984 systemd[1]: cri-containerd-ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c.scope: Deactivated successfully. Mar 12 23:43:27.870275 containerd[1553]: time="2026-03-12T23:43:27.870220903Z" level=info msg="received container exit event container_id:\"ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c\" id:\"ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c\" pid:3445 exited_at:{seconds:1773359007 nanos:868391076}" Mar 12 23:43:27.893836 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ddc30969688306f49d02279a5ab311d03b691573b270ec5e5ebe8fe7a0995f0c-rootfs.mount: Deactivated successfully. Mar 12 23:43:28.375862 kubelet[2789]: E0312 23:43:28.374149 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:28.479571 kubelet[2789]: I0312 23:43:28.479139 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:43:28.484097 containerd[1553]: time="2026-03-12T23:43:28.483803370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 12 23:43:28.507773 kubelet[2789]: I0312 23:43:28.507691 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-64c7ff778b-z79wz" podStartSLOduration=2.673292928 podStartE2EDuration="4.507673785s" podCreationTimestamp="2026-03-12 23:43:24 +0000 UTC" firstStartedPulling="2026-03-12 23:43:24.610961626 +0000 UTC m=+21.400764387" 
lastFinishedPulling="2026-03-12 23:43:26.445342483 +0000 UTC m=+23.235145244" observedRunningTime="2026-03-12 23:43:27.486693318 +0000 UTC m=+24.276496119" watchObservedRunningTime="2026-03-12 23:43:28.507673785 +0000 UTC m=+25.297476546" Mar 12 23:43:30.373764 kubelet[2789]: E0312 23:43:30.373694 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:32.056917 kubelet[2789]: I0312 23:43:32.056864 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:43:32.374261 kubelet[2789]: E0312 23:43:32.374134 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:32.471249 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount37674822.mount: Deactivated successfully. 
Mar 12 23:43:32.501467 containerd[1553]: time="2026-03-12T23:43:32.501395368Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:32.504609 containerd[1553]: time="2026-03-12T23:43:32.504567544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 12 23:43:32.505964 containerd[1553]: time="2026-03-12T23:43:32.505921464Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:32.509693 containerd[1553]: time="2026-03-12T23:43:32.509623623Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:32.511315 containerd[1553]: time="2026-03-12T23:43:32.511259177Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 4.027392479s" Mar 12 23:43:32.511315 containerd[1553]: time="2026-03-12T23:43:32.511301262Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 12 23:43:32.516979 containerd[1553]: time="2026-03-12T23:43:32.516926489Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 12 23:43:32.533011 containerd[1553]: time="2026-03-12T23:43:32.531116011Z" level=info msg="Container 
7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:32.549224 containerd[1553]: time="2026-03-12T23:43:32.549089861Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5\"" Mar 12 23:43:32.555712 containerd[1553]: time="2026-03-12T23:43:32.555660880Z" level=info msg="StartContainer for \"7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5\"" Mar 12 23:43:32.557988 containerd[1553]: time="2026-03-12T23:43:32.557947431Z" level=info msg="connecting to shim 7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5" address="unix:///run/containerd/s/75b657ec884074c3c980df5a3e814b79b7bf4de7a35498e2e5906d91ca6cade2" protocol=ttrpc version=3 Mar 12 23:43:32.588251 systemd[1]: Started cri-containerd-7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5.scope - libcontainer container 7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5. Mar 12 23:43:32.671313 containerd[1553]: time="2026-03-12T23:43:32.671211895Z" level=info msg="StartContainer for \"7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5\" returns successfully" Mar 12 23:43:32.776435 systemd[1]: cri-containerd-7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5.scope: Deactivated successfully. 
Mar 12 23:43:32.782408 containerd[1553]: time="2026-03-12T23:43:32.782349187Z" level=info msg="received container exit event container_id:\"7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5\" id:\"7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5\" pid:3508 exited_at:{seconds:1773359012 nanos:782089516}" Mar 12 23:43:33.471524 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7b67e6a3c0631a31b7b506769fcd51e2ddfb932e001e429d54c54232b9e334c5-rootfs.mount: Deactivated successfully. Mar 12 23:43:33.503643 containerd[1553]: time="2026-03-12T23:43:33.502460341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 12 23:43:34.374477 kubelet[2789]: E0312 23:43:34.374394 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:35.784846 containerd[1553]: time="2026-03-12T23:43:35.784775424Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:35.786752 containerd[1553]: time="2026-03-12T23:43:35.786712333Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 12 23:43:35.788102 containerd[1553]: time="2026-03-12T23:43:35.787995792Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:35.794619 containerd[1553]: time="2026-03-12T23:43:35.793521868Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
Mar 12 23:43:35.794619 containerd[1553]: time="2026-03-12T23:43:35.794486963Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.291986579s" Mar 12 23:43:35.794619 containerd[1553]: time="2026-03-12T23:43:35.794521719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 12 23:43:35.800695 containerd[1553]: time="2026-03-12T23:43:35.800658008Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 12 23:43:35.815897 containerd[1553]: time="2026-03-12T23:43:35.815725162Z" level=info msg="Container f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:35.829017 containerd[1553]: time="2026-03-12T23:43:35.828974594Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394\"" Mar 12 23:43:35.830143 containerd[1553]: time="2026-03-12T23:43:35.830115709Z" level=info msg="StartContainer for \"f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394\"" Mar 12 23:43:35.832240 containerd[1553]: time="2026-03-12T23:43:35.832211680Z" level=info msg="connecting to shim f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394" address="unix:///run/containerd/s/75b657ec884074c3c980df5a3e814b79b7bf4de7a35498e2e5906d91ca6cade2" protocol=ttrpc version=3 Mar 12 
23:43:35.863060 systemd[1]: Started cri-containerd-f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394.scope - libcontainer container f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394. Mar 12 23:43:35.941829 containerd[1553]: time="2026-03-12T23:43:35.941760107Z" level=info msg="StartContainer for \"f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394\" returns successfully" Mar 12 23:43:36.374040 kubelet[2789]: E0312 23:43:36.373967 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szgbn" podUID="f06e482b-f74f-4302-8898-fd2753a17184" Mar 12 23:43:36.491439 systemd[1]: cri-containerd-f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394.scope: Deactivated successfully. Mar 12 23:43:36.492151 systemd[1]: cri-containerd-f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394.scope: Consumed 530ms CPU time, 198.4M memory peak, 171.3M written to disk. Mar 12 23:43:36.497336 containerd[1553]: time="2026-03-12T23:43:36.496392450Z" level=info msg="received container exit event container_id:\"f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394\" id:\"f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394\" pid:3565 exited_at:{seconds:1773359016 nanos:496092441}" Mar 12 23:43:36.526159 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f80ecbba389b82491c3b34ff559a618a30dcb2ee4f3deaead10ce6866f58c394-rootfs.mount: Deactivated successfully. 
Mar 12 23:43:36.552369 kubelet[2789]: I0312 23:43:36.552111 2789 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Mar 12 23:43:36.631435 systemd[1]: Created slice kubepods-burstable-pod54ede1dd_a4d1_40a1_a3b3_dd219f2c692d.slice - libcontainer container kubepods-burstable-pod54ede1dd_a4d1_40a1_a3b3_dd219f2c692d.slice. Mar 12 23:43:36.653969 systemd[1]: Created slice kubepods-besteffort-poda839e77d_37bc_4f89_a3a9_2958c9cba9f2.slice - libcontainer container kubepods-besteffort-poda839e77d_37bc_4f89_a3a9_2958c9cba9f2.slice. Mar 12 23:43:36.666156 systemd[1]: Created slice kubepods-besteffort-pod94660390_b097_4115_a00d_194fc136d3e9.slice - libcontainer container kubepods-besteffort-pod94660390_b097_4115_a00d_194fc136d3e9.slice. Mar 12 23:43:36.673267 systemd[1]: Created slice kubepods-besteffort-pod438f4726_557c_45f8_a55a_8da149168ef8.slice - libcontainer container kubepods-besteffort-pod438f4726_557c_45f8_a55a_8da149168ef8.slice. Mar 12 23:43:36.675054 kubelet[2789]: I0312 23:43:36.675009 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz922\" (UniqueName: \"kubernetes.io/projected/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-kube-api-access-mz922\") pod \"whisker-65fb8b5f9b-qmsvj\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " pod="calico-system/whisker-65fb8b5f9b-qmsvj" Mar 12 23:43:36.675054 kubelet[2789]: I0312 23:43:36.675051 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44fh\" (UniqueName: \"kubernetes.io/projected/438f4726-557c-45f8-a55a-8da149168ef8-kube-api-access-l44fh\") pod \"calico-apiserver-59f89dfd9f-fbrhp\" (UID: \"438f4726-557c-45f8-a55a-8da149168ef8\") " pod="calico-system/calico-apiserver-59f89dfd9f-fbrhp" Mar 12 23:43:36.675393 kubelet[2789]: I0312 23:43:36.675071 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/54ede1dd-a4d1-40a1-a3b3-dd219f2c692d-config-volume\") pod \"coredns-7d764666f9-7ht8g\" (UID: \"54ede1dd-a4d1-40a1-a3b3-dd219f2c692d\") " pod="kube-system/coredns-7d764666f9-7ht8g" Mar 12 23:43:36.675393 kubelet[2789]: I0312 23:43:36.675088 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e009c2a-43f1-473c-ac28-09966919e16f-tigera-ca-bundle\") pod \"calico-kube-controllers-d87956b8c-nbrdc\" (UID: \"9e009c2a-43f1-473c-ac28-09966919e16f\") " pod="calico-system/calico-kube-controllers-d87956b8c-nbrdc" Mar 12 23:43:36.675393 kubelet[2789]: I0312 23:43:36.675103 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvw5l\" (UniqueName: \"kubernetes.io/projected/76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6-kube-api-access-bvw5l\") pod \"coredns-7d764666f9-cdwgt\" (UID: \"76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6\") " pod="kube-system/coredns-7d764666f9-cdwgt" Mar 12 23:43:36.675393 kubelet[2789]: I0312 23:43:36.675121 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtp2\" (UniqueName: \"kubernetes.io/projected/54ede1dd-a4d1-40a1-a3b3-dd219f2c692d-kube-api-access-hbtp2\") pod \"coredns-7d764666f9-7ht8g\" (UID: \"54ede1dd-a4d1-40a1-a3b3-dd219f2c692d\") " pod="kube-system/coredns-7d764666f9-7ht8g" Mar 12 23:43:36.675393 kubelet[2789]: I0312 23:43:36.675136 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/94660390-b097-4115-a00d-194fc136d3e9-calico-apiserver-certs\") pod \"calico-apiserver-59f89dfd9f-vdx54\" (UID: \"94660390-b097-4115-a00d-194fc136d3e9\") " pod="calico-system/calico-apiserver-59f89dfd9f-vdx54" Mar 12 23:43:36.676893 kubelet[2789]: I0312 23:43:36.675151 2789 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2k2\" (UniqueName: \"kubernetes.io/projected/94660390-b097-4115-a00d-194fc136d3e9-kube-api-access-dw2k2\") pod \"calico-apiserver-59f89dfd9f-vdx54\" (UID: \"94660390-b097-4115-a00d-194fc136d3e9\") " pod="calico-system/calico-apiserver-59f89dfd9f-vdx54" Mar 12 23:43:36.676893 kubelet[2789]: I0312 23:43:36.675167 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbf90c03-2c31-4265-97a4-01480ee945ec-config\") pod \"goldmane-9f7667bb8-tqf8l\" (UID: \"cbf90c03-2c31-4265-97a4-01480ee945ec\") " pod="calico-system/goldmane-9f7667bb8-tqf8l" Mar 12 23:43:36.676893 kubelet[2789]: I0312 23:43:36.675183 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6-config-volume\") pod \"coredns-7d764666f9-cdwgt\" (UID: \"76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6\") " pod="kube-system/coredns-7d764666f9-cdwgt" Mar 12 23:43:36.676893 kubelet[2789]: I0312 23:43:36.675205 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-ca-bundle\") pod \"whisker-65fb8b5f9b-qmsvj\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " pod="calico-system/whisker-65fb8b5f9b-qmsvj" Mar 12 23:43:36.676893 kubelet[2789]: I0312 23:43:36.675220 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-nginx-config\") pod \"whisker-65fb8b5f9b-qmsvj\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " pod="calico-system/whisker-65fb8b5f9b-qmsvj" Mar 12 23:43:36.677022 kubelet[2789]: I0312 
23:43:36.675234 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/438f4726-557c-45f8-a55a-8da149168ef8-calico-apiserver-certs\") pod \"calico-apiserver-59f89dfd9f-fbrhp\" (UID: \"438f4726-557c-45f8-a55a-8da149168ef8\") " pod="calico-system/calico-apiserver-59f89dfd9f-fbrhp" Mar 12 23:43:36.677022 kubelet[2789]: I0312 23:43:36.675261 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbf90c03-2c31-4265-97a4-01480ee945ec-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-tqf8l\" (UID: \"cbf90c03-2c31-4265-97a4-01480ee945ec\") " pod="calico-system/goldmane-9f7667bb8-tqf8l" Mar 12 23:43:36.677022 kubelet[2789]: I0312 23:43:36.675276 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlddq\" (UniqueName: \"kubernetes.io/projected/cbf90c03-2c31-4265-97a4-01480ee945ec-kube-api-access-dlddq\") pod \"goldmane-9f7667bb8-tqf8l\" (UID: \"cbf90c03-2c31-4265-97a4-01480ee945ec\") " pod="calico-system/goldmane-9f7667bb8-tqf8l" Mar 12 23:43:36.677022 kubelet[2789]: I0312 23:43:36.675335 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-backend-key-pair\") pod \"whisker-65fb8b5f9b-qmsvj\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " pod="calico-system/whisker-65fb8b5f9b-qmsvj" Mar 12 23:43:36.677022 kubelet[2789]: I0312 23:43:36.675355 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkjln\" (UniqueName: \"kubernetes.io/projected/9e009c2a-43f1-473c-ac28-09966919e16f-kube-api-access-vkjln\") pod \"calico-kube-controllers-d87956b8c-nbrdc\" (UID: 
\"9e009c2a-43f1-473c-ac28-09966919e16f\") " pod="calico-system/calico-kube-controllers-d87956b8c-nbrdc" Mar 12 23:43:36.677122 kubelet[2789]: I0312 23:43:36.675375 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/cbf90c03-2c31-4265-97a4-01480ee945ec-goldmane-key-pair\") pod \"goldmane-9f7667bb8-tqf8l\" (UID: \"cbf90c03-2c31-4265-97a4-01480ee945ec\") " pod="calico-system/goldmane-9f7667bb8-tqf8l" Mar 12 23:43:36.693464 systemd[1]: Created slice kubepods-besteffort-pod9e009c2a_43f1_473c_ac28_09966919e16f.slice - libcontainer container kubepods-besteffort-pod9e009c2a_43f1_473c_ac28_09966919e16f.slice. Mar 12 23:43:36.700743 systemd[1]: Created slice kubepods-besteffort-podcbf90c03_2c31_4265_97a4_01480ee945ec.slice - libcontainer container kubepods-besteffort-podcbf90c03_2c31_4265_97a4_01480ee945ec.slice. Mar 12 23:43:36.711138 systemd[1]: Created slice kubepods-burstable-pod76a8ed3c_17a5_44b1_bdfc_3127aa1b88d6.slice - libcontainer container kubepods-burstable-pod76a8ed3c_17a5_44b1_bdfc_3127aa1b88d6.slice. 
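The `Created slice` entries above encode each pod's QoS class and UID in the cgroup slice name. A minimal sketch of the naming convention visible in these log lines (the real scheme belongs to kubelet's systemd cgroup driver; this is an illustration, not kubelet's code, and the `guaranteed` branch is an assumption about the class that carries no QoS component):

```python
def pod_cgroup_slice(qos_class: str, pod_uid: str) -> str:
    """Reconstruct the kubepods slice names seen in the log.

    systemd slice names use '-' as a hierarchy separator, so the
    dashes inside the pod UID are rewritten to underscores. Burstable
    and besteffort pods get an extra QoS-class component in the name.
    """
    uid = pod_uid.replace("-", "_")
    if qos_class == "guaranteed":
        return f"kubepods-pod{uid}.slice"
    return f"kubepods-{qos_class}-pod{uid}.slice"

# Matches the slices created at 23:43:36 above:
print(pod_cgroup_slice("burstable", "54ede1dd-a4d1-40a1-a3b3-dd219f2c692d"))
# kubepods-burstable-pod54ede1dd_a4d1_40a1_a3b3_dd219f2c692d.slice
print(pod_cgroup_slice("besteffort", "a839e77d-37bc-4f89-a3a9-2958c9cba9f2"))
# kubepods-besteffort-poda839e77d_37bc_4f89_a3a9_2958c9cba9f2.slice
```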
Mar 12 23:43:36.944477 containerd[1553]: time="2026-03-12T23:43:36.944317134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-7ht8g,Uid:54ede1dd-a4d1-40a1-a3b3-dd219f2c692d,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:36.960159 containerd[1553]: time="2026-03-12T23:43:36.960125179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65fb8b5f9b-qmsvj,Uid:a839e77d-37bc-4f89-a3a9-2958c9cba9f2,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:36.971838 containerd[1553]: time="2026-03-12T23:43:36.971553797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-vdx54,Uid:94660390-b097-4115-a00d-194fc136d3e9,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:36.988070 containerd[1553]: time="2026-03-12T23:43:36.988035613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-fbrhp,Uid:438f4726-557c-45f8-a55a-8da149168ef8,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:36.999719 containerd[1553]: time="2026-03-12T23:43:36.999677369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d87956b8c-nbrdc,Uid:9e009c2a-43f1-473c-ac28-09966919e16f,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:37.011850 containerd[1553]: time="2026-03-12T23:43:37.010971899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-tqf8l,Uid:cbf90c03-2c31-4265-97a4-01480ee945ec,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:37.021489 containerd[1553]: time="2026-03-12T23:43:37.021444276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-cdwgt,Uid:76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:37.149037 containerd[1553]: time="2026-03-12T23:43:37.148985616Z" level=error msg="Failed to destroy network for sandbox \"83ea6c4906e93f7c09200d50845d2a7a3705afbb30d523a3973b510dfe14b4dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.152239 containerd[1553]: time="2026-03-12T23:43:37.152184623Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d87956b8c-nbrdc,Uid:9e009c2a-43f1-473c-ac28-09966919e16f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ea6c4906e93f7c09200d50845d2a7a3705afbb30d523a3973b510dfe14b4dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.152656 kubelet[2789]: E0312 23:43:37.152599 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ea6c4906e93f7c09200d50845d2a7a3705afbb30d523a3973b510dfe14b4dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.152751 kubelet[2789]: E0312 23:43:37.152680 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ea6c4906e93f7c09200d50845d2a7a3705afbb30d523a3973b510dfe14b4dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d87956b8c-nbrdc" Mar 12 23:43:37.152751 kubelet[2789]: E0312 23:43:37.152699 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83ea6c4906e93f7c09200d50845d2a7a3705afbb30d523a3973b510dfe14b4dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d87956b8c-nbrdc" Mar 12 23:43:37.153947 kubelet[2789]: E0312 23:43:37.152799 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d87956b8c-nbrdc_calico-system(9e009c2a-43f1-473c-ac28-09966919e16f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d87956b8c-nbrdc_calico-system(9e009c2a-43f1-473c-ac28-09966919e16f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83ea6c4906e93f7c09200d50845d2a7a3705afbb30d523a3973b510dfe14b4dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d87956b8c-nbrdc" podUID="9e009c2a-43f1-473c-ac28-09966919e16f" Mar 12 23:43:37.173275 containerd[1553]: time="2026-03-12T23:43:37.173191171Z" level=error msg="Failed to destroy network for sandbox \"da1d1aa8bf01bfa6d909064ac3d1d157366fc7593358775bb7760b7cc4860ddd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.179888 containerd[1553]: time="2026-03-12T23:43:37.179035400Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-vdx54,Uid:94660390-b097-4115-a00d-194fc136d3e9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"da1d1aa8bf01bfa6d909064ac3d1d157366fc7593358775bb7760b7cc4860ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 
23:43:37.180052 kubelet[2789]: E0312 23:43:37.179315 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da1d1aa8bf01bfa6d909064ac3d1d157366fc7593358775bb7760b7cc4860ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.180052 kubelet[2789]: E0312 23:43:37.179369 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da1d1aa8bf01bfa6d909064ac3d1d157366fc7593358775bb7760b7cc4860ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-59f89dfd9f-vdx54" Mar 12 23:43:37.180052 kubelet[2789]: E0312 23:43:37.179388 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da1d1aa8bf01bfa6d909064ac3d1d157366fc7593358775bb7760b7cc4860ddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-59f89dfd9f-vdx54" Mar 12 23:43:37.180146 kubelet[2789]: E0312 23:43:37.179458 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59f89dfd9f-vdx54_calico-system(94660390-b097-4115-a00d-194fc136d3e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59f89dfd9f-vdx54_calico-system(94660390-b097-4115-a00d-194fc136d3e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"da1d1aa8bf01bfa6d909064ac3d1d157366fc7593358775bb7760b7cc4860ddd\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-59f89dfd9f-vdx54" podUID="94660390-b097-4115-a00d-194fc136d3e9" Mar 12 23:43:37.195429 containerd[1553]: time="2026-03-12T23:43:37.195148665Z" level=error msg="Failed to destroy network for sandbox \"2c0c81f9cc6647227237ed596bb464e73f371695f8db9bf31ff408002f9951a6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.197464 containerd[1553]: time="2026-03-12T23:43:37.197414764Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-7ht8g,Uid:54ede1dd-a4d1-40a1-a3b3-dd219f2c692d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c0c81f9cc6647227237ed596bb464e73f371695f8db9bf31ff408002f9951a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.197889 kubelet[2789]: E0312 23:43:37.197843 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c0c81f9cc6647227237ed596bb464e73f371695f8db9bf31ff408002f9951a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.197965 kubelet[2789]: E0312 23:43:37.197908 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c0c81f9cc6647227237ed596bb464e73f371695f8db9bf31ff408002f9951a6\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-7ht8g" Mar 12 23:43:37.197965 kubelet[2789]: E0312 23:43:37.197938 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c0c81f9cc6647227237ed596bb464e73f371695f8db9bf31ff408002f9951a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-7ht8g" Mar 12 23:43:37.198012 kubelet[2789]: E0312 23:43:37.197985 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-7ht8g_kube-system(54ede1dd-a4d1-40a1-a3b3-dd219f2c692d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-7ht8g_kube-system(54ede1dd-a4d1-40a1-a3b3-dd219f2c692d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c0c81f9cc6647227237ed596bb464e73f371695f8db9bf31ff408002f9951a6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-7ht8g" podUID="54ede1dd-a4d1-40a1-a3b3-dd219f2c692d" Mar 12 23:43:37.203192 containerd[1553]: time="2026-03-12T23:43:37.203071211Z" level=error msg="Failed to destroy network for sandbox \"22c20cb791914c59b170d5da603afdb37c3753121e8585459d1497b483967cad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.205290 containerd[1553]: time="2026-03-12T23:43:37.205199324Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-65fb8b5f9b-qmsvj,Uid:a839e77d-37bc-4f89-a3a9-2958c9cba9f2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c20cb791914c59b170d5da603afdb37c3753121e8585459d1497b483967cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.205636 kubelet[2789]: E0312 23:43:37.205603 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c20cb791914c59b170d5da603afdb37c3753121e8585459d1497b483967cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.205929 kubelet[2789]: E0312 23:43:37.205744 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c20cb791914c59b170d5da603afdb37c3753121e8585459d1497b483967cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65fb8b5f9b-qmsvj" Mar 12 23:43:37.205929 kubelet[2789]: E0312 23:43:37.205769 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22c20cb791914c59b170d5da603afdb37c3753121e8585459d1497b483967cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65fb8b5f9b-qmsvj" Mar 12 23:43:37.206677 kubelet[2789]: E0312 23:43:37.206405 2789 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65fb8b5f9b-qmsvj_calico-system(a839e77d-37bc-4f89-a3a9-2958c9cba9f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65fb8b5f9b-qmsvj_calico-system(a839e77d-37bc-4f89-a3a9-2958c9cba9f2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"22c20cb791914c59b170d5da603afdb37c3753121e8585459d1497b483967cad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65fb8b5f9b-qmsvj" podUID="a839e77d-37bc-4f89-a3a9-2958c9cba9f2" Mar 12 23:43:37.209049 containerd[1553]: time="2026-03-12T23:43:37.209009751Z" level=error msg="Failed to destroy network for sandbox \"6c2c775b2362a283d54716f30ca48ec6e0f391964726afcac1d7994da1c72708\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.211235 containerd[1553]: time="2026-03-12T23:43:37.210782098Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-fbrhp,Uid:438f4726-557c-45f8-a55a-8da149168ef8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c2c775b2362a283d54716f30ca48ec6e0f391964726afcac1d7994da1c72708\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.211460 kubelet[2789]: E0312 23:43:37.211419 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c2c775b2362a283d54716f30ca48ec6e0f391964726afcac1d7994da1c72708\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.211519 kubelet[2789]: E0312 23:43:37.211468 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c2c775b2362a283d54716f30ca48ec6e0f391964726afcac1d7994da1c72708\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-59f89dfd9f-fbrhp" Mar 12 23:43:37.211519 kubelet[2789]: E0312 23:43:37.211493 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c2c775b2362a283d54716f30ca48ec6e0f391964726afcac1d7994da1c72708\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-59f89dfd9f-fbrhp" Mar 12 23:43:37.212269 kubelet[2789]: E0312 23:43:37.211999 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59f89dfd9f-fbrhp_calico-system(438f4726-557c-45f8-a55a-8da149168ef8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59f89dfd9f-fbrhp_calico-system(438f4726-557c-45f8-a55a-8da149168ef8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c2c775b2362a283d54716f30ca48ec6e0f391964726afcac1d7994da1c72708\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-59f89dfd9f-fbrhp" podUID="438f4726-557c-45f8-a55a-8da149168ef8" Mar 12 23:43:37.224834 
containerd[1553]: time="2026-03-12T23:43:37.224753653Z" level=error msg="Failed to destroy network for sandbox \"8d557deb3ab759861fa3d7f97bf3b4c52a1291274fe328fa0a4ce7bac5fe74ab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.226484 containerd[1553]: time="2026-03-12T23:43:37.226144117Z" level=error msg="Failed to destroy network for sandbox \"21ea02238e6fba167ac563139cdb86b27d2c55df914340d6d12dfbb9a115dab3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.226743 containerd[1553]: time="2026-03-12T23:43:37.226712662Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-tqf8l,Uid:cbf90c03-2c31-4265-97a4-01480ee945ec,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d557deb3ab759861fa3d7f97bf3b4c52a1291274fe328fa0a4ce7bac5fe74ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.227226 kubelet[2789]: E0312 23:43:37.227071 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d557deb3ab759861fa3d7f97bf3b4c52a1291274fe328fa0a4ce7bac5fe74ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.227380 kubelet[2789]: E0312 23:43:37.227358 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8d557deb3ab759861fa3d7f97bf3b4c52a1291274fe328fa0a4ce7bac5fe74ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-tqf8l" Mar 12 23:43:37.227482 kubelet[2789]: E0312 23:43:37.227459 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d557deb3ab759861fa3d7f97bf3b4c52a1291274fe328fa0a4ce7bac5fe74ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-tqf8l" Mar 12 23:43:37.227626 kubelet[2789]: E0312 23:43:37.227596 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-tqf8l_calico-system(cbf90c03-2c31-4265-97a4-01480ee945ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-tqf8l_calico-system(cbf90c03-2c31-4265-97a4-01480ee945ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d557deb3ab759861fa3d7f97bf3b4c52a1291274fe328fa0a4ce7bac5fe74ab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-tqf8l" podUID="cbf90c03-2c31-4265-97a4-01480ee945ec" Mar 12 23:43:37.228419 containerd[1553]: time="2026-03-12T23:43:37.228332983Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-cdwgt,Uid:76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"21ea02238e6fba167ac563139cdb86b27d2c55df914340d6d12dfbb9a115dab3\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.228925 kubelet[2789]: E0312 23:43:37.228741 2789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21ea02238e6fba167ac563139cdb86b27d2c55df914340d6d12dfbb9a115dab3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:37.228925 kubelet[2789]: E0312 23:43:37.228788 2789 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21ea02238e6fba167ac563139cdb86b27d2c55df914340d6d12dfbb9a115dab3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-cdwgt" Mar 12 23:43:37.228925 kubelet[2789]: E0312 23:43:37.228806 2789 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21ea02238e6fba167ac563139cdb86b27d2c55df914340d6d12dfbb9a115dab3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-cdwgt" Mar 12 23:43:37.229046 kubelet[2789]: E0312 23:43:37.228869 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-cdwgt_kube-system(76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-cdwgt_kube-system(76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"21ea02238e6fba167ac563139cdb86b27d2c55df914340d6d12dfbb9a115dab3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-cdwgt" podUID="76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6" Mar 12 23:43:37.544154 containerd[1553]: time="2026-03-12T23:43:37.544116652Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 12 23:43:37.552461 containerd[1553]: time="2026-03-12T23:43:37.552341608Z" level=info msg="Container 5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:37.566429 containerd[1553]: time="2026-03-12T23:43:37.566364838Z" level=info msg="CreateContainer within sandbox \"2082df8f65f8ec314123e2319b5aa98a13ab57d0777187166510140c698420bc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68\"" Mar 12 23:43:37.569000 containerd[1553]: time="2026-03-12T23:43:37.568960825Z" level=info msg="StartContainer for \"5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68\"" Mar 12 23:43:37.574688 containerd[1553]: time="2026-03-12T23:43:37.574609873Z" level=info msg="connecting to shim 5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68" address="unix:///run/containerd/s/75b657ec884074c3c980df5a3e814b79b7bf4de7a35498e2e5906d91ca6cade2" protocol=ttrpc version=3 Mar 12 23:43:37.598036 systemd[1]: Started cri-containerd-5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68.scope - libcontainer container 5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68. 
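Every RunPodSandbox failure above shares one root cause: the Calico CNI plugin stats `/var/lib/calico/nodename`, a file that the calico/node container writes once it is up, and that container is only started at 23:43:37.598. Until the file exists, kubelet keeps retrying CreatePodSandbox with backoff, which is why each pending pod emits the identical chain log.go → kuberuntime_sandbox.go → kuberuntime_manager.go → pod_workers.go. A hedged sketch of the check implied by the error string (illustrative Python, not the plugin's actual Go code):

```python
import os

def check_calico_ready(nodename_file: str = "/var/lib/calico/nodename") -> str:
    """Mirror the readiness check described by the log's error message."""
    try:
        with open(nodename_file) as f:
            # calico/node writes the node name here after startup
            return f"calico ready on node {f.read().strip()}"
    except FileNotFoundError:
        # This is the condition behind every sandbox failure above
        return (f"stat {nodename_file}: no such file or directory: check that "
                "the calico/node container is running and has mounted /var/lib/calico/")
```

Once the calico-node container started above finishes initializing and creates the file, the retried sandboxes succeed on a later attempt.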
Mar 12 23:43:37.677780 containerd[1553]: time="2026-03-12T23:43:37.677730078Z" level=info msg="StartContainer for \"5335d56f4bb4db36dd14fb6e21299d096d3d2a8be3850051285cac31c0d7bd68\" returns successfully" Mar 12 23:43:37.886391 kubelet[2789]: I0312 23:43:37.886274 2789 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-ca-bundle\") pod \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " Mar 12 23:43:37.886391 kubelet[2789]: I0312 23:43:37.886330 2789 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-nginx-config\" (UniqueName: \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-nginx-config\") pod \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " Mar 12 23:43:37.886391 kubelet[2789]: I0312 23:43:37.886363 2789 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-backend-key-pair\") pod \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " Mar 12 23:43:37.886391 kubelet[2789]: I0312 23:43:37.886390 2789 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-kube-api-access-mz922\" (UniqueName: \"kubernetes.io/projected/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-kube-api-access-mz922\") pod \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\" (UID: \"a839e77d-37bc-4f89-a3a9-2958c9cba9f2\") " Mar 12 23:43:37.888156 kubelet[2789]: I0312 23:43:37.887266 2789 operation_generator.go:779] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-nginx-config" pod "a839e77d-37bc-4f89-a3a9-2958c9cba9f2" (UID: "a839e77d-37bc-4f89-a3a9-2958c9cba9f2"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:43:37.888156 kubelet[2789]: I0312 23:43:37.887481 2789 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-ca-bundle" pod "a839e77d-37bc-4f89-a3a9-2958c9cba9f2" (UID: "a839e77d-37bc-4f89-a3a9-2958c9cba9f2"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:43:37.894756 systemd[1]: var-lib-kubelet-pods-a839e77d\x2d37bc\x2d4f89\x2da3a9\x2d2958c9cba9f2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmz922.mount: Deactivated successfully. Mar 12 23:43:37.899007 systemd[1]: var-lib-kubelet-pods-a839e77d\x2d37bc\x2d4f89\x2da3a9\x2d2958c9cba9f2-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 12 23:43:37.900608 kubelet[2789]: I0312 23:43:37.900131 2789 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-kube-api-access-mz922" pod "a839e77d-37bc-4f89-a3a9-2958c9cba9f2" (UID: "a839e77d-37bc-4f89-a3a9-2958c9cba9f2"). InnerVolumeSpecName "kube-api-access-mz922". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 23:43:37.900608 kubelet[2789]: I0312 23:43:37.900340 2789 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-backend-key-pair" pod "a839e77d-37bc-4f89-a3a9-2958c9cba9f2" (UID: "a839e77d-37bc-4f89-a3a9-2958c9cba9f2"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 23:43:37.986838 kubelet[2789]: I0312 23:43:37.986757 2789 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-nginx-config\") on node \"ci-4459-2-4-n-69ffcbf899\" DevicePath \"\"" Mar 12 23:43:37.987767 kubelet[2789]: I0312 23:43:37.987743 2789 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-69ffcbf899\" DevicePath \"\"" Mar 12 23:43:37.987959 kubelet[2789]: I0312 23:43:37.987921 2789 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mz922\" (UniqueName: \"kubernetes.io/projected/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-kube-api-access-mz922\") on node \"ci-4459-2-4-n-69ffcbf899\" DevicePath \"\"" Mar 12 23:43:37.987959 kubelet[2789]: I0312 23:43:37.987937 2789 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a839e77d-37bc-4f89-a3a9-2958c9cba9f2-whisker-ca-bundle\") on node \"ci-4459-2-4-n-69ffcbf899\" DevicePath \"\"" Mar 12 23:43:38.382526 systemd[1]: Created slice kubepods-besteffort-podf06e482b_f74f_4302_8898_fd2753a17184.slice - libcontainer container kubepods-besteffort-podf06e482b_f74f_4302_8898_fd2753a17184.slice. Mar 12 23:43:38.388174 containerd[1553]: time="2026-03-12T23:43:38.388056735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-szgbn,Uid:f06e482b-f74f-4302-8898-fd2753a17184,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:38.555269 systemd[1]: Removed slice kubepods-besteffort-poda839e77d_37bc_4f89_a3a9_2958c9cba9f2.slice - libcontainer container kubepods-besteffort-poda839e77d_37bc_4f89_a3a9_2958c9cba9f2.slice. 
Mar 12 23:43:38.591658 kubelet[2789]: I0312 23:43:38.591600 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-8kgl2" podStartSLOduration=1.745402441 podStartE2EDuration="14.591586535s" podCreationTimestamp="2026-03-12 23:43:24 +0000 UTC" firstStartedPulling="2026-03-12 23:43:24.688110598 +0000 UTC m=+21.477913359" lastFinishedPulling="2026-03-12 23:43:37.534294692 +0000 UTC m=+34.324097453" observedRunningTime="2026-03-12 23:43:38.571130541 +0000 UTC m=+35.360933302" watchObservedRunningTime="2026-03-12 23:43:38.591586535 +0000 UTC m=+35.381389256" Mar 12 23:43:38.657966 systemd-networkd[1432]: cali886de5907b2: Link UP Mar 12 23:43:38.658208 systemd-networkd[1432]: cali886de5907b2: Gained carrier Mar 12 23:43:38.675536 systemd[1]: Created slice kubepods-besteffort-podbe04fb10_f644_4f15_a75f_16e4ee3a8952.slice - libcontainer container kubepods-besteffort-podbe04fb10_f644_4f15_a75f_16e4ee3a8952.slice. Mar 12 23:43:38.693409 kubelet[2789]: I0312 23:43:38.693191 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be04fb10-f644-4f15-a75f-16e4ee3a8952-whisker-ca-bundle\") pod \"whisker-56bd795645-b4dtj\" (UID: \"be04fb10-f644-4f15-a75f-16e4ee3a8952\") " pod="calico-system/whisker-56bd795645-b4dtj" Mar 12 23:43:38.693409 kubelet[2789]: I0312 23:43:38.693250 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/be04fb10-f644-4f15-a75f-16e4ee3a8952-nginx-config\") pod \"whisker-56bd795645-b4dtj\" (UID: \"be04fb10-f644-4f15-a75f-16e4ee3a8952\") " pod="calico-system/whisker-56bd795645-b4dtj" Mar 12 23:43:38.693409 kubelet[2789]: I0312 23:43:38.693313 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/be04fb10-f644-4f15-a75f-16e4ee3a8952-whisker-backend-key-pair\") pod \"whisker-56bd795645-b4dtj\" (UID: \"be04fb10-f644-4f15-a75f-16e4ee3a8952\") " pod="calico-system/whisker-56bd795645-b4dtj" Mar 12 23:43:38.693409 kubelet[2789]: I0312 23:43:38.693329 2789 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw8tw\" (UniqueName: \"kubernetes.io/projected/be04fb10-f644-4f15-a75f-16e4ee3a8952-kube-api-access-jw8tw\") pod \"whisker-56bd795645-b4dtj\" (UID: \"be04fb10-f644-4f15-a75f-16e4ee3a8952\") " pod="calico-system/whisker-56bd795645-b4dtj" Mar 12 23:43:38.694268 containerd[1553]: 2026-03-12 23:43:38.419 [ERROR][3846] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 23:43:38.694268 containerd[1553]: 2026-03-12 23:43:38.440 [INFO][3846] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0 csi-node-driver- calico-system f06e482b-f74f-4302-8898-fd2753a17184 742 0 2026-03-12 23:43:24 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 csi-node-driver-szgbn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali886de5907b2 [] [] }} ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-" Mar 12 23:43:38.694268 containerd[1553]: 2026-03-12 23:43:38.441 [INFO][3846] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.694268 containerd[1553]: 2026-03-12 23:43:38.503 [INFO][3857] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" HandleID="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Workload="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.518 [INFO][3857] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" HandleID="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Workload="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400041c010), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"csi-node-driver-szgbn", "timestamp":"2026-03-12 23:43:38.503898458 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000fd340)} Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.518 [INFO][3857] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.518 [INFO][3857] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.518 [INFO][3857] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.576 [INFO][3857] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.585 [INFO][3857] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.599 [INFO][3857] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.605 [INFO][3857] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694524 containerd[1553]: 2026-03-12 23:43:38.611 [INFO][3857] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.612 [INFO][3857] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.618 [INFO][3857] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.625 [INFO][3857] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.637 [INFO][3857] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.97.129/26] block=192.168.97.128/26 handle="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.637 [INFO][3857] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.129/26] handle="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.637 [INFO][3857] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:38.694700 containerd[1553]: 2026-03-12 23:43:38.637 [INFO][3857] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.129/26] IPv6=[] ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" HandleID="k8s-pod-network.44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Workload="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.694838 containerd[1553]: 2026-03-12 23:43:38.640 [INFO][3846] cni-plugin/k8s.go 418: Populated endpoint ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f06e482b-f74f-4302-8898-fd2753a17184", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"csi-node-driver-szgbn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.97.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali886de5907b2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:38.694898 containerd[1553]: 2026-03-12 23:43:38.640 [INFO][3846] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.129/32] ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.694898 containerd[1553]: 2026-03-12 23:43:38.641 [INFO][3846] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali886de5907b2 ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.694898 containerd[1553]: 2026-03-12 23:43:38.659 [INFO][3846] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.694956 
containerd[1553]: 2026-03-12 23:43:38.660 [INFO][3846] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f06e482b-f74f-4302-8898-fd2753a17184", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef", Pod:"csi-node-driver-szgbn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.97.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali886de5907b2", MAC:"96:5c:0b:17:0b:03", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:38.695004 containerd[1553]: 
2026-03-12 23:43:38.686 [INFO][3846] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" Namespace="calico-system" Pod="csi-node-driver-szgbn" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-csi--node--driver--szgbn-eth0" Mar 12 23:43:38.722401 containerd[1553]: time="2026-03-12T23:43:38.722352842Z" level=info msg="connecting to shim 44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef" address="unix:///run/containerd/s/89e76a28260e90a663e9a0cda0ea148da276539294020ef70d250c71ab1c531a" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:38.748015 systemd[1]: Started cri-containerd-44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef.scope - libcontainer container 44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef. Mar 12 23:43:38.778337 containerd[1553]: time="2026-03-12T23:43:38.778290646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-szgbn,Uid:f06e482b-f74f-4302-8898-fd2753a17184,Namespace:calico-system,Attempt:0,} returns sandbox id \"44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef\"" Mar 12 23:43:38.781239 containerd[1553]: time="2026-03-12T23:43:38.781011116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 12 23:43:38.986520 containerd[1553]: time="2026-03-12T23:43:38.985996662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56bd795645-b4dtj,Uid:be04fb10-f644-4f15-a75f-16e4ee3a8952,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:39.123278 systemd-networkd[1432]: calia30c4877d3a: Link UP Mar 12 23:43:39.126800 systemd-networkd[1432]: calia30c4877d3a: Gained carrier Mar 12 23:43:39.163719 containerd[1553]: 2026-03-12 23:43:39.015 [ERROR][3922] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 
23:43:39.163719 containerd[1553]: 2026-03-12 23:43:39.031 [INFO][3922] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0 whisker-56bd795645- calico-system be04fb10-f644-4f15-a75f-16e4ee3a8952 925 0 2026-03-12 23:43:38 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:56bd795645 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 whisker-56bd795645-b4dtj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia30c4877d3a [] [] }} ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-" Mar 12 23:43:39.163719 containerd[1553]: 2026-03-12 23:43:39.031 [INFO][3922] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.163719 containerd[1553]: 2026-03-12 23:43:39.063 [INFO][3935] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" HandleID="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Workload="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.077 [INFO][3935] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" HandleID="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" 
Workload="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"whisker-56bd795645-b4dtj", "timestamp":"2026-03-12 23:43:39.063339302 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030edc0)} Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.077 [INFO][3935] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.077 [INFO][3935] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.077 [INFO][3935] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.080 [INFO][3935] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.086 [INFO][3935] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.093 [INFO][3935] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.096 [INFO][3935] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.163960 containerd[1553]: 2026-03-12 23:43:39.100 [INFO][3935] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 
host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.100 [INFO][3935] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.102 [INFO][3935] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0 Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.107 [INFO][3935] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.114 [INFO][3935] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.97.130/26] block=192.168.97.128/26 handle="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.114 [INFO][3935] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.130/26] handle="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.114 [INFO][3935] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 23:43:39.164153 containerd[1553]: 2026-03-12 23:43:39.114 [INFO][3935] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.130/26] IPv6=[] ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" HandleID="k8s-pod-network.603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Workload="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.164272 containerd[1553]: 2026-03-12 23:43:39.120 [INFO][3922] cni-plugin/k8s.go 418: Populated endpoint ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0", GenerateName:"whisker-56bd795645-", Namespace:"calico-system", SelfLink:"", UID:"be04fb10-f644-4f15-a75f-16e4ee3a8952", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"56bd795645", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"whisker-56bd795645-b4dtj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.97.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calia30c4877d3a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:39.164272 containerd[1553]: 2026-03-12 23:43:39.120 [INFO][3922] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.130/32] ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.164339 containerd[1553]: 2026-03-12 23:43:39.120 [INFO][3922] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia30c4877d3a ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.164339 containerd[1553]: 2026-03-12 23:43:39.128 [INFO][3922] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.164382 containerd[1553]: 2026-03-12 23:43:39.129 [INFO][3922] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0", GenerateName:"whisker-56bd795645-", Namespace:"calico-system", SelfLink:"", 
UID:"be04fb10-f644-4f15-a75f-16e4ee3a8952", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"56bd795645", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0", Pod:"whisker-56bd795645-b4dtj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.97.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia30c4877d3a", MAC:"12:bf:f9:26:4f:74", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:39.164427 containerd[1553]: 2026-03-12 23:43:39.146 [INFO][3922] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" Namespace="calico-system" Pod="whisker-56bd795645-b4dtj" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-whisker--56bd795645--b4dtj-eth0" Mar 12 23:43:39.194085 containerd[1553]: time="2026-03-12T23:43:39.193971561Z" level=info msg="connecting to shim 603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0" address="unix:///run/containerd/s/508b1261d7cc94ba1c90553c173a2b92bab0e7b53de80a50efe37060d509b159" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:39.239411 systemd[1]: Started 
cri-containerd-603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0.scope - libcontainer container 603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0. Mar 12 23:43:39.317150 containerd[1553]: time="2026-03-12T23:43:39.317010238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56bd795645-b4dtj,Uid:be04fb10-f644-4f15-a75f-16e4ee3a8952,Namespace:calico-system,Attempt:0,} returns sandbox id \"603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0\"" Mar 12 23:43:39.382091 kubelet[2789]: I0312 23:43:39.382050 2789 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="a839e77d-37bc-4f89-a3a9-2958c9cba9f2" path="/var/lib/kubelet/pods/a839e77d-37bc-4f89-a3a9-2958c9cba9f2/volumes" Mar 12 23:43:39.552033 kubelet[2789]: I0312 23:43:39.551247 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:43:40.022568 systemd-networkd[1432]: vxlan.calico: Link UP Mar 12 23:43:40.022576 systemd-networkd[1432]: vxlan.calico: Gained carrier Mar 12 23:43:40.660087 systemd-networkd[1432]: cali886de5907b2: Gained IPv6LL Mar 12 23:43:40.862849 containerd[1553]: time="2026-03-12T23:43:40.862486569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:40.864153 containerd[1553]: time="2026-03-12T23:43:40.864122796Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 12 23:43:40.865335 containerd[1553]: time="2026-03-12T23:43:40.865291540Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:40.868849 containerd[1553]: time="2026-03-12T23:43:40.868736059Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:40.869330 containerd[1553]: time="2026-03-12T23:43:40.869304533Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 2.08825614s" Mar 12 23:43:40.869417 containerd[1553]: time="2026-03-12T23:43:40.869403885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 12 23:43:40.871789 containerd[1553]: time="2026-03-12T23:43:40.871503793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 12 23:43:40.877857 containerd[1553]: time="2026-03-12T23:43:40.877800439Z" level=info msg="CreateContainer within sandbox \"44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 12 23:43:40.890994 containerd[1553]: time="2026-03-12T23:43:40.889013124Z" level=info msg="Container 69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:40.909386 containerd[1553]: time="2026-03-12T23:43:40.909317426Z" level=info msg="CreateContainer within sandbox \"44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe\"" Mar 12 23:43:40.910225 containerd[1553]: time="2026-03-12T23:43:40.910186275Z" level=info msg="StartContainer for \"69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe\"" Mar 12 23:43:40.912684 containerd[1553]: time="2026-03-12T23:43:40.912596639Z" level=info msg="connecting to shim 
69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe" address="unix:///run/containerd/s/89e76a28260e90a663e9a0cda0ea148da276539294020ef70d250c71ab1c531a" protocol=ttrpc version=3 Mar 12 23:43:40.937489 systemd[1]: Started cri-containerd-69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe.scope - libcontainer container 69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe. Mar 12 23:43:41.030133 containerd[1553]: time="2026-03-12T23:43:41.030051077Z" level=info msg="StartContainer for \"69141822cc43ad033d7fc12ca35e593fade1bfc960684d8df72b1d2b8adb89fe\" returns successfully" Mar 12 23:43:41.172158 systemd-networkd[1432]: calia30c4877d3a: Gained IPv6LL Mar 12 23:43:41.748183 systemd-networkd[1432]: vxlan.calico: Gained IPv6LL Mar 12 23:43:42.209613 containerd[1553]: time="2026-03-12T23:43:42.209232713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:42.211401 containerd[1553]: time="2026-03-12T23:43:42.211201772Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 12 23:43:42.212877 containerd[1553]: time="2026-03-12T23:43:42.212803217Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:42.216739 containerd[1553]: time="2026-03-12T23:43:42.215775444Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:42.216739 containerd[1553]: time="2026-03-12T23:43:42.216487633Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag 
\"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.344685823s" Mar 12 23:43:42.216739 containerd[1553]: time="2026-03-12T23:43:42.216520550Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 12 23:43:42.218504 containerd[1553]: time="2026-03-12T23:43:42.218463691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 12 23:43:42.224444 containerd[1553]: time="2026-03-12T23:43:42.224408904Z" level=info msg="CreateContainer within sandbox \"603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 12 23:43:42.237482 containerd[1553]: time="2026-03-12T23:43:42.236129544Z" level=info msg="Container 2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:42.243138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2688974019.mount: Deactivated successfully. 
Mar 12 23:43:42.249620 containerd[1553]: time="2026-03-12T23:43:42.249563540Z" level=info msg="CreateContainer within sandbox \"603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940\"" Mar 12 23:43:42.250525 containerd[1553]: time="2026-03-12T23:43:42.250422198Z" level=info msg="StartContainer for \"2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940\"" Mar 12 23:43:42.252512 containerd[1553]: time="2026-03-12T23:43:42.252461052Z" level=info msg="connecting to shim 2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940" address="unix:///run/containerd/s/508b1261d7cc94ba1c90553c173a2b92bab0e7b53de80a50efe37060d509b159" protocol=ttrpc version=3 Mar 12 23:43:42.279496 systemd[1]: Started cri-containerd-2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940.scope - libcontainer container 2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940. 
Mar 12 23:43:42.330219 containerd[1553]: time="2026-03-12T23:43:42.330152518Z" level=info msg="StartContainer for \"2d5bab60fb09f8d395fe85676260bd7ca98fc24ba8efb1404b206044d66d7940\" returns successfully" Mar 12 23:43:43.865501 containerd[1553]: time="2026-03-12T23:43:43.865439646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:43.866918 containerd[1553]: time="2026-03-12T23:43:43.866858271Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 12 23:43:43.867687 containerd[1553]: time="2026-03-12T23:43:43.867604021Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:43.871013 containerd[1553]: time="2026-03-12T23:43:43.870958357Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:43.871680 containerd[1553]: time="2026-03-12T23:43:43.871623312Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.653126823s" Mar 12 23:43:43.871737 containerd[1553]: time="2026-03-12T23:43:43.871658190Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 12 23:43:43.873168 containerd[1553]: 
time="2026-03-12T23:43:43.873066935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 12 23:43:43.879275 containerd[1553]: time="2026-03-12T23:43:43.879225842Z" level=info msg="CreateContainer within sandbox \"44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 12 23:43:43.892096 containerd[1553]: time="2026-03-12T23:43:43.891477861Z" level=info msg="Container 5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:43.913333 containerd[1553]: time="2026-03-12T23:43:43.913190166Z" level=info msg="CreateContainer within sandbox \"44c4b07e2c7ef23ef2eb9593ce18f70bce6ee3f90143909e501a012c6bb10bef\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf\"" Mar 12 23:43:43.914456 containerd[1553]: time="2026-03-12T23:43:43.914425803Z" level=info msg="StartContainer for \"5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf\"" Mar 12 23:43:43.916391 containerd[1553]: time="2026-03-12T23:43:43.916343314Z" level=info msg="connecting to shim 5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf" address="unix:///run/containerd/s/89e76a28260e90a663e9a0cda0ea148da276539294020ef70d250c71ab1c531a" protocol=ttrpc version=3 Mar 12 23:43:43.938060 systemd[1]: Started cri-containerd-5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf.scope - libcontainer container 5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf. 
Mar 12 23:43:44.013504 containerd[1553]: time="2026-03-12T23:43:44.013430904Z" level=info msg="StartContainer for \"5107decaacdf38275d7cf8ac348226e755061136aa1c65a5e637d55a50c193cf\" returns successfully" Mar 12 23:43:44.441851 kubelet[2789]: I0312 23:43:44.441773 2789 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 12 23:43:44.441851 kubelet[2789]: I0312 23:43:44.441812 2789 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 12 23:43:44.593679 kubelet[2789]: I0312 23:43:44.593465 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-szgbn" podStartSLOduration=15.501216435 podStartE2EDuration="20.593446635s" podCreationTimestamp="2026-03-12 23:43:24 +0000 UTC" firstStartedPulling="2026-03-12 23:43:38.780702064 +0000 UTC m=+35.570504825" lastFinishedPulling="2026-03-12 23:43:43.872932224 +0000 UTC m=+40.662735025" observedRunningTime="2026-03-12 23:43:44.592212632 +0000 UTC m=+41.382015433" watchObservedRunningTime="2026-03-12 23:43:44.593446635 +0000 UTC m=+41.383249436" Mar 12 23:43:45.312563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2568741657.mount: Deactivated successfully. 
Mar 12 23:43:45.334599 containerd[1553]: time="2026-03-12T23:43:45.334345512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:45.336367 containerd[1553]: time="2026-03-12T23:43:45.336330197Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 12 23:43:45.340145 containerd[1553]: time="2026-03-12T23:43:45.338790574Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:45.342929 containerd[1553]: time="2026-03-12T23:43:45.342884377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:45.346090 containerd[1553]: time="2026-03-12T23:43:45.345958838Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 1.472855506s" Mar 12 23:43:45.346090 containerd[1553]: time="2026-03-12T23:43:45.346070912Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 12 23:43:45.355766 containerd[1553]: time="2026-03-12T23:43:45.355199462Z" level=info msg="CreateContainer within sandbox \"603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 12 23:43:45.366110 
containerd[1553]: time="2026-03-12T23:43:45.366063791Z" level=info msg="Container 1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:45.384257 containerd[1553]: time="2026-03-12T23:43:45.383931034Z" level=info msg="CreateContainer within sandbox \"603dbb83fe62dcecfbe8ad78881872abdab3dd5fc5faf86d039e212131e6e2e0\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03\"" Mar 12 23:43:45.389346 containerd[1553]: time="2026-03-12T23:43:45.389032938Z" level=info msg="StartContainer for \"1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03\"" Mar 12 23:43:45.392840 containerd[1553]: time="2026-03-12T23:43:45.392412622Z" level=info msg="connecting to shim 1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03" address="unix:///run/containerd/s/508b1261d7cc94ba1c90553c173a2b92bab0e7b53de80a50efe37060d509b159" protocol=ttrpc version=3 Mar 12 23:43:45.423180 systemd[1]: Started cri-containerd-1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03.scope - libcontainer container 1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03. 
Mar 12 23:43:45.464979 containerd[1553]: time="2026-03-12T23:43:45.464919614Z" level=info msg="StartContainer for \"1001454572766e4fb4e52033d3ddf519998766455f4af90fd6db1eb1649e4e03\" returns successfully" Mar 12 23:43:45.601803 kubelet[2789]: I0312 23:43:45.601642 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-56bd795645-b4dtj" podStartSLOduration=1.5727969320000001 podStartE2EDuration="7.600522424s" podCreationTimestamp="2026-03-12 23:43:38 +0000 UTC" firstStartedPulling="2026-03-12 23:43:39.319144893 +0000 UTC m=+36.108947654" lastFinishedPulling="2026-03-12 23:43:45.346870385 +0000 UTC m=+42.136673146" observedRunningTime="2026-03-12 23:43:45.599099946 +0000 UTC m=+42.388902747" watchObservedRunningTime="2026-03-12 23:43:45.600522424 +0000 UTC m=+42.390325225" Mar 12 23:43:50.378826 containerd[1553]: time="2026-03-12T23:43:50.378758332Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-7ht8g,Uid:54ede1dd-a4d1-40a1-a3b3-dd219f2c692d,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:50.535022 systemd-networkd[1432]: calie8101e1b605: Link UP Mar 12 23:43:50.535940 systemd-networkd[1432]: calie8101e1b605: Gained carrier Mar 12 23:43:50.561256 containerd[1553]: 2026-03-12 23:43:50.437 [INFO][4400] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0 coredns-7d764666f9- kube-system 54ede1dd-a4d1-40a1-a3b3-dd219f2c692d 861 0 2026-03-12 23:43:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 coredns-7d764666f9-7ht8g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie8101e1b605 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 
0 }] [] }} ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-" Mar 12 23:43:50.561256 containerd[1553]: 2026-03-12 23:43:50.437 [INFO][4400] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.561256 containerd[1553]: 2026-03-12 23:43:50.467 [INFO][4413] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" HandleID="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Workload="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.480 [INFO][4413] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" HandleID="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Workload="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbe60), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"coredns-7d764666f9-7ht8g", "timestamp":"2026-03-12 23:43:50.467481409 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002db760)} Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.480 [INFO][4413] ipam/ipam_plugin.go 438: About to acquire 
host-wide IPAM lock. Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.480 [INFO][4413] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.480 [INFO][4413] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.484 [INFO][4413] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.491 [INFO][4413] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.500 [INFO][4413] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.502 [INFO][4413] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.561532 containerd[1553]: 2026-03-12 23:43:50.505 [INFO][4413] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.505 [INFO][4413] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.507 [INFO][4413] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4 Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.514 [INFO][4413] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 
handle="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.523 [INFO][4413] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.97.131/26] block=192.168.97.128/26 handle="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.523 [INFO][4413] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.131/26] handle="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.523 [INFO][4413] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:50.562985 containerd[1553]: 2026-03-12 23:43:50.523 [INFO][4413] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.131/26] IPv6=[] ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" HandleID="k8s-pod-network.5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Workload="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.563128 containerd[1553]: 2026-03-12 23:43:50.527 [INFO][4400] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"54ede1dd-a4d1-40a1-a3b3-dd219f2c692d", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 9, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"coredns-7d764666f9-7ht8g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie8101e1b605", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:50.563128 containerd[1553]: 2026-03-12 23:43:50.528 [INFO][4400] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.131/32] ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" 
WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.563128 containerd[1553]: 2026-03-12 23:43:50.528 [INFO][4400] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8101e1b605 ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.563128 containerd[1553]: 2026-03-12 23:43:50.537 [INFO][4400] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.563128 containerd[1553]: 2026-03-12 23:43:50.538 [INFO][4400] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"54ede1dd-a4d1-40a1-a3b3-dd219f2c692d", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4", Pod:"coredns-7d764666f9-7ht8g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie8101e1b605", MAC:"0e:d9:02:97:24:ef", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:50.563625 containerd[1553]: 2026-03-12 23:43:50.553 [INFO][4400] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" Namespace="kube-system" Pod="coredns-7d764666f9-7ht8g" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--7ht8g-eth0" Mar 12 23:43:50.599532 containerd[1553]: time="2026-03-12T23:43:50.599341730Z" level=info msg="connecting to shim 5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4" 
address="unix:///run/containerd/s/452a024c6ab46942675cd5f91ae5ff6f5afd7682b958f360bbd10f3e2fbee57e" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:50.635489 systemd[1]: Started cri-containerd-5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4.scope - libcontainer container 5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4. Mar 12 23:43:50.690970 containerd[1553]: time="2026-03-12T23:43:50.690790704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-7ht8g,Uid:54ede1dd-a4d1-40a1-a3b3-dd219f2c692d,Namespace:kube-system,Attempt:0,} returns sandbox id \"5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4\"" Mar 12 23:43:50.698208 containerd[1553]: time="2026-03-12T23:43:50.698143905Z" level=info msg="CreateContainer within sandbox \"5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:43:50.713844 containerd[1553]: time="2026-03-12T23:43:50.712124135Z" level=info msg="Container 9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:50.722489 containerd[1553]: time="2026-03-12T23:43:50.722439944Z" level=info msg="CreateContainer within sandbox \"5d17d683417e0c371c53d3f0cd2e4b1545b07570110339d78679f8f74fb2a3a4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0\"" Mar 12 23:43:50.724160 containerd[1553]: time="2026-03-12T23:43:50.724129280Z" level=info msg="StartContainer for \"9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0\"" Mar 12 23:43:50.727495 containerd[1553]: time="2026-03-12T23:43:50.727368357Z" level=info msg="connecting to shim 9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0" address="unix:///run/containerd/s/452a024c6ab46942675cd5f91ae5ff6f5afd7682b958f360bbd10f3e2fbee57e" protocol=ttrpc version=3 Mar 12 
23:43:50.751071 systemd[1]: Started cri-containerd-9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0.scope - libcontainer container 9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0. Mar 12 23:43:50.786237 containerd[1553]: time="2026-03-12T23:43:50.786199567Z" level=info msg="StartContainer for \"9f3bd69bbf25da2630142d74c00a0149fc7b7f84b0a85a4766185c8abd357ba0\" returns successfully" Mar 12 23:43:51.377748 containerd[1553]: time="2026-03-12T23:43:51.377684123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-cdwgt,Uid:76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:51.525810 systemd-networkd[1432]: cali4bee907e2a4: Link UP Mar 12 23:43:51.526632 systemd-networkd[1432]: cali4bee907e2a4: Gained carrier Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.431 [INFO][4523] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0 coredns-7d764666f9- kube-system 76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6 863 0 2026-03-12 23:43:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 coredns-7d764666f9-cdwgt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4bee907e2a4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.431 [INFO][4523] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.464 [INFO][4535] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" HandleID="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Workload="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.477 [INFO][4535] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" HandleID="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Workload="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ed4b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"coredns-7d764666f9-cdwgt", "timestamp":"2026-03-12 23:43:51.464162401 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400030d080)} Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.478 [INFO][4535] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.478 [INFO][4535] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.478 [INFO][4535] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.480 [INFO][4535] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.488 [INFO][4535] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.494 [INFO][4535] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.496 [INFO][4535] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.500 [INFO][4535] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.501 [INFO][4535] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.503 [INFO][4535] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012 Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.509 [INFO][4535] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.518 [INFO][4535] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.97.132/26] block=192.168.97.128/26 handle="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.518 [INFO][4535] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.132/26] handle="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.518 [INFO][4535] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:51.550333 containerd[1553]: 2026-03-12 23:43:51.518 [INFO][4535] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.132/26] IPv6=[] ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" HandleID="k8s-pod-network.a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Workload="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.551547 containerd[1553]: 2026-03-12 23:43:51.520 [INFO][4523] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"coredns-7d764666f9-cdwgt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bee907e2a4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:51.551547 containerd[1553]: 2026-03-12 23:43:51.521 [INFO][4523] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.132/32] ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.551547 containerd[1553]: 2026-03-12 23:43:51.521 [INFO][4523] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4bee907e2a4 
ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.551547 containerd[1553]: 2026-03-12 23:43:51.525 [INFO][4523] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.551547 containerd[1553]: 2026-03-12 23:43:51.528 [INFO][4523] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", 
ContainerID:"a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012", Pod:"coredns-7d764666f9-cdwgt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bee907e2a4", MAC:"16:a5:51:a7:3a:e5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:51.551842 containerd[1553]: 2026-03-12 23:43:51.547 [INFO][4523] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" Namespace="kube-system" Pod="coredns-7d764666f9-cdwgt" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-coredns--7d764666f9--cdwgt-eth0" Mar 12 23:43:51.584634 containerd[1553]: time="2026-03-12T23:43:51.584576916Z" level=info msg="connecting to shim a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012" address="unix:///run/containerd/s/e19c22393e24871b7a157fed6a4f0a5cee872dce9ea7f759c4d93339846c962d" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:51.625040 systemd[1]: Started 
cri-containerd-a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012.scope - libcontainer container a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012. Mar 12 23:43:51.689057 containerd[1553]: time="2026-03-12T23:43:51.688987900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-cdwgt,Uid:76a8ed3c-17a5-44b1-bdfc-3127aa1b88d6,Namespace:kube-system,Attempt:0,} returns sandbox id \"a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012\"" Mar 12 23:43:51.693907 containerd[1553]: time="2026-03-12T23:43:51.693776576Z" level=info msg="CreateContainer within sandbox \"a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:43:51.707195 containerd[1553]: time="2026-03-12T23:43:51.707146318Z" level=info msg="Container 5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:51.715836 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1304337826.mount: Deactivated successfully. 
Mar 12 23:43:51.717666 containerd[1553]: time="2026-03-12T23:43:51.717629759Z" level=info msg="CreateContainer within sandbox \"a2d2a444674532f51a14fa954478b5bef7d0ab4fb19116323770a7257e07e012\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65\"" Mar 12 23:43:51.718656 containerd[1553]: time="2026-03-12T23:43:51.718541888Z" level=info msg="StartContainer for \"5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65\"" Mar 12 23:43:51.720847 containerd[1553]: time="2026-03-12T23:43:51.719641770Z" level=info msg="connecting to shim 5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65" address="unix:///run/containerd/s/e19c22393e24871b7a157fed6a4f0a5cee872dce9ea7f759c4d93339846c962d" protocol=ttrpc version=3 Mar 12 23:43:51.746031 systemd[1]: Started cri-containerd-5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65.scope - libcontainer container 5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65. 
Mar 12 23:43:51.794140 containerd[1553]: time="2026-03-12T23:43:51.793866868Z" level=info msg="StartContainer for \"5bb6aa9eeb3e6e21433270318f7e1883eda1b747f5e9d059d1f8c4cab0f50d65\" returns successfully" Mar 12 23:43:52.209225 kubelet[2789]: I0312 23:43:52.209102 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:43:52.309033 systemd-networkd[1432]: calie8101e1b605: Gained IPv6LL Mar 12 23:43:52.326468 kubelet[2789]: I0312 23:43:52.325514 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-7ht8g" podStartSLOduration=43.325502011 podStartE2EDuration="43.325502011s" podCreationTimestamp="2026-03-12 23:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:51.630285511 +0000 UTC m=+48.420088272" watchObservedRunningTime="2026-03-12 23:43:52.325502011 +0000 UTC m=+49.115304772" Mar 12 23:43:52.377943 containerd[1553]: time="2026-03-12T23:43:52.377885083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-tqf8l,Uid:cbf90c03-2c31-4265-97a4-01480ee945ec,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:52.379142 containerd[1553]: time="2026-03-12T23:43:52.379101165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d87956b8c-nbrdc,Uid:9e009c2a-43f1-473c-ac28-09966919e16f,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:52.381635 containerd[1553]: time="2026-03-12T23:43:52.381440373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-fbrhp,Uid:438f4726-557c-45f8-a55a-8da149168ef8,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:52.387443 containerd[1553]: time="2026-03-12T23:43:52.387401190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-vdx54,Uid:94660390-b097-4115-a00d-194fc136d3e9,Namespace:calico-system,Attempt:0,}" Mar 12 
23:43:52.672002 kubelet[2789]: I0312 23:43:52.671946 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-cdwgt" podStartSLOduration=43.671921293 podStartE2EDuration="43.671921293s" podCreationTimestamp="2026-03-12 23:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:52.64468641 +0000 UTC m=+49.434489131" watchObservedRunningTime="2026-03-12 23:43:52.671921293 +0000 UTC m=+49.461724054" Mar 12 23:43:52.711313 systemd-networkd[1432]: calia87f2e7d58a: Link UP Mar 12 23:43:52.713882 systemd-networkd[1432]: calia87f2e7d58a: Gained carrier Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.508 [INFO][4699] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0 goldmane-9f7667bb8- calico-system cbf90c03-2c31-4265-97a4-01480ee945ec 869 0 2026-03-12 23:43:22 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 goldmane-9f7667bb8-tqf8l eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia87f2e7d58a [] [] }} ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.509 [INFO][4699] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" 
Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.593 [INFO][4750] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" HandleID="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Workload="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.619 [INFO][4750] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" HandleID="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Workload="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000307f40), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"goldmane-9f7667bb8-tqf8l", "timestamp":"2026-03-12 23:43:52.593180751 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000349600)} Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.619 [INFO][4750] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.619 [INFO][4750] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.619 [INFO][4750] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.627 [INFO][4750] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.638 [INFO][4750] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.651 [INFO][4750] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.654 [INFO][4750] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.663 [INFO][4750] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.663 [INFO][4750] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.665 [INFO][4750] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.674 [INFO][4750] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.699 [INFO][4750] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.97.133/26] block=192.168.97.128/26 handle="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.699 [INFO][4750] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.133/26] handle="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.699 [INFO][4750] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:52.748538 containerd[1553]: 2026-03-12 23:43:52.699 [INFO][4750] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.133/26] IPv6=[] ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" HandleID="k8s-pod-network.0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Workload="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" Mar 12 23:43:52.751111 containerd[1553]: 2026-03-12 23:43:52.703 [INFO][4699] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"cbf90c03-2c31-4265-97a4-01480ee945ec", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"goldmane-9f7667bb8-tqf8l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.97.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia87f2e7d58a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:52.751111 containerd[1553]: 2026-03-12 23:43:52.704 [INFO][4699] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.133/32] ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" Mar 12 23:43:52.751111 containerd[1553]: 2026-03-12 23:43:52.704 [INFO][4699] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia87f2e7d58a ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" Mar 12 23:43:52.751111 containerd[1553]: 2026-03-12 23:43:52.714 [INFO][4699] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" Mar 12 23:43:52.751111 containerd[1553]: 2026-03-12 23:43:52.717 [INFO][4699] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"cbf90c03-2c31-4265-97a4-01480ee945ec", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb", Pod:"goldmane-9f7667bb8-tqf8l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.97.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia87f2e7d58a", MAC:"d6:a9:bc:ed:24:95", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:52.751111 containerd[1553]: 2026-03-12 23:43:52.746 [INFO][4699] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" Namespace="calico-system" Pod="goldmane-9f7667bb8-tqf8l" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-goldmane--9f7667bb8--tqf8l-eth0" Mar 12 23:43:52.784841 containerd[1553]: time="2026-03-12T23:43:52.784402359Z" level=info msg="connecting to shim 0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb" address="unix:///run/containerd/s/0fbd6b6bc8e909dcf5547b0b436457e0cfb5dcdb35e851cd34cd00161779bea6" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:52.817047 systemd-networkd[1432]: calia5f92b70799: Link UP Mar 12 23:43:52.821682 systemd-networkd[1432]: calia5f92b70799: Gained carrier Mar 12 23:43:52.845275 systemd[1]: Started cri-containerd-0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb.scope - libcontainer container 0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb. Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.511 [INFO][4713] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0 calico-apiserver-59f89dfd9f- calico-system 94660390-b097-4115-a00d-194fc136d3e9 870 0 2026-03-12 23:43:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59f89dfd9f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 calico-apiserver-59f89dfd9f-vdx54 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calia5f92b70799 [] [] }} ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-" Mar 12 23:43:52.853433 containerd[1553]: 
2026-03-12 23:43:52.512 [INFO][4713] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.589 [INFO][4752] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" HandleID="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.626 [INFO][4752] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" HandleID="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004de30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"calico-apiserver-59f89dfd9f-vdx54", "timestamp":"2026-03-12 23:43:52.589534263 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400037a000)} Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.626 [INFO][4752] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.699 [INFO][4752] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.699 [INFO][4752] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.739 [INFO][4752] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.751 [INFO][4752] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.762 [INFO][4752] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.765 [INFO][4752] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.769 [INFO][4752] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.769 [INFO][4752] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.778 [INFO][4752] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90 Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.788 [INFO][4752] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.800 [INFO][4752] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.97.134/26] block=192.168.97.128/26 handle="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.800 [INFO][4752] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.134/26] handle="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.800 [INFO][4752] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:52.853433 containerd[1553]: 2026-03-12 23:43:52.800 [INFO][4752] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.134/26] IPv6=[] ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" HandleID="k8s-pod-network.be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.854964 containerd[1553]: 2026-03-12 23:43:52.807 [INFO][4713] cni-plugin/k8s.go 418: Populated endpoint ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0", GenerateName:"calico-apiserver-59f89dfd9f-", Namespace:"calico-system", SelfLink:"", UID:"94660390-b097-4115-a00d-194fc136d3e9", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f89dfd9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"calico-apiserver-59f89dfd9f-vdx54", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calia5f92b70799", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:52.854964 containerd[1553]: 2026-03-12 23:43:52.808 [INFO][4713] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.134/32] ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.854964 containerd[1553]: 2026-03-12 23:43:52.808 [INFO][4713] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia5f92b70799 ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.854964 containerd[1553]: 2026-03-12 23:43:52.821 [INFO][4713] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" 
Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.854964 containerd[1553]: 2026-03-12 23:43:52.823 [INFO][4713] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0", GenerateName:"calico-apiserver-59f89dfd9f-", Namespace:"calico-system", SelfLink:"", UID:"94660390-b097-4115-a00d-194fc136d3e9", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f89dfd9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90", Pod:"calico-apiserver-59f89dfd9f-vdx54", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calia5f92b70799", 
MAC:"ca:32:af:e4:1d:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:52.854964 containerd[1553]: 2026-03-12 23:43:52.850 [INFO][4713] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-vdx54" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--vdx54-eth0" Mar 12 23:43:52.898804 containerd[1553]: time="2026-03-12T23:43:52.898659410Z" level=info msg="connecting to shim be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90" address="unix:///run/containerd/s/074ac01dc203d9e817afecddc646b7dcef5ca74f3a373cb90ef111f61ca2355b" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:52.918628 systemd-networkd[1432]: cali817afafe23b: Link UP Mar 12 23:43:52.921646 systemd-networkd[1432]: cali817afafe23b: Gained carrier Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.529 [INFO][4722] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0 calico-apiserver-59f89dfd9f- calico-system 438f4726-557c-45f8-a55a-8da149168ef8 866 0 2026-03-12 23:43:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59f89dfd9f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 calico-apiserver-59f89dfd9f-fbrhp eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali817afafe23b [] [] }} ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" 
WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.529 [INFO][4722] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.604 [INFO][4761] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" HandleID="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.635 [INFO][4761] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" HandleID="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000313ea0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"calico-apiserver-59f89dfd9f-fbrhp", "timestamp":"2026-03-12 23:43:52.60430541 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400011c6e0)} Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.635 [INFO][4761] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.800 [INFO][4761] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.800 [INFO][4761] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.840 [INFO][4761] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.857 [INFO][4761] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.868 [INFO][4761] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.871 [INFO][4761] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.875 [INFO][4761] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.875 [INFO][4761] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.879 [INFO][4761] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23 Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.893 [INFO][4761] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" 
host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.907 [INFO][4761] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.97.135/26] block=192.168.97.128/26 handle="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.907 [INFO][4761] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.135/26] handle="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.907 [INFO][4761] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:52.967504 containerd[1553]: 2026-03-12 23:43:52.907 [INFO][4761] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.135/26] IPv6=[] ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" HandleID="k8s-pod-network.eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.971223 containerd[1553]: 2026-03-12 23:43:52.913 [INFO][4722] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0", GenerateName:"calico-apiserver-59f89dfd9f-", Namespace:"calico-system", SelfLink:"", UID:"438f4726-557c-45f8-a55a-8da149168ef8", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 23, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f89dfd9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"calico-apiserver-59f89dfd9f-fbrhp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali817afafe23b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:52.971223 containerd[1553]: 2026-03-12 23:43:52.913 [INFO][4722] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.135/32] ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.971223 containerd[1553]: 2026-03-12 23:43:52.913 [INFO][4722] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali817afafe23b ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.971223 containerd[1553]: 2026-03-12 23:43:52.921 [INFO][4722] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.971223 containerd[1553]: 2026-03-12 23:43:52.923 [INFO][4722] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0", GenerateName:"calico-apiserver-59f89dfd9f-", Namespace:"calico-system", SelfLink:"", UID:"438f4726-557c-45f8-a55a-8da149168ef8", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f89dfd9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23", Pod:"calico-apiserver-59f89dfd9f-fbrhp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali817afafe23b", MAC:"46:e1:23:34:9a:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:52.971223 containerd[1553]: 2026-03-12 23:43:52.957 [INFO][4722] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" Namespace="calico-system" Pod="calico-apiserver-59f89dfd9f-fbrhp" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--apiserver--59f89dfd9f--fbrhp-eth0" Mar 12 23:43:52.973195 systemd[1]: Started cri-containerd-be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90.scope - libcontainer container be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90. Mar 12 23:43:52.996850 containerd[1553]: time="2026-03-12T23:43:52.996602363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-tqf8l,Uid:cbf90c03-2c31-4265-97a4-01480ee945ec,Namespace:calico-system,Attempt:0,} returns sandbox id \"0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb\"" Mar 12 23:43:53.004470 containerd[1553]: time="2026-03-12T23:43:53.004327298Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 12 23:43:53.021692 containerd[1553]: time="2026-03-12T23:43:53.021645105Z" level=info msg="connecting to shim eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23" address="unix:///run/containerd/s/b42f38a6c8dea27ed5719ccdb31427802c539729eaa532bce172c91b57273aaf" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:53.042304 systemd-networkd[1432]: cali4ebccf7e1d4: Link UP Mar 12 23:43:53.043750 systemd-networkd[1432]: cali4ebccf7e1d4: Gained carrier Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.539 [INFO][4709] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0 calico-kube-controllers-d87956b8c- calico-system 9e009c2a-43f1-473c-ac28-09966919e16f 867 0 2026-03-12 23:43:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:d87956b8c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-69ffcbf899 calico-kube-controllers-d87956b8c-nbrdc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4ebccf7e1d4 [] [] }} ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.539 [INFO][4709] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.608 [INFO][4763] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" HandleID="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.635 [INFO][4763] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" 
HandleID="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbe90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-69ffcbf899", "pod":"calico-kube-controllers-d87956b8c-nbrdc", "timestamp":"2026-03-12 23:43:52.608353845 +0000 UTC"}, Hostname:"ci-4459-2-4-n-69ffcbf899", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001866e0)} Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.635 [INFO][4763] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.907 [INFO][4763] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.907 [INFO][4763] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-69ffcbf899' Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.939 [INFO][4763] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.960 [INFO][4763] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.975 [INFO][4763] ipam/ipam.go 526: Trying affinity for 192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.981 [INFO][4763] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.988 [INFO][4763] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.128/26 host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.989 [INFO][4763] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.128/26 handle="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:52.994 [INFO][4763] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620 Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:53.016 [INFO][4763] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.128/26 handle="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:53.031 [INFO][4763] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.97.136/26] block=192.168.97.128/26 handle="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:53.031 [INFO][4763] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.136/26] handle="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" host="ci-4459-2-4-n-69ffcbf899" Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:53.031 [INFO][4763] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:53.071311 containerd[1553]: 2026-03-12 23:43:53.031 [INFO][4763] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.136/26] IPv6=[] ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" HandleID="k8s-pod-network.1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Workload="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.072143 containerd[1553]: 2026-03-12 23:43:53.035 [INFO][4709] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0", GenerateName:"calico-kube-controllers-d87956b8c-", Namespace:"calico-system", SelfLink:"", UID:"9e009c2a-43f1-473c-ac28-09966919e16f", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d87956b8c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"", Pod:"calico-kube-controllers-d87956b8c-nbrdc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.97.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4ebccf7e1d4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:53.072143 containerd[1553]: 2026-03-12 23:43:53.035 [INFO][4709] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.136/32] ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.072143 containerd[1553]: 2026-03-12 23:43:53.035 [INFO][4709] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4ebccf7e1d4 ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.072143 containerd[1553]: 2026-03-12 23:43:53.042 [INFO][4709] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.072143 containerd[1553]: 2026-03-12 23:43:53.043 [INFO][4709] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0", GenerateName:"calico-kube-controllers-d87956b8c-", Namespace:"calico-system", SelfLink:"", UID:"9e009c2a-43f1-473c-ac28-09966919e16f", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d87956b8c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-69ffcbf899", ContainerID:"1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620", Pod:"calico-kube-controllers-d87956b8c-nbrdc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.97.136/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4ebccf7e1d4", MAC:"3e:ae:a7:fe:9f:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:53.072143 containerd[1553]: 2026-03-12 23:43:53.061 [INFO][4709] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" Namespace="calico-system" Pod="calico-kube-controllers-d87956b8c-nbrdc" WorkloadEndpoint="ci--4459--2--4--n--69ffcbf899-k8s-calico--kube--controllers--d87956b8c--nbrdc-eth0" Mar 12 23:43:53.076027 systemd-networkd[1432]: cali4bee907e2a4: Gained IPv6LL Mar 12 23:43:53.097181 systemd[1]: Started cri-containerd-eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23.scope - libcontainer container eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23. Mar 12 23:43:53.132694 containerd[1553]: time="2026-03-12T23:43:53.132641838Z" level=info msg="connecting to shim 1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620" address="unix:///run/containerd/s/5d05c0b833820206f7eab3965a53495af0bea3480726f4f2fc505c0334c9ed60" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:53.171709 containerd[1553]: time="2026-03-12T23:43:53.171660693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-vdx54,Uid:94660390-b097-4115-a00d-194fc136d3e9,Namespace:calico-system,Attempt:0,} returns sandbox id \"be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90\"" Mar 12 23:43:53.174405 systemd[1]: Started cri-containerd-1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620.scope - libcontainer container 1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620. 
Mar 12 23:43:53.225330 containerd[1553]: time="2026-03-12T23:43:53.225271591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f89dfd9f-fbrhp,Uid:438f4726-557c-45f8-a55a-8da149168ef8,Namespace:calico-system,Attempt:0,} returns sandbox id \"eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23\"" Mar 12 23:43:53.249375 containerd[1553]: time="2026-03-12T23:43:53.249329215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d87956b8c-nbrdc,Uid:9e009c2a-43f1-473c-ac28-09966919e16f,Namespace:calico-system,Attempt:0,} returns sandbox id \"1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620\"" Mar 12 23:43:54.100108 systemd-networkd[1432]: cali817afafe23b: Gained IPv6LL Mar 12 23:43:54.229022 systemd-networkd[1432]: calia87f2e7d58a: Gained IPv6LL Mar 12 23:43:54.420219 systemd-networkd[1432]: cali4ebccf7e1d4: Gained IPv6LL Mar 12 23:43:54.612150 systemd-networkd[1432]: calia5f92b70799: Gained IPv6LL Mar 12 23:43:54.896530 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1671922676.mount: Deactivated successfully. 
Mar 12 23:43:55.415843 containerd[1553]: time="2026-03-12T23:43:55.415631247Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:55.417861 containerd[1553]: time="2026-03-12T23:43:55.417776722Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 12 23:43:55.418702 containerd[1553]: time="2026-03-12T23:43:55.418651384Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:55.422154 containerd[1553]: time="2026-03-12T23:43:55.422036754Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:55.423359 containerd[1553]: time="2026-03-12T23:43:55.423330087Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.41895603s" Mar 12 23:43:55.423777 containerd[1553]: time="2026-03-12T23:43:55.423460804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 12 23:43:55.426108 containerd[1553]: time="2026-03-12T23:43:55.426084110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 23:43:55.430893 containerd[1553]: time="2026-03-12T23:43:55.430759893Z" level=info msg="CreateContainer within sandbox 
\"0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 12 23:43:55.452095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1036447649.mount: Deactivated successfully. Mar 12 23:43:55.454083 containerd[1553]: time="2026-03-12T23:43:55.454040130Z" level=info msg="Container 5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:55.466300 containerd[1553]: time="2026-03-12T23:43:55.466238477Z" level=info msg="CreateContainer within sandbox \"0421e03842e092bb0c34344ba54e75e11181dbaf6122e7eaaa12131b2a5e0ecb\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79\"" Mar 12 23:43:55.469829 containerd[1553]: time="2026-03-12T23:43:55.469764004Z" level=info msg="StartContainer for \"5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79\"" Mar 12 23:43:55.471651 containerd[1553]: time="2026-03-12T23:43:55.471579567Z" level=info msg="connecting to shim 5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79" address="unix:///run/containerd/s/0fbd6b6bc8e909dcf5547b0b436457e0cfb5dcdb35e851cd34cd00161779bea6" protocol=ttrpc version=3 Mar 12 23:43:55.505988 systemd[1]: Started cri-containerd-5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79.scope - libcontainer container 5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79. 
Mar 12 23:43:55.572066 containerd[1553]: time="2026-03-12T23:43:55.571972125Z" level=info msg="StartContainer for \"5ed82836dd574674b85a8de5806287bb42f2098b27d9cff163b5a47849779f79\" returns successfully" Mar 12 23:43:57.789183 containerd[1553]: time="2026-03-12T23:43:57.789103238Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:57.791849 containerd[1553]: time="2026-03-12T23:43:57.791383125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 12 23:43:57.791849 containerd[1553]: time="2026-03-12T23:43:57.791924997Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:57.795530 containerd[1553]: time="2026-03-12T23:43:57.795446906Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:57.796634 containerd[1553]: time="2026-03-12T23:43:57.795965938Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 2.369340559s" Mar 12 23:43:57.796634 containerd[1553]: time="2026-03-12T23:43:57.796511690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 12 23:43:57.800568 containerd[1553]: time="2026-03-12T23:43:57.800538111Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 23:43:57.806356 containerd[1553]: time="2026-03-12T23:43:57.806171029Z" level=info msg="CreateContainer within sandbox \"be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 23:43:57.817641 containerd[1553]: time="2026-03-12T23:43:57.817597502Z" level=info msg="Container 4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:57.824375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3540096678.mount: Deactivated successfully. Mar 12 23:43:57.842787 containerd[1553]: time="2026-03-12T23:43:57.842723016Z" level=info msg="CreateContainer within sandbox \"be44ca86d2430589b210e51b963a44015bdcc20bba9d893c15279abb60caaf90\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61\"" Mar 12 23:43:57.844231 containerd[1553]: time="2026-03-12T23:43:57.844184315Z" level=info msg="StartContainer for \"4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61\"" Mar 12 23:43:57.847164 containerd[1553]: time="2026-03-12T23:43:57.847117192Z" level=info msg="connecting to shim 4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61" address="unix:///run/containerd/s/074ac01dc203d9e817afecddc646b7dcef5ca74f3a373cb90ef111f61ca2355b" protocol=ttrpc version=3 Mar 12 23:43:57.880945 systemd[1]: Started cri-containerd-4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61.scope - libcontainer container 4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61. 
Mar 12 23:43:57.949643 containerd[1553]: time="2026-03-12T23:43:57.948342595Z" level=info msg="StartContainer for \"4bc0253160c9e315b86e0164656b6811395df8099b8bc3c25bf76e4eb6abff61\" returns successfully" Mar 12 23:43:58.237048 containerd[1553]: time="2026-03-12T23:43:58.236959152Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:58.238002 containerd[1553]: time="2026-03-12T23:43:58.237797383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 12 23:43:58.241517 containerd[1553]: time="2026-03-12T23:43:58.241409181Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 440.691592ms" Mar 12 23:43:58.241748 containerd[1553]: time="2026-03-12T23:43:58.241649098Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 12 23:43:58.244997 containerd[1553]: time="2026-03-12T23:43:58.244874860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 12 23:43:58.250498 containerd[1553]: time="2026-03-12T23:43:58.250457915Z" level=info msg="CreateContainer within sandbox \"eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 23:43:58.260219 containerd[1553]: time="2026-03-12T23:43:58.260162562Z" level=info msg="Container 858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:58.273557 
containerd[1553]: time="2026-03-12T23:43:58.272577337Z" level=info msg="CreateContainer within sandbox \"eedf7d933fd60a7b4ffc357a25ed7b2083f3540176129564e16335f70242aa23\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d\"" Mar 12 23:43:58.274666 containerd[1553]: time="2026-03-12T23:43:58.274603233Z" level=info msg="StartContainer for \"858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d\"" Mar 12 23:43:58.276674 containerd[1553]: time="2026-03-12T23:43:58.276028337Z" level=info msg="connecting to shim 858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d" address="unix:///run/containerd/s/b42f38a6c8dea27ed5719ccdb31427802c539729eaa532bce172c91b57273aaf" protocol=ttrpc version=3 Mar 12 23:43:58.308032 systemd[1]: Started cri-containerd-858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d.scope - libcontainer container 858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d. 
Mar 12 23:43:58.374130 containerd[1553]: time="2026-03-12T23:43:58.374091153Z" level=info msg="StartContainer for \"858936e5dd129095e550c17de963bf94a0abddb7a3bae2548d6ca7d5c5dbe24d\" returns successfully" Mar 12 23:43:58.682363 kubelet[2789]: I0312 23:43:58.682051 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-59f89dfd9f-fbrhp" podStartSLOduration=30.669548048 podStartE2EDuration="35.681813324s" podCreationTimestamp="2026-03-12 23:43:23 +0000 UTC" firstStartedPulling="2026-03-12 23:43:53.230463049 +0000 UTC m=+50.020265810" lastFinishedPulling="2026-03-12 23:43:58.242728325 +0000 UTC m=+55.032531086" observedRunningTime="2026-03-12 23:43:58.678540963 +0000 UTC m=+55.468343764" watchObservedRunningTime="2026-03-12 23:43:58.681813324 +0000 UTC m=+55.471616085" Mar 12 23:43:58.683936 kubelet[2789]: I0312 23:43:58.682393 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-tqf8l" podStartSLOduration=34.259689702 podStartE2EDuration="36.682385838s" podCreationTimestamp="2026-03-12 23:43:22 +0000 UTC" firstStartedPulling="2026-03-12 23:43:53.001794207 +0000 UTC m=+49.791596928" lastFinishedPulling="2026-03-12 23:43:55.424490303 +0000 UTC m=+52.214293064" observedRunningTime="2026-03-12 23:43:55.656436214 +0000 UTC m=+52.446239015" watchObservedRunningTime="2026-03-12 23:43:58.682385838 +0000 UTC m=+55.472188599" Mar 12 23:43:59.653811 kubelet[2789]: I0312 23:43:59.653124 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:43:59.654208 kubelet[2789]: I0312 23:43:59.653124 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:44:01.430561 containerd[1553]: time="2026-03-12T23:44:01.430491386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:44:01.432220 containerd[1553]: 
time="2026-03-12T23:44:01.432149901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 12 23:44:01.433418 containerd[1553]: time="2026-03-12T23:44:01.433087138Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:44:01.436686 containerd[1553]: time="2026-03-12T23:44:01.436646765Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:44:01.438003 containerd[1553]: time="2026-03-12T23:44:01.437971441Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 3.193058261s" Mar 12 23:44:01.438157 containerd[1553]: time="2026-03-12T23:44:01.438139280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 12 23:44:01.471964 containerd[1553]: time="2026-03-12T23:44:01.471893965Z" level=info msg="CreateContainer within sandbox \"1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 12 23:44:01.480083 containerd[1553]: time="2026-03-12T23:44:01.479949097Z" level=info msg="Container 99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:44:01.493620 containerd[1553]: 
time="2026-03-12T23:44:01.493551211Z" level=info msg="CreateContainer within sandbox \"1e9476fee98accbd1693809af878f7ccabe215a706b5c21ac9c110edf4290620\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113\"" Mar 12 23:44:01.494672 containerd[1553]: time="2026-03-12T23:44:01.494598167Z" level=info msg="StartContainer for \"99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113\"" Mar 12 23:44:01.496967 containerd[1553]: time="2026-03-12T23:44:01.496897440Z" level=info msg="connecting to shim 99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113" address="unix:///run/containerd/s/5d05c0b833820206f7eab3965a53495af0bea3480726f4f2fc505c0334c9ed60" protocol=ttrpc version=3 Mar 12 23:44:01.526029 systemd[1]: Started cri-containerd-99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113.scope - libcontainer container 99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113. 
Mar 12 23:44:01.591088 containerd[1553]: time="2026-03-12T23:44:01.590960758Z" level=info msg="StartContainer for \"99b5bcbdde619020a15271fe3443e17b4c35cd94adf3c79c1202a43e23a95113\" returns successfully" Mar 12 23:44:01.701956 kubelet[2789]: I0312 23:44:01.701773 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-59f89dfd9f-vdx54" podStartSLOduration=34.076096835 podStartE2EDuration="38.70175974s" podCreationTimestamp="2026-03-12 23:43:23 +0000 UTC" firstStartedPulling="2026-03-12 23:43:53.174773408 +0000 UTC m=+49.964576169" lastFinishedPulling="2026-03-12 23:43:57.800436313 +0000 UTC m=+54.590239074" observedRunningTime="2026-03-12 23:43:58.696716671 +0000 UTC m=+55.486519432" watchObservedRunningTime="2026-03-12 23:44:01.70175974 +0000 UTC m=+58.491562501" Mar 12 23:44:02.733582 kubelet[2789]: I0312 23:44:02.732745 2789 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-d87956b8c-nbrdc" podStartSLOduration=30.546392735 podStartE2EDuration="38.732728142s" podCreationTimestamp="2026-03-12 23:43:24 +0000 UTC" firstStartedPulling="2026-03-12 23:43:53.253213989 +0000 UTC m=+50.043016750" lastFinishedPulling="2026-03-12 23:44:01.439549356 +0000 UTC m=+58.229352157" observedRunningTime="2026-03-12 23:44:01.703692333 +0000 UTC m=+58.493495094" watchObservedRunningTime="2026-03-12 23:44:02.732728142 +0000 UTC m=+59.522530903" Mar 12 23:44:23.347188 kubelet[2789]: I0312 23:44:23.347113 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:44:36.786169 kubelet[2789]: I0312 23:44:36.785785 2789 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:44:51.972885 systemd[1]: Started sshd@7-49.13.116.83:22-20.161.92.111:40436.service - OpenSSH per-connection server daemon (20.161.92.111:40436). 
Mar 12 23:44:52.527558 sshd[5469]: Accepted publickey for core from 20.161.92.111 port 40436 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:44:52.529280 sshd-session[5469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:44:52.536565 systemd-logind[1535]: New session 8 of user core. Mar 12 23:44:52.541992 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 12 23:44:52.926515 sshd[5501]: Connection closed by 20.161.92.111 port 40436 Mar 12 23:44:52.926372 sshd-session[5469]: pam_unix(sshd:session): session closed for user core Mar 12 23:44:52.933339 systemd[1]: sshd@7-49.13.116.83:22-20.161.92.111:40436.service: Deactivated successfully. Mar 12 23:44:52.937522 systemd[1]: session-8.scope: Deactivated successfully. Mar 12 23:44:52.941108 systemd-logind[1535]: Session 8 logged out. Waiting for processes to exit. Mar 12 23:44:52.943804 systemd-logind[1535]: Removed session 8. Mar 12 23:44:58.031069 systemd[1]: Started sshd@8-49.13.116.83:22-20.161.92.111:40446.service - OpenSSH per-connection server daemon (20.161.92.111:40446). Mar 12 23:44:58.550596 sshd[5537]: Accepted publickey for core from 20.161.92.111 port 40446 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:44:58.552780 sshd-session[5537]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:44:58.558617 systemd-logind[1535]: New session 9 of user core. Mar 12 23:44:58.569191 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 12 23:44:58.915496 sshd[5541]: Connection closed by 20.161.92.111 port 40446 Mar 12 23:44:58.917266 sshd-session[5537]: pam_unix(sshd:session): session closed for user core Mar 12 23:44:58.923507 systemd-logind[1535]: Session 9 logged out. Waiting for processes to exit. Mar 12 23:44:58.924487 systemd[1]: sshd@8-49.13.116.83:22-20.161.92.111:40446.service: Deactivated successfully. 
Mar 12 23:44:58.929044 systemd[1]: session-9.scope: Deactivated successfully. Mar 12 23:44:58.931876 systemd-logind[1535]: Removed session 9. Mar 12 23:45:04.032662 systemd[1]: Started sshd@9-49.13.116.83:22-20.161.92.111:49986.service - OpenSSH per-connection server daemon (20.161.92.111:49986). Mar 12 23:45:04.563499 sshd[5609]: Accepted publickey for core from 20.161.92.111 port 49986 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:04.565316 sshd-session[5609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:04.570646 systemd-logind[1535]: New session 10 of user core. Mar 12 23:45:04.580518 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 12 23:45:04.934061 sshd[5612]: Connection closed by 20.161.92.111 port 49986 Mar 12 23:45:04.935884 sshd-session[5609]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:04.942286 systemd[1]: sshd@9-49.13.116.83:22-20.161.92.111:49986.service: Deactivated successfully. Mar 12 23:45:04.945127 systemd[1]: session-10.scope: Deactivated successfully. Mar 12 23:45:04.947061 systemd-logind[1535]: Session 10 logged out. Waiting for processes to exit. Mar 12 23:45:04.949495 systemd-logind[1535]: Removed session 10. Mar 12 23:45:10.048324 systemd[1]: Started sshd@10-49.13.116.83:22-20.161.92.111:49990.service - OpenSSH per-connection server daemon (20.161.92.111:49990). Mar 12 23:45:10.577510 sshd[5641]: Accepted publickey for core from 20.161.92.111 port 49990 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:10.579813 sshd-session[5641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:10.587736 systemd-logind[1535]: New session 11 of user core. Mar 12 23:45:10.596496 systemd[1]: Started session-11.scope - Session 11 of User core. 
Mar 12 23:45:10.946443 sshd[5646]: Connection closed by 20.161.92.111 port 49990 Mar 12 23:45:10.947315 sshd-session[5641]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:10.954760 systemd[1]: sshd@10-49.13.116.83:22-20.161.92.111:49990.service: Deactivated successfully. Mar 12 23:45:10.958028 systemd[1]: session-11.scope: Deactivated successfully. Mar 12 23:45:10.959704 systemd-logind[1535]: Session 11 logged out. Waiting for processes to exit. Mar 12 23:45:10.961881 systemd-logind[1535]: Removed session 11. Mar 12 23:45:11.060757 systemd[1]: Started sshd@11-49.13.116.83:22-20.161.92.111:51034.service - OpenSSH per-connection server daemon (20.161.92.111:51034). Mar 12 23:45:11.605147 sshd[5659]: Accepted publickey for core from 20.161.92.111 port 51034 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:11.607950 sshd-session[5659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:11.613895 systemd-logind[1535]: New session 12 of user core. Mar 12 23:45:11.622109 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 12 23:45:12.026158 sshd[5662]: Connection closed by 20.161.92.111 port 51034 Mar 12 23:45:12.028004 sshd-session[5659]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:12.035176 systemd[1]: sshd@11-49.13.116.83:22-20.161.92.111:51034.service: Deactivated successfully. Mar 12 23:45:12.038881 systemd[1]: session-12.scope: Deactivated successfully. Mar 12 23:45:12.040923 systemd-logind[1535]: Session 12 logged out. Waiting for processes to exit. Mar 12 23:45:12.042744 systemd-logind[1535]: Removed session 12. Mar 12 23:45:12.143192 systemd[1]: Started sshd@12-49.13.116.83:22-20.161.92.111:51040.service - OpenSSH per-connection server daemon (20.161.92.111:51040). 
Mar 12 23:45:12.693920 sshd[5672]: Accepted publickey for core from 20.161.92.111 port 51040 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:12.696145 sshd-session[5672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:12.701700 systemd-logind[1535]: New session 13 of user core. Mar 12 23:45:12.708116 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 12 23:45:13.071342 sshd[5675]: Connection closed by 20.161.92.111 port 51040 Mar 12 23:45:13.071176 sshd-session[5672]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:13.078022 systemd-logind[1535]: Session 13 logged out. Waiting for processes to exit. Mar 12 23:45:13.078287 systemd[1]: sshd@12-49.13.116.83:22-20.161.92.111:51040.service: Deactivated successfully. Mar 12 23:45:13.083538 systemd[1]: session-13.scope: Deactivated successfully. Mar 12 23:45:13.089454 systemd-logind[1535]: Removed session 13. Mar 12 23:45:18.175127 systemd[1]: Started sshd@13-49.13.116.83:22-20.161.92.111:51042.service - OpenSSH per-connection server daemon (20.161.92.111:51042). Mar 12 23:45:18.695877 sshd[5726]: Accepted publickey for core from 20.161.92.111 port 51042 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:18.698299 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:18.705009 systemd-logind[1535]: New session 14 of user core. Mar 12 23:45:18.711074 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 12 23:45:19.061910 sshd[5729]: Connection closed by 20.161.92.111 port 51042 Mar 12 23:45:19.062956 sshd-session[5726]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:19.069456 systemd-logind[1535]: Session 14 logged out. Waiting for processes to exit. Mar 12 23:45:19.070099 systemd[1]: sshd@13-49.13.116.83:22-20.161.92.111:51042.service: Deactivated successfully. 
Mar 12 23:45:19.072462 systemd[1]: session-14.scope: Deactivated successfully. Mar 12 23:45:19.075590 systemd-logind[1535]: Removed session 14. Mar 12 23:45:19.170679 systemd[1]: Started sshd@14-49.13.116.83:22-20.161.92.111:51050.service - OpenSSH per-connection server daemon (20.161.92.111:51050). Mar 12 23:45:19.700734 sshd[5741]: Accepted publickey for core from 20.161.92.111 port 51050 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:19.702865 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:19.710806 systemd-logind[1535]: New session 15 of user core. Mar 12 23:45:19.715025 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 12 23:45:20.206852 sshd[5744]: Connection closed by 20.161.92.111 port 51050 Mar 12 23:45:20.207553 sshd-session[5741]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:20.213685 systemd[1]: sshd@14-49.13.116.83:22-20.161.92.111:51050.service: Deactivated successfully. Mar 12 23:45:20.219058 systemd[1]: session-15.scope: Deactivated successfully. Mar 12 23:45:20.222989 systemd-logind[1535]: Session 15 logged out. Waiting for processes to exit. Mar 12 23:45:20.225002 systemd-logind[1535]: Removed session 15. Mar 12 23:45:20.319222 systemd[1]: Started sshd@15-49.13.116.83:22-20.161.92.111:51088.service - OpenSSH per-connection server daemon (20.161.92.111:51088). Mar 12 23:45:20.849878 sshd[5777]: Accepted publickey for core from 20.161.92.111 port 51088 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:20.852549 sshd-session[5777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:20.857872 systemd-logind[1535]: New session 16 of user core. Mar 12 23:45:20.862014 systemd[1]: Started session-16.scope - Session 16 of User core. 
Mar 12 23:45:21.974222 sshd[5780]: Connection closed by 20.161.92.111 port 51088 Mar 12 23:45:21.974124 sshd-session[5777]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:21.980407 systemd-logind[1535]: Session 16 logged out. Waiting for processes to exit. Mar 12 23:45:21.983318 systemd[1]: sshd@15-49.13.116.83:22-20.161.92.111:51088.service: Deactivated successfully. Mar 12 23:45:21.987492 systemd[1]: session-16.scope: Deactivated successfully. Mar 12 23:45:21.991443 systemd-logind[1535]: Removed session 16. Mar 12 23:45:22.081797 systemd[1]: Started sshd@16-49.13.116.83:22-20.161.92.111:51102.service - OpenSSH per-connection server daemon (20.161.92.111:51102). Mar 12 23:45:22.613141 sshd[5803]: Accepted publickey for core from 20.161.92.111 port 51102 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:22.616051 sshd-session[5803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:22.628399 systemd-logind[1535]: New session 17 of user core. Mar 12 23:45:22.635048 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 12 23:45:23.114148 sshd[5830]: Connection closed by 20.161.92.111 port 51102 Mar 12 23:45:23.114610 sshd-session[5803]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:23.121860 systemd[1]: sshd@16-49.13.116.83:22-20.161.92.111:51102.service: Deactivated successfully. Mar 12 23:45:23.124608 systemd[1]: session-17.scope: Deactivated successfully. Mar 12 23:45:23.125726 systemd-logind[1535]: Session 17 logged out. Waiting for processes to exit. Mar 12 23:45:23.128958 systemd-logind[1535]: Removed session 17. Mar 12 23:45:23.228077 systemd[1]: Started sshd@17-49.13.116.83:22-20.161.92.111:51114.service - OpenSSH per-connection server daemon (20.161.92.111:51114). 
Mar 12 23:45:23.752443 sshd[5842]: Accepted publickey for core from 20.161.92.111 port 51114 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:23.754612 sshd-session[5842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:23.759737 systemd-logind[1535]: New session 18 of user core. Mar 12 23:45:23.765003 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 12 23:45:24.117884 sshd[5845]: Connection closed by 20.161.92.111 port 51114 Mar 12 23:45:24.116955 sshd-session[5842]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:24.124237 systemd-logind[1535]: Session 18 logged out. Waiting for processes to exit. Mar 12 23:45:24.125078 systemd[1]: sshd@17-49.13.116.83:22-20.161.92.111:51114.service: Deactivated successfully. Mar 12 23:45:24.129363 systemd[1]: session-18.scope: Deactivated successfully. Mar 12 23:45:24.134390 systemd-logind[1535]: Removed session 18. Mar 12 23:45:29.232291 systemd[1]: Started sshd@18-49.13.116.83:22-20.161.92.111:51120.service - OpenSSH per-connection server daemon (20.161.92.111:51120). Mar 12 23:45:29.770966 sshd[5882]: Accepted publickey for core from 20.161.92.111 port 51120 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:29.773659 sshd-session[5882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:29.780907 systemd-logind[1535]: New session 19 of user core. Mar 12 23:45:29.788181 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 12 23:45:30.147232 sshd[5885]: Connection closed by 20.161.92.111 port 51120 Mar 12 23:45:30.147799 sshd-session[5882]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:30.153127 systemd-logind[1535]: Session 19 logged out. Waiting for processes to exit. Mar 12 23:45:30.153494 systemd[1]: sshd@18-49.13.116.83:22-20.161.92.111:51120.service: Deactivated successfully. 
Mar 12 23:45:30.156172 systemd[1]: session-19.scope: Deactivated successfully. Mar 12 23:45:30.160581 systemd-logind[1535]: Removed session 19. Mar 12 23:45:35.257450 systemd[1]: Started sshd@19-49.13.116.83:22-20.161.92.111:39832.service - OpenSSH per-connection server daemon (20.161.92.111:39832). Mar 12 23:45:35.802888 sshd[5918]: Accepted publickey for core from 20.161.92.111 port 39832 ssh2: RSA SHA256:efFLS9MdSfnBpQoXIlctriWXrDgGS/o5pOWMaZl9Yd4 Mar 12 23:45:35.804244 sshd-session[5918]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:45:35.809454 systemd-logind[1535]: New session 20 of user core. Mar 12 23:45:35.818138 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 12 23:45:36.173955 sshd[5921]: Connection closed by 20.161.92.111 port 39832 Mar 12 23:45:36.174150 sshd-session[5918]: pam_unix(sshd:session): session closed for user core Mar 12 23:45:36.180403 systemd-logind[1535]: Session 20 logged out. Waiting for processes to exit. Mar 12 23:45:36.180973 systemd[1]: sshd@19-49.13.116.83:22-20.161.92.111:39832.service: Deactivated successfully. Mar 12 23:45:36.184333 systemd[1]: session-20.scope: Deactivated successfully. Mar 12 23:45:36.187646 systemd-logind[1535]: Removed session 20. Mar 12 23:45:50.985635 systemd[1]: cri-containerd-3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4.scope: Deactivated successfully. Mar 12 23:45:50.986219 systemd[1]: cri-containerd-3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4.scope: Consumed 19.398s CPU time, 127.5M memory peak, 2.9M read from disk. 
Mar 12 23:45:50.987776 containerd[1553]: time="2026-03-12T23:45:50.987608035Z" level=info msg="received container exit event container_id:\"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\" id:\"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\" pid:3116 exit_status:1 exited_at:{seconds:1773359150 nanos:986767702}"
Mar 12 23:45:51.015378 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4-rootfs.mount: Deactivated successfully.
Mar 12 23:45:51.073642 kubelet[2789]: I0312 23:45:51.073579 2789 scope.go:122] "RemoveContainer" containerID="3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4"
Mar 12 23:45:51.078037 containerd[1553]: time="2026-03-12T23:45:51.077974605Z" level=info msg="CreateContainer within sandbox \"dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 12 23:45:51.088375 containerd[1553]: time="2026-03-12T23:45:51.087602135Z" level=info msg="Container d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:45:51.093676 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2950839633.mount: Deactivated successfully.
Mar 12 23:45:51.100401 containerd[1553]: time="2026-03-12T23:45:51.100353422Z" level=info msg="CreateContainer within sandbox \"dc8a8e89f3cfa6dae4dc741b1f3eaab1c1f38145da63caadc88eb7f3682a512f\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e\""
Mar 12 23:45:51.101300 containerd[1553]: time="2026-03-12T23:45:51.101156199Z" level=info msg="StartContainer for \"d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e\""
Mar 12 23:45:51.102179 containerd[1553]: time="2026-03-12T23:45:51.102147202Z" level=info msg="connecting to shim d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e" address="unix:///run/containerd/s/cbfc2524916db6894ae7824a647d00be56d4e1e437112d6d1a0fd6019c651e35" protocol=ttrpc version=3
Mar 12 23:45:51.132171 systemd[1]: Started cri-containerd-d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e.scope - libcontainer container d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e.
Mar 12 23:45:51.167793 containerd[1553]: time="2026-03-12T23:45:51.167746411Z" level=info msg="StartContainer for \"d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e\" returns successfully"
Mar 12 23:45:51.423947 kubelet[2789]: E0312 23:45:51.422416 2789 controller.go:251] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48342->10.0.0.2:2379: read: connection timed out"
Mar 12 23:45:52.057157 systemd[1]: cri-containerd-ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9.scope: Deactivated successfully.
Mar 12 23:45:52.057859 systemd[1]: cri-containerd-ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9.scope: Consumed 3.322s CPU time, 67.7M memory peak, 2.3M read from disk.
Mar 12 23:45:52.060836 containerd[1553]: time="2026-03-12T23:45:52.060539299Z" level=info msg="received container exit event container_id:\"ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9\" id:\"ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9\" pid:2614 exit_status:1 exited_at:{seconds:1773359152 nanos:59133246}"
Mar 12 23:45:52.095286 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9-rootfs.mount: Deactivated successfully.
Mar 12 23:45:53.088260 kubelet[2789]: I0312 23:45:53.088222 2789 scope.go:122] "RemoveContainer" containerID="ae769e76a32136550f59f8668702ed4b9cd34f10ba1fdfc9d3b57644050739b9"
Mar 12 23:45:53.091478 containerd[1553]: time="2026-03-12T23:45:53.091430422Z" level=info msg="CreateContainer within sandbox \"ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 12 23:45:53.106853 containerd[1553]: time="2026-03-12T23:45:53.105751607Z" level=info msg="Container 43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:45:53.112159 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1230590270.mount: Deactivated successfully.
Mar 12 23:45:53.116354 containerd[1553]: time="2026-03-12T23:45:53.116288791Z" level=info msg="CreateContainer within sandbox \"ecb3ff78cf5f786007cb6cbe96eae3cadc3d74e152312ce129c35cde99f5d181\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a\""
Mar 12 23:45:53.117050 containerd[1553]: time="2026-03-12T23:45:53.116989740Z" level=info msg="StartContainer for \"43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a\""
Mar 12 23:45:53.118368 containerd[1553]: time="2026-03-12T23:45:53.118326401Z" level=info msg="connecting to shim 43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a" address="unix:///run/containerd/s/1c8fc9f385024105c1d2cf6b1e62dacf799660b09a2f0e0045180c9062be58eb" protocol=ttrpc version=3
Mar 12 23:45:53.141035 systemd[1]: Started cri-containerd-43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a.scope - libcontainer container 43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a.
Mar 12 23:45:53.188791 containerd[1553]: time="2026-03-12T23:45:53.188716577Z" level=info msg="StartContainer for \"43d75b3ace53173c1b055e2da0592960d92025d226b0a7536667407da8afed2a\" returns successfully"
Mar 12 23:45:54.999281 kubelet[2789]: E0312 23:45:54.998042 2789 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47948->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-69ffcbf899.189c3cbf0d5542af kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-69ffcbf899,UID:3d3aff213b3453c9d4eaa5e96e8ec1be,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-69ffcbf899,},FirstTimestamp:2026-03-12 23:45:44.537244335 +0000 UTC m=+161.327047136,LastTimestamp:2026-03-12 23:45:44.537244335 +0000 UTC m=+161.327047136,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-69ffcbf899,}"
Mar 12 23:45:55.323257 systemd[1]: cri-containerd-d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e.scope: Deactivated successfully.
Mar 12 23:45:55.324291 systemd[1]: cri-containerd-d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e.scope: Consumed 269ms CPU time, 35.4M memory peak, 1.2M read from disk.
Mar 12 23:45:55.324964 containerd[1553]: time="2026-03-12T23:45:55.324134181Z" level=info msg="received container exit event container_id:\"d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e\" id:\"d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e\" pid:5960 exit_status:1 exited_at:{seconds:1773359155 nanos:323365074}"
Mar 12 23:45:55.348364 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e-rootfs.mount: Deactivated successfully.
Mar 12 23:45:56.104068 kubelet[2789]: I0312 23:45:56.103933 2789 scope.go:122] "RemoveContainer" containerID="3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4"
Mar 12 23:45:56.105082 kubelet[2789]: I0312 23:45:56.104176 2789 scope.go:122] "RemoveContainer" containerID="d0218f7227856b0948b0db3e49c7d4a18039d5e2d5b727d890d9383bca8c653e"
Mar 12 23:45:56.105082 kubelet[2789]: E0312 23:45:56.104356 2789 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6cf4cccc57-6444c_tigera-operator(1f3a1d64-5234-4924-9b1c-88f956a664b4)\"" pod="tigera-operator/tigera-operator-6cf4cccc57-6444c" podUID="1f3a1d64-5234-4924-9b1c-88f956a664b4"
Mar 12 23:45:56.107090 containerd[1553]: time="2026-03-12T23:45:56.106855971Z" level=info msg="RemoveContainer for \"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\""
Mar 12 23:45:56.113657 containerd[1553]: time="2026-03-12T23:45:56.113516201Z" level=info msg="RemoveContainer for \"3470369e6df303b09cb093b913610b426fe16a8e969fea7114da4fede6e030f4\" returns successfully"
Mar 12 23:45:56.227723 systemd[1]: cri-containerd-33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088.scope: Deactivated successfully.
Mar 12 23:45:56.229016 systemd[1]: cri-containerd-33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088.scope: Consumed 2.244s CPU time, 25.6M memory peak, 2.3M read from disk.
Mar 12 23:45:56.234334 containerd[1553]: time="2026-03-12T23:45:56.234279447Z" level=info msg="received container exit event container_id:\"33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088\" id:\"33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088\" pid:2649 exit_status:1 exited_at:{seconds:1773359156 nanos:232700114}"
Mar 12 23:45:56.260470 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-33b86cca6ea1efa8be3ba6e8327c8e1d1a8fadb1be282917f01ac44d4a88a088-rootfs.mount: Deactivated successfully.