Mar 12 23:47:58.768923 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 12 23:47:58.768943 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Mar 12 22:07:21 -00 2026
Mar 12 23:47:58.768953 kernel: KASLR enabled
Mar 12 23:47:58.768958 kernel: efi: EFI v2.7 by EDK II
Mar 12 23:47:58.768964 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Mar 12 23:47:58.768969 kernel: random: crng init done
Mar 12 23:47:58.768976 kernel: secureboot: Secure boot disabled
Mar 12 23:47:58.768982 kernel: ACPI: Early table checksum verification disabled
Mar 12 23:47:58.768987 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Mar 12 23:47:58.768993 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Mar 12 23:47:58.769000 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769006 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769011 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769017 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769024 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769030 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769038 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769044 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769050 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769056 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:47:58.769062 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 12 23:47:58.769068 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 12 23:47:58.769074 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 12 23:47:58.769080 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Mar 12 23:47:58.769086 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Mar 12 23:47:58.769092 kernel: Zone ranges:
Mar 12 23:47:58.769099 kernel:   DMA      [mem 0x0000000040000000-0x00000000ffffffff]
Mar 12 23:47:58.769105 kernel:   DMA32    empty
Mar 12 23:47:58.769111 kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Mar 12 23:47:58.769117 kernel:   Device   empty
Mar 12 23:47:58.769123 kernel: Movable zone start for each node
Mar 12 23:47:58.769129 kernel: Early memory node ranges
Mar 12 23:47:58.769135 kernel:   node   0: [mem 0x0000000040000000-0x000000043843ffff]
Mar 12 23:47:58.769141 kernel:   node   0: [mem 0x0000000438440000-0x000000043872ffff]
Mar 12 23:47:58.769147 kernel:   node   0: [mem 0x0000000438730000-0x000000043bbfffff]
Mar 12 23:47:58.769153 kernel:   node   0: [mem 0x000000043bc00000-0x000000043bfdffff]
Mar 12 23:47:58.769159 kernel:   node   0: [mem 0x000000043bfe0000-0x000000043fffffff]
Mar 12 23:47:58.769165 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Mar 12 23:47:58.769172 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Mar 12 23:47:58.769178 kernel: psci: probing for conduit method from ACPI.
Mar 12 23:47:58.769187 kernel: psci: PSCIv1.3 detected in firmware.
Mar 12 23:47:58.769193 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 12 23:47:58.769200 kernel: psci: Trusted OS migration not required
Mar 12 23:47:58.769207 kernel: psci: SMC Calling Convention v1.1
Mar 12 23:47:58.769214 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 12 23:47:58.769221 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 12 23:47:58.769227 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 12 23:47:58.769233 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Mar 12 23:47:58.769240 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Mar 12 23:47:58.769246 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 12 23:47:58.769252 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 12 23:47:58.769259 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Mar 12 23:47:58.769265 kernel: Detected PIPT I-cache on CPU0
Mar 12 23:47:58.769271 kernel: CPU features: detected: GIC system register CPU interface
Mar 12 23:47:58.769278 kernel: CPU features: detected: Spectre-v4
Mar 12 23:47:58.769285 kernel: CPU features: detected: Spectre-BHB
Mar 12 23:47:58.769304 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 12 23:47:58.769312 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 12 23:47:58.769318 kernel: CPU features: detected: ARM erratum 1418040
Mar 12 23:47:58.769325 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 12 23:47:58.769331 kernel: alternatives: applying boot alternatives
Mar 12 23:47:58.769339 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d
Mar 12 23:47:58.769346 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Mar 12 23:47:58.769352 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Mar 12 23:47:58.769358 kernel: Fallback order for Node 0: 0
Mar 12 23:47:58.769367 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Mar 12 23:47:58.769373 kernel: Policy zone: Normal
Mar 12 23:47:58.769379 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 12 23:47:58.769386 kernel: software IO TLB: area num 4.
Mar 12 23:47:58.769392 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Mar 12 23:47:58.769399 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 12 23:47:58.769405 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 12 23:47:58.769412 kernel: rcu: 	RCU event tracing is enabled.
Mar 12 23:47:58.769419 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 12 23:47:58.769425 kernel: 	Trampoline variant of Tasks RCU enabled.
Mar 12 23:47:58.769432 kernel: 	Tracing variant of Tasks RCU enabled.
Mar 12 23:47:58.769438 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 12 23:47:58.769446 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 12 23:47:58.769452 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 12 23:47:58.769459 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 12 23:47:58.769465 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 12 23:47:58.769472 kernel: GICv3: 256 SPIs implemented
Mar 12 23:47:58.769478 kernel: GICv3: 0 Extended SPIs implemented
Mar 12 23:47:58.769485 kernel: Root IRQ handler: gic_handle_irq
Mar 12 23:47:58.769491 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 12 23:47:58.769497 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Mar 12 23:47:58.769504 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 12 23:47:58.769510 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 12 23:47:58.769517 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Mar 12 23:47:58.769525 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Mar 12 23:47:58.769531 kernel: GICv3: using LPI property table @0x0000000100130000
Mar 12 23:47:58.769538 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Mar 12 23:47:58.769544 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 12 23:47:58.769551 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:47:58.769557 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 12 23:47:58.769563 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 12 23:47:58.769570 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 12 23:47:58.769576 kernel: arm-pv: using stolen time PV
Mar 12 23:47:58.769583 kernel: Console: colour dummy device 80x25
Mar 12 23:47:58.769591 kernel: ACPI: Core revision 20240827
Mar 12 23:47:58.769598 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 12 23:47:58.769604 kernel: pid_max: default: 32768 minimum: 301
Mar 12 23:47:58.769611 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 12 23:47:58.769617 kernel: landlock: Up and running.
Mar 12 23:47:58.769624 kernel: SELinux:  Initializing.
Mar 12 23:47:58.769630 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 23:47:58.769637 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 23:47:58.769644 kernel: rcu: Hierarchical SRCU implementation.
Mar 12 23:47:58.769650 kernel: rcu: 	Max phase no-delay instances is 400.
Mar 12 23:47:58.769658 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 12 23:47:58.769665 kernel: Remapping and enabling EFI services.
Mar 12 23:47:58.769672 kernel: smp: Bringing up secondary CPUs ...
Mar 12 23:47:58.769678 kernel: Detected PIPT I-cache on CPU1
Mar 12 23:47:58.769685 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 12 23:47:58.769692 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Mar 12 23:47:58.769698 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:47:58.769705 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 12 23:47:58.769711 kernel: Detected PIPT I-cache on CPU2
Mar 12 23:47:58.769723 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Mar 12 23:47:58.769730 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Mar 12 23:47:58.769737 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:47:58.769745 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Mar 12 23:47:58.769752 kernel: Detected PIPT I-cache on CPU3
Mar 12 23:47:58.769759 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Mar 12 23:47:58.769766 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Mar 12 23:47:58.769773 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:47:58.769781 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Mar 12 23:47:58.769788 kernel: smp: Brought up 1 node, 4 CPUs
Mar 12 23:47:58.769795 kernel: SMP: Total of 4 processors activated.
Mar 12 23:47:58.769802 kernel: CPU: All CPU(s) started at EL1
Mar 12 23:47:58.769809 kernel: CPU features: detected: 32-bit EL0 Support
Mar 12 23:47:58.769816 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 12 23:47:58.769823 kernel: CPU features: detected: Common not Private translations
Mar 12 23:47:58.769830 kernel: CPU features: detected: CRC32 instructions
Mar 12 23:47:58.769837 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 12 23:47:58.769845 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 12 23:47:58.769852 kernel: CPU features: detected: LSE atomic instructions
Mar 12 23:47:58.769859 kernel: CPU features: detected: Privileged Access Never
Mar 12 23:47:58.769866 kernel: CPU features: detected: RAS Extension Support
Mar 12 23:47:58.769873 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 12 23:47:58.769880 kernel: alternatives: applying system-wide alternatives
Mar 12 23:47:58.769886 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Mar 12 23:47:58.769894 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Mar 12 23:47:58.769901 kernel: devtmpfs: initialized
Mar 12 23:47:58.769909 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 12 23:47:58.769916 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 12 23:47:58.769923 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 12 23:47:58.769930 kernel: 0 pages in range for non-PLT usage
Mar 12 23:47:58.769937 kernel: 508400 pages in range for PLT usage
Mar 12 23:47:58.769944 kernel: pinctrl core: initialized pinctrl subsystem
Mar 12 23:47:58.769950 kernel: SMBIOS 3.0.0 present.
Mar 12 23:47:58.769957 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Mar 12 23:47:58.769964 kernel: DMI: Memory slots populated: 1/1
Mar 12 23:47:58.769972 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 12 23:47:58.769979 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Mar 12 23:47:58.769986 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 12 23:47:58.769993 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 12 23:47:58.770000 kernel: audit: initializing netlink subsys (disabled)
Mar 12 23:47:58.770007 kernel: audit: type=2000 audit(0.040:1): state=initialized audit_enabled=0 res=1
Mar 12 23:47:58.770014 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 12 23:47:58.770021 kernel: cpuidle: using governor menu
Mar 12 23:47:58.770028 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 12 23:47:58.770037 kernel: ASID allocator initialised with 32768 entries
Mar 12 23:47:58.770044 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 12 23:47:58.770051 kernel: Serial: AMBA PL011 UART driver
Mar 12 23:47:58.770058 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 12 23:47:58.770065 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 12 23:47:58.770072 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 12 23:47:58.770079 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 12 23:47:58.770086 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 12 23:47:58.770093 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 12 23:47:58.770101 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 12 23:47:58.770108 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 12 23:47:58.770114 kernel: ACPI: Added _OSI(Module Device)
Mar 12 23:47:58.770121 kernel: ACPI: Added _OSI(Processor Device)
Mar 12 23:47:58.770128 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 12 23:47:58.770135 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 12 23:47:58.770142 kernel: ACPI: Interpreter enabled
Mar 12 23:47:58.770149 kernel: ACPI: Using GIC for interrupt routing
Mar 12 23:47:58.770156 kernel: ACPI: MCFG table detected, 1 entries
Mar 12 23:47:58.770164 kernel: ACPI: CPU0 has been hot-added
Mar 12 23:47:58.770171 kernel: ACPI: CPU1 has been hot-added
Mar 12 23:47:58.770178 kernel: ACPI: CPU2 has been hot-added
Mar 12 23:47:58.770184 kernel: ACPI: CPU3 has been hot-added
Mar 12 23:47:58.770191 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 12 23:47:58.770198 kernel: printk: legacy console [ttyAMA0] enabled
Mar 12 23:47:58.770205 kernel: ACPI: PCI: Interrupt link L000 configured for IRQ 35
Mar 12 23:47:58.770212 kernel: ACPI: PCI: Interrupt link L001 configured for IRQ 36
Mar 12 23:47:58.770219 kernel: ACPI: PCI: Interrupt link L002 configured for IRQ 37
Mar 12 23:47:58.770227 kernel: ACPI: PCI: Interrupt link L003 configured for IRQ 38
Mar 12 23:47:58.770234 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 12 23:47:58.770391 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 12 23:47:58.770457 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 12 23:47:58.770516 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 12 23:47:58.770573 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 12 23:47:58.770629 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 12 23:47:58.770640 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 12 23:47:58.770648 kernel: PCI host bridge to bus 0000:00
Mar 12 23:47:58.770711 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 12 23:47:58.770764 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 12 23:47:58.770816 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 12 23:47:58.770868 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 12 23:47:58.770949 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Mar 12 23:47:58.771020 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.771081 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Mar 12 23:47:58.771140 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Mar 12 23:47:58.771198 kernel: pci 0000:00:01.0:   bridge window [mem 0x12400000-0x124fffff]
Mar 12 23:47:58.771255 kernel: pci 0000:00:01.0:   bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Mar 12 23:47:58.771333 kernel: pci 0000:00:01.0: enabling Extended Tags
Mar 12 23:47:58.771402 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.771466 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Mar 12 23:47:58.771526 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Mar 12 23:47:58.771584 kernel: pci 0000:00:01.1:   bridge window [mem 0x12300000-0x123fffff]
Mar 12 23:47:58.771641 kernel: pci 0000:00:01.1: enabling Extended Tags
Mar 12 23:47:58.771706 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.771764 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Mar 12 23:47:58.771823 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Mar 12 23:47:58.771880 kernel: pci 0000:00:01.2:   bridge window [mem 0x12200000-0x122fffff]
Mar 12 23:47:58.771938 kernel: pci 0000:00:01.2:   bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Mar 12 23:47:58.771995 kernel: pci 0000:00:01.2: enabling Extended Tags
Mar 12 23:47:58.772060 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.772118 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Mar 12 23:47:58.772176 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Mar 12 23:47:58.772233 kernel: pci 0000:00:01.3:   bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Mar 12 23:47:58.772299 kernel: pci 0000:00:01.3: enabling Extended Tags
Mar 12 23:47:58.772370 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.772430 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Mar 12 23:47:58.772487 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Mar 12 23:47:58.772544 kernel: pci 0000:00:01.4:   bridge window [mem 0x12100000-0x121fffff]
Mar 12 23:47:58.772621 kernel: pci 0000:00:01.4:   bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Mar 12 23:47:58.772682 kernel: pci 0000:00:01.4: enabling Extended Tags
Mar 12 23:47:58.772751 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.772811 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Mar 12 23:47:58.772868 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Mar 12 23:47:58.772926 kernel: pci 0000:00:01.5:   bridge window [mem 0x12000000-0x120fffff]
Mar 12 23:47:58.772983 kernel: pci 0000:00:01.5:   bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Mar 12 23:47:58.773040 kernel: pci 0000:00:01.5: enabling Extended Tags
Mar 12 23:47:58.773105 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.773166 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Mar 12 23:47:58.773223 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Mar 12 23:47:58.773280 kernel: pci 0000:00:01.6: enabling Extended Tags
Mar 12 23:47:58.773361 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.773421 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Mar 12 23:47:58.773478 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Mar 12 23:47:58.773537 kernel: pci 0000:00:01.7: enabling Extended Tags
Mar 12 23:47:58.773601 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.773659 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Mar 12 23:47:58.773716 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Mar 12 23:47:58.773773 kernel: pci 0000:00:02.0: enabling Extended Tags
Mar 12 23:47:58.773837 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.773895 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Mar 12 23:47:58.773954 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Mar 12 23:47:58.774012 kernel: pci 0000:00:02.1: enabling Extended Tags
Mar 12 23:47:58.774076 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.774134 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Mar 12 23:47:58.774191 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Mar 12 23:47:58.774248 kernel: pci 0000:00:02.2: enabling Extended Tags
Mar 12 23:47:58.774331 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.774398 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Mar 12 23:47:58.774456 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Mar 12 23:47:58.774513 kernel: pci 0000:00:02.3: enabling Extended Tags
Mar 12 23:47:58.774577 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.774635 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Mar 12 23:47:58.774692 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Mar 12 23:47:58.774749 kernel: pci 0000:00:02.4: enabling Extended Tags
Mar 12 23:47:58.774816 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.774875 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Mar 12 23:47:58.774932 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Mar 12 23:47:58.774989 kernel: pci 0000:00:02.5: enabling Extended Tags
Mar 12 23:47:58.775054 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.775114 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Mar 12 23:47:58.775173 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Mar 12 23:47:58.775240 kernel: pci 0000:00:02.6: enabling Extended Tags
Mar 12 23:47:58.775319 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.775390 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Mar 12 23:47:58.775452 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Mar 12 23:47:58.775510 kernel: pci 0000:00:02.7: enabling Extended Tags
Mar 12 23:47:58.775578 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.775636 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Mar 12 23:47:58.775700 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Mar 12 23:47:58.775758 kernel: pci 0000:00:03.0: enabling Extended Tags
Mar 12 23:47:58.775821 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.775879 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Mar 12 23:47:58.775936 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Mar 12 23:47:58.775994 kernel: pci 0000:00:03.1:   bridge window [io 0xf000-0xffff]
Mar 12 23:47:58.776051 kernel: pci 0000:00:03.1:   bridge window [mem 0x11e00000-0x11ffffff]
Mar 12 23:47:58.776108 kernel: pci 0000:00:03.1: enabling Extended Tags
Mar 12 23:47:58.776174 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.776232 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Mar 12 23:47:58.776289 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Mar 12 23:47:58.776355 kernel: pci 0000:00:03.2:   bridge window [io 0xe000-0xefff]
Mar 12 23:47:58.776413 kernel: pci 0000:00:03.2:   bridge window [mem 0x11c00000-0x11dfffff]
Mar 12 23:47:58.776471 kernel: pci 0000:00:03.2: enabling Extended Tags
Mar 12 23:47:58.776535 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.776611 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Mar 12 23:47:58.776672 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Mar 12 23:47:58.776729 kernel: pci 0000:00:03.3:   bridge window [io 0xd000-0xdfff]
Mar 12 23:47:58.776786 kernel: pci 0000:00:03.3:   bridge window [mem 0x11a00000-0x11bfffff]
Mar 12 23:47:58.776843 kernel: pci 0000:00:03.3: enabling Extended Tags
Mar 12 23:47:58.776907 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.776967 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Mar 12 23:47:58.777024 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Mar 12 23:47:58.777081 kernel: pci 0000:00:03.4:   bridge window [io 0xc000-0xcfff]
Mar 12 23:47:58.777138 kernel: pci 0000:00:03.4:   bridge window [mem 0x11800000-0x119fffff]
Mar 12 23:47:58.777196 kernel: pci 0000:00:03.4: enabling Extended Tags
Mar 12 23:47:58.777262 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.777336 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Mar 12 23:47:58.777399 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Mar 12 23:47:58.777456 kernel: pci 0000:00:03.5:   bridge window [io 0xb000-0xbfff]
Mar 12 23:47:58.777513 kernel: pci 0000:00:03.5:   bridge window [mem 0x11600000-0x117fffff]
Mar 12 23:47:58.777570 kernel: pci 0000:00:03.5: enabling Extended Tags
Mar 12 23:47:58.777635 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.777694 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Mar 12 23:47:58.777751 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Mar 12 23:47:58.777809 kernel: pci 0000:00:03.6:   bridge window [io 0xa000-0xafff]
Mar 12 23:47:58.777866 kernel: pci 0000:00:03.6:   bridge window [mem 0x11400000-0x115fffff]
Mar 12 23:47:58.777924 kernel: pci 0000:00:03.6: enabling Extended Tags
Mar 12 23:47:58.777988 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.778045 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Mar 12 23:47:58.778102 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Mar 12 23:47:58.778159 kernel: pci 0000:00:03.7:   bridge window [io 0x9000-0x9fff]
Mar 12 23:47:58.778218 kernel: pci 0000:00:03.7:   bridge window [mem 0x11200000-0x113fffff]
Mar 12 23:47:58.778275 kernel: pci 0000:00:03.7: enabling Extended Tags
Mar 12 23:47:58.778351 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.778412 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Mar 12 23:47:58.778472 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Mar 12 23:47:58.778531 kernel: pci 0000:00:04.0:   bridge window [io 0x8000-0x8fff]
Mar 12 23:47:58.778600 kernel: pci 0000:00:04.0:   bridge window [mem 0x11000000-0x111fffff]
Mar 12 23:47:58.778664 kernel: pci 0000:00:04.0: enabling Extended Tags
Mar 12 23:47:58.778737 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.778795 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Mar 12 23:47:58.778853 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Mar 12 23:47:58.778916 kernel: pci 0000:00:04.1:   bridge window [io 0x7000-0x7fff]
Mar 12 23:47:58.778974 kernel: pci 0000:00:04.1:   bridge window [mem 0x10e00000-0x10ffffff]
Mar 12 23:47:58.779033 kernel: pci 0000:00:04.1: enabling Extended Tags
Mar 12 23:47:58.779102 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.779176 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Mar 12 23:47:58.779239 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Mar 12 23:47:58.779308 kernel: pci 0000:00:04.2:   bridge window [io 0x6000-0x6fff]
Mar 12 23:47:58.779371 kernel: pci 0000:00:04.2:   bridge window [mem 0x10c00000-0x10dfffff]
Mar 12 23:47:58.779429 kernel: pci 0000:00:04.2: enabling Extended Tags
Mar 12 23:47:58.779501 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.779561 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Mar 12 23:47:58.779619 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Mar 12 23:47:58.779676 kernel: pci 0000:00:04.3:   bridge window [io 0x5000-0x5fff]
Mar 12 23:47:58.779733 kernel: pci 0000:00:04.3:   bridge window [mem 0x10a00000-0x10bfffff]
Mar 12 23:47:58.779791 kernel: pci 0000:00:04.3: enabling Extended Tags
Mar 12 23:47:58.779857 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.779923 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Mar 12 23:47:58.779982 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Mar 12 23:47:58.780039 kernel: pci 0000:00:04.4:   bridge window [io 0x4000-0x4fff]
Mar 12 23:47:58.780098 kernel: pci 0000:00:04.4:   bridge window [mem 0x10800000-0x109fffff]
Mar 12 23:47:58.780166 kernel: pci 0000:00:04.4: enabling Extended Tags
Mar 12 23:47:58.780235 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.780308 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Mar 12 23:47:58.780370 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Mar 12 23:47:58.780427 kernel: pci 0000:00:04.5:   bridge window [io 0x3000-0x3fff]
Mar 12 23:47:58.780486 kernel: pci 0000:00:04.5:   bridge window [mem 0x10600000-0x107fffff]
Mar 12 23:47:58.780544 kernel: pci 0000:00:04.5: enabling Extended Tags
Mar 12 23:47:58.780659 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.780723 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Mar 12 23:47:58.780781 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Mar 12 23:47:58.780840 kernel: pci 0000:00:04.6:   bridge window [io 0x2000-0x2fff]
Mar 12 23:47:58.780899 kernel: pci 0000:00:04.6:   bridge window [mem 0x10400000-0x105fffff]
Mar 12 23:47:58.780958 kernel: pci 0000:00:04.6: enabling Extended Tags
Mar 12 23:47:58.781027 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.781086 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Mar 12 23:47:58.781144 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Mar 12 23:47:58.781202 kernel: pci 0000:00:04.7:   bridge window [io 0x1000-0x1fff]
Mar 12 23:47:58.781259 kernel: pci 0000:00:04.7:   bridge window [mem 0x10200000-0x103fffff]
Mar 12 23:47:58.781333 kernel: pci 0000:00:04.7: enabling Extended Tags
Mar 12 23:47:58.781402 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:47:58.781464 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Mar 12 23:47:58.781522 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Mar 12 23:47:58.781579 kernel: pci 0000:00:05.0:   bridge window [io 0x0000-0x0fff]
Mar 12 23:47:58.781637 kernel: pci 0000:00:05.0:   bridge window [mem 0x10000000-0x101fffff]
Mar 12 23:47:58.781695 kernel: pci 0000:00:05.0: enabling Extended Tags
Mar 12 23:47:58.781773 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 12 23:47:58.781835 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Mar 12 23:47:58.781899 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 12 23:47:58.781960 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 12 23:47:58.782021 kernel: pci 0000:01:00.0: enabling Extended Tags
Mar 12 23:47:58.782091 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 12 23:47:58.782152 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Mar 12 23:47:58.782211 kernel: pci 0000:02:00.0: enabling Extended Tags
Mar 12 23:47:58.782279 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Mar 12 23:47:58.782357 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Mar 12 23:47:58.782424 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 12 23:47:58.782485 kernel: pci 0000:03:00.0: enabling Extended Tags
Mar 12 23:47:58.782553 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 12 23:47:58.782614 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 12 23:47:58.782674 kernel: pci 0000:04:00.0: enabling Extended Tags
Mar 12 23:47:58.782741 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 12 23:47:58.782805 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Mar 12 23:47:58.782867 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 12 23:47:58.782928 kernel: pci 0000:05:00.0: enabling Extended Tags
Mar 12 23:47:58.782997 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Mar 12 23:47:58.783059 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Mar 12 23:47:58.783120 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 12 23:47:58.783181 kernel: pci 0000:06:00.0: 
enabling Extended Tags Mar 12 23:47:58.783245 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Mar 12 23:47:58.783314 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Mar 12 23:47:58.783374 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Mar 12 23:47:58.783437 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Mar 12 23:47:58.783495 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Mar 12 23:47:58.783553 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Mar 12 23:47:58.783615 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Mar 12 23:47:58.783678 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Mar 12 23:47:58.783738 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Mar 12 23:47:58.783800 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Mar 12 23:47:58.783860 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Mar 12 23:47:58.783918 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Mar 12 23:47:58.783981 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Mar 12 23:47:58.784040 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Mar 12 23:47:58.784101 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Mar 12 
23:47:58.784162 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Mar 12 23:47:58.784221 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Mar 12 23:47:58.784280 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Mar 12 23:47:58.784366 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 12 23:47:58.784427 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Mar 12 23:47:58.784488 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Mar 12 23:47:58.784557 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 12 23:47:58.784635 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Mar 12 23:47:58.784698 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Mar 12 23:47:58.784761 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 12 23:47:58.784820 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Mar 12 23:47:58.784877 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Mar 12 23:47:58.784940 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Mar 12 23:47:58.785002 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Mar 12 23:47:58.785061 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Mar 12 23:47:58.785123 kernel: pci 0000:00:02.2: 
bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Mar 12 23:47:58.785182 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Mar 12 23:47:58.785240 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Mar 12 23:47:58.785310 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Mar 12 23:47:58.785370 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Mar 12 23:47:58.785431 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Mar 12 23:47:58.785493 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Mar 12 23:47:58.785552 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Mar 12 23:47:58.785610 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Mar 12 23:47:58.785672 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Mar 12 23:47:58.785729 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Mar 12 23:47:58.785791 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Mar 12 23:47:58.785853 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Mar 12 23:47:58.785911 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Mar 12 23:47:58.785969 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Mar 12 23:47:58.786031 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] 
add_size 1000 Mar 12 23:47:58.786089 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Mar 12 23:47:58.786147 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Mar 12 23:47:58.786211 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Mar 12 23:47:58.786270 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Mar 12 23:47:58.786336 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Mar 12 23:47:58.786399 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Mar 12 23:47:58.786458 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Mar 12 23:47:58.786515 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Mar 12 23:47:58.786577 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Mar 12 23:47:58.786638 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Mar 12 23:47:58.786696 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Mar 12 23:47:58.786758 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Mar 12 23:47:58.786818 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Mar 12 23:47:58.786877 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Mar 12 23:47:58.786941 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Mar 12 23:47:58.787000 kernel: 
pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Mar 12 23:47:58.787057 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Mar 12 23:47:58.787122 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Mar 12 23:47:58.787180 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Mar 12 23:47:58.787238 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Mar 12 23:47:58.787312 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Mar 12 23:47:58.787374 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Mar 12 23:47:58.787432 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Mar 12 23:47:58.787493 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Mar 12 23:47:58.787553 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Mar 12 23:47:58.787611 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Mar 12 23:47:58.787673 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Mar 12 23:47:58.787732 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Mar 12 23:47:58.787790 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Mar 12 23:47:58.787851 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Mar 12 23:47:58.787909 kernel: pci 0000:00:04.1: bridge window [mem 
0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Mar 12 23:47:58.787966 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Mar 12 23:47:58.788029 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Mar 12 23:47:58.788086 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Mar 12 23:47:58.788144 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Mar 12 23:47:58.788206 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Mar 12 23:47:58.788264 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Mar 12 23:47:58.788332 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Mar 12 23:47:58.788395 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Mar 12 23:47:58.788459 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Mar 12 23:47:58.788517 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Mar 12 23:47:58.788596 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Mar 12 23:47:58.788664 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Mar 12 23:47:58.788725 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Mar 12 23:47:58.788787 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Mar 12 23:47:58.788847 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 
1f] add_size 200000 add_align 100000 Mar 12 23:47:58.788908 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Mar 12 23:47:58.788970 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Mar 12 23:47:58.789028 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Mar 12 23:47:58.789086 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Mar 12 23:47:58.789145 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Mar 12 23:47:58.789204 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Mar 12 23:47:58.789262 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Mar 12 23:47:58.789336 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Mar 12 23:47:58.789415 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Mar 12 23:47:58.789481 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Mar 12 23:47:58.789541 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Mar 12 23:47:58.789602 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Mar 12 23:47:58.789661 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Mar 12 23:47:58.789734 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Mar 12 23:47:58.789798 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Mar 12 23:47:58.789860 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Mar 12 23:47:58.789925 kernel: pci 0000:00:01.4: bridge window [mem 
0x8000800000-0x80009fffff 64bit pref]: assigned Mar 12 23:47:58.789986 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Mar 12 23:47:58.790045 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Mar 12 23:47:58.790104 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Mar 12 23:47:58.790163 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Mar 12 23:47:58.790223 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Mar 12 23:47:58.790282 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Mar 12 23:47:58.790362 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Mar 12 23:47:58.790422 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Mar 12 23:47:58.790483 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Mar 12 23:47:58.790542 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Mar 12 23:47:58.790601 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Mar 12 23:47:58.790660 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Mar 12 23:47:58.790720 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Mar 12 23:47:58.790778 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Mar 12 23:47:58.790840 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Mar 12 23:47:58.790899 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Mar 12 23:47:58.790958 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Mar 12 23:47:58.791017 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: 
assigned Mar 12 23:47:58.791077 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Mar 12 23:47:58.791135 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Mar 12 23:47:58.791195 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Mar 12 23:47:58.791253 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Mar 12 23:47:58.791324 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Mar 12 23:47:58.791385 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Mar 12 23:47:58.791444 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Mar 12 23:47:58.791502 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Mar 12 23:47:58.791561 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Mar 12 23:47:58.791619 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Mar 12 23:47:58.791678 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Mar 12 23:47:58.791737 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Mar 12 23:47:58.791799 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Mar 12 23:47:58.791858 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Mar 12 23:47:58.791916 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Mar 12 23:47:58.791975 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Mar 12 23:47:58.792034 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Mar 12 23:47:58.792092 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Mar 12 23:47:58.792152 kernel: pci 
0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Mar 12 23:47:58.792212 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Mar 12 23:47:58.792274 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Mar 12 23:47:58.792347 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Mar 12 23:47:58.792411 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Mar 12 23:47:58.792470 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Mar 12 23:47:58.792533 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Mar 12 23:47:58.792609 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Mar 12 23:47:58.792675 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Mar 12 23:47:58.792739 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Mar 12 23:47:58.792801 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Mar 12 23:47:58.792861 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Mar 12 23:47:58.792921 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Mar 12 23:47:58.792980 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Mar 12 23:47:58.793041 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Mar 12 23:47:58.793100 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Mar 12 23:47:58.793159 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Mar 12 23:47:58.793219 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Mar 12 23:47:58.793278 kernel: pci 0000:00:05.0: bridge window [mem 
0x14000000-0x141fffff]: assigned Mar 12 23:47:58.793351 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Mar 12 23:47:58.793413 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Mar 12 23:47:58.793473 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Mar 12 23:47:58.793536 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Mar 12 23:47:58.793594 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Mar 12 23:47:58.793654 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Mar 12 23:47:58.793713 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Mar 12 23:47:58.793773 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Mar 12 23:47:58.793832 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Mar 12 23:47:58.793892 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Mar 12 23:47:58.793950 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Mar 12 23:47:58.794011 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Mar 12 23:47:58.794070 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Mar 12 23:47:58.794129 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Mar 12 23:47:58.794187 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Mar 12 23:47:58.794247 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Mar 12 23:47:58.794322 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Mar 12 23:47:58.794385 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Mar 12 23:47:58.794443 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Mar 12 23:47:58.794507 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Mar 12 23:47:58.794565 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned 
Mar 12 23:47:58.794624 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Mar 12 23:47:58.794682 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Mar 12 23:47:58.794741 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Mar 12 23:47:58.794799 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Mar 12 23:47:58.794860 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Mar 12 23:47:58.794918 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Mar 12 23:47:58.794978 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Mar 12 23:47:58.795036 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Mar 12 23:47:58.795095 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Mar 12 23:47:58.795154 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Mar 12 23:47:58.795214 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Mar 12 23:47:58.795273 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.795342 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.795406 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Mar 12 23:47:58.795468 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.795526 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.795585 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Mar 12 23:47:58.795643 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.795700 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.795760 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Mar 12 23:47:58.795819 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; 
no space Mar 12 23:47:58.795878 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.795938 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Mar 12 23:47:58.795996 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.796054 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.796114 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Mar 12 23:47:58.796172 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.796245 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.796316 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Mar 12 23:47:58.796378 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.796444 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.796505 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Mar 12 23:47:58.796570 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.796648 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.796709 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Mar 12 23:47:58.796767 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.796826 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.796886 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Mar 12 23:47:58.796945 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.797005 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.797064 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Mar 12 23:47:58.797122 
kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.797180 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.797240 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Mar 12 23:47:58.797315 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.797379 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.797442 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Mar 12 23:47:58.797500 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.797558 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.797617 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Mar 12 23:47:58.797675 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.797736 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.797795 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Mar 12 23:47:58.797853 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.797911 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.797971 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Mar 12 23:47:58.798030 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.798090 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.798150 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Mar 12 23:47:58.798208 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.798266 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.798336 kernel: pci 0000:00:05.0: 
BAR 0 [mem 0x14220000-0x14220fff]: assigned Mar 12 23:47:58.798396 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.798456 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.798515 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Mar 12 23:47:58.798573 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Mar 12 23:47:58.798631 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Mar 12 23:47:58.798689 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Mar 12 23:47:58.798747 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Mar 12 23:47:58.798805 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Mar 12 23:47:58.798865 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Mar 12 23:47:58.798923 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Mar 12 23:47:58.798982 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Mar 12 23:47:58.799041 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Mar 12 23:47:58.799099 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Mar 12 23:47:58.799158 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Mar 12 23:47:58.799217 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Mar 12 23:47:58.799277 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Mar 12 23:47:58.799351 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Mar 12 23:47:58.799412 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.799471 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.799530 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.799590 kernel: pci 0000:00:03.0: bridge window [io 
size 0x1000]: failed to assign Mar 12 23:47:58.799651 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.799709 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.799770 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.799831 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.799890 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.799949 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800009 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800067 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800127 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800187 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800249 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800319 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800382 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800443 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800503 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800561 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800635 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800695 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800759 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800817 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800878 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.800937 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.800998 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.801059 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.801121 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.801179 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.801238 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.801311 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.801373 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.801433 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.801495 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:47:58.804429 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:47:58.804544 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Mar 12 23:47:58.804624 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Mar 12 23:47:58.804689 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Mar 12 23:47:58.804748 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 12 23:47:58.804815 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Mar 12 23:47:58.804874 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Mar 12 23:47:58.804939 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Mar 12 23:47:58.804999 kernel: pci 0000:00:01.1: PCI bridge 
to [bus 02] Mar 12 23:47:58.805058 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Mar 12 23:47:58.805115 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Mar 12 23:47:58.805182 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Mar 12 23:47:58.805242 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Mar 12 23:47:58.805325 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Mar 12 23:47:58.805388 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Mar 12 23:47:58.805448 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Mar 12 23:47:58.805519 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Mar 12 23:47:58.805582 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Mar 12 23:47:58.805641 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Mar 12 23:47:58.805700 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Mar 12 23:47:58.805766 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Mar 12 23:47:58.805829 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Mar 12 23:47:58.805888 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Mar 12 23:47:58.805948 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Mar 12 23:47:58.806005 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Mar 12 23:47:58.806070 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Mar 12 23:47:58.806130 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Mar 12 23:47:58.806190 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Mar 12 23:47:58.806249 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Mar 12 23:47:58.806319 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 
12 23:47:58.806381 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Mar 12 23:47:58.806440 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Mar 12 23:47:58.806498 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 12 23:47:58.806558 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Mar 12 23:47:58.806617 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Mar 12 23:47:58.806677 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 12 23:47:58.806738 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Mar 12 23:47:58.806797 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Mar 12 23:47:58.806855 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Mar 12 23:47:58.806915 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Mar 12 23:47:58.806974 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Mar 12 23:47:58.807032 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Mar 12 23:47:58.807094 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Mar 12 23:47:58.807153 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Mar 12 23:47:58.807211 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Mar 12 23:47:58.807271 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Mar 12 23:47:58.807345 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Mar 12 23:47:58.807406 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Mar 12 23:47:58.807469 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Mar 12 23:47:58.807529 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Mar 12 23:47:58.807587 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Mar 12 23:47:58.807648 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Mar 12 23:47:58.807707 kernel: pci 0000:00:02.5: 
bridge window [mem 0x11a00000-0x11bfffff] Mar 12 23:47:58.807764 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Mar 12 23:47:58.807828 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Mar 12 23:47:58.807888 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Mar 12 23:47:58.807946 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Mar 12 23:47:58.808008 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Mar 12 23:47:58.808069 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Mar 12 23:47:58.808127 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Mar 12 23:47:58.808190 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Mar 12 23:47:58.808251 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Mar 12 23:47:58.808334 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Mar 12 23:47:58.808400 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Mar 12 23:47:58.808460 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff] Mar 12 23:47:58.808519 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Mar 12 23:47:58.808599 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Mar 12 23:47:58.808667 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Mar 12 23:47:58.808727 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Mar 12 23:47:58.808786 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Mar 12 23:47:58.808851 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Mar 12 23:47:58.808910 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Mar 12 23:47:58.808968 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Mar 12 23:47:58.809027 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Mar 12 23:47:58.809087 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] 
Mar 12 23:47:58.809146 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Mar 12 23:47:58.809204 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Mar 12 23:47:58.809262 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Mar 12 23:47:58.809345 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Mar 12 23:47:58.809407 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Mar 12 23:47:58.809465 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Mar 12 23:47:58.809524 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Mar 12 23:47:58.809585 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Mar 12 23:47:58.809644 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Mar 12 23:47:58.809702 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Mar 12 23:47:58.809759 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Mar 12 23:47:58.809821 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Mar 12 23:47:58.809882 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Mar 12 23:47:58.809940 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Mar 12 23:47:58.809998 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Mar 12 23:47:58.810059 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Mar 12 23:47:58.810118 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Mar 12 23:47:58.810175 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Mar 12 23:47:58.810233 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Mar 12 23:47:58.810301 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Mar 12 23:47:58.810366 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Mar 12 23:47:58.810424 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Mar 12 23:47:58.810484 kernel: pci 0000:00:04.1: bridge window [mem 
0x8003200000-0x80033fffff 64bit pref] Mar 12 23:47:58.810545 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Mar 12 23:47:58.810604 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Mar 12 23:47:58.810665 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Mar 12 23:47:58.810723 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Mar 12 23:47:58.810784 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Mar 12 23:47:58.810846 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Mar 12 23:47:58.810904 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Mar 12 23:47:58.810962 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Mar 12 23:47:58.811023 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Mar 12 23:47:58.811082 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Mar 12 23:47:58.811140 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Mar 12 23:47:58.811198 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Mar 12 23:47:58.811259 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Mar 12 23:47:58.811327 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Mar 12 23:47:58.811389 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Mar 12 23:47:58.811448 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Mar 12 23:47:58.811509 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Mar 12 23:47:58.811569 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Mar 12 23:47:58.811627 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Mar 12 23:47:58.811686 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Mar 12 23:47:58.811747 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Mar 12 23:47:58.811805 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Mar 12 23:47:58.811865 kernel: pci 0000:00:04.7: 
bridge window [mem 0x13e00000-0x13ffffff] Mar 12 23:47:58.811924 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Mar 12 23:47:58.811986 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Mar 12 23:47:58.812045 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Mar 12 23:47:58.812103 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Mar 12 23:47:58.812160 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Mar 12 23:47:58.812223 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 12 23:47:58.812278 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 12 23:47:58.812341 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 12 23:47:58.812408 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Mar 12 23:47:58.812464 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Mar 12 23:47:58.812525 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Mar 12 23:47:58.812594 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Mar 12 23:47:58.812664 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Mar 12 23:47:58.812723 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Mar 12 23:47:58.812790 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Mar 12 23:47:58.812847 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Mar 12 23:47:58.812907 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Mar 12 23:47:58.812962 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Mar 12 23:47:58.813022 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Mar 12 23:47:58.813079 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 12 23:47:58.813139 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] 
Mar 12 23:47:58.813195 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 12 23:47:58.813256 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Mar 12 23:47:58.813344 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 12 23:47:58.813414 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Mar 12 23:47:58.813472 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Mar 12 23:47:58.813534 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Mar 12 23:47:58.813588 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Mar 12 23:47:58.813648 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Mar 12 23:47:58.813703 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Mar 12 23:47:58.813763 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Mar 12 23:47:58.813820 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Mar 12 23:47:58.813885 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Mar 12 23:47:58.813939 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Mar 12 23:47:58.814010 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Mar 12 23:47:58.814066 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Mar 12 23:47:58.814127 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Mar 12 23:47:58.814181 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Mar 12 23:47:58.814244 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Mar 12 23:47:58.814324 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Mar 12 23:47:58.814390 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Mar 12 23:47:58.814445 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Mar 12 
23:47:58.814506 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Mar 12 23:47:58.814563 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Mar 12 23:47:58.814623 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Mar 12 23:47:58.814677 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Mar 12 23:47:58.814730 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Mar 12 23:47:58.814790 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Mar 12 23:47:58.814845 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Mar 12 23:47:58.814901 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Mar 12 23:47:58.814966 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Mar 12 23:47:58.815021 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Mar 12 23:47:58.815075 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Mar 12 23:47:58.815134 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Mar 12 23:47:58.815189 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Mar 12 23:47:58.815242 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Mar 12 23:47:58.815343 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Mar 12 23:47:58.815401 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Mar 12 23:47:58.815455 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Mar 12 23:47:58.815515 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Mar 12 23:47:58.815582 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Mar 12 23:47:58.815637 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Mar 12 23:47:58.815697 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Mar 12 23:47:58.815755 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Mar 12 23:47:58.815809 kernel: pci_bus 0000:19: 
resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Mar 12 23:47:58.815868 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Mar 12 23:47:58.815923 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Mar 12 23:47:58.815976 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Mar 12 23:47:58.816036 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Mar 12 23:47:58.816090 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Mar 12 23:47:58.816146 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Mar 12 23:47:58.816205 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Mar 12 23:47:58.816260 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Mar 12 23:47:58.816330 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Mar 12 23:47:58.816392 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Mar 12 23:47:58.816446 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Mar 12 23:47:58.816500 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Mar 12 23:47:58.816568 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Mar 12 23:47:58.816653 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Mar 12 23:47:58.816711 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Mar 12 23:47:58.816773 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Mar 12 23:47:58.816827 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Mar 12 23:47:58.816881 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Mar 12 23:47:58.816944 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Mar 12 23:47:58.816998 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Mar 12 23:47:58.817052 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Mar 12 23:47:58.817113 kernel: pci_bus 0000:21: resource 0 [io 
0x1000-0x1fff] Mar 12 23:47:58.817168 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Mar 12 23:47:58.817222 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Mar 12 23:47:58.817231 kernel: iommu: Default domain type: Translated Mar 12 23:47:58.817240 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 12 23:47:58.817248 kernel: efivars: Registered efivars operations Mar 12 23:47:58.817255 kernel: vgaarb: loaded Mar 12 23:47:58.817263 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 12 23:47:58.817270 kernel: VFS: Disk quotas dquot_6.6.0 Mar 12 23:47:58.817278 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 12 23:47:58.817285 kernel: pnp: PnP ACPI init Mar 12 23:47:58.817374 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 12 23:47:58.817387 kernel: pnp: PnP ACPI: found 1 devices Mar 12 23:47:58.817397 kernel: NET: Registered PF_INET protocol family Mar 12 23:47:58.817404 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 12 23:47:58.817412 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Mar 12 23:47:58.817420 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 12 23:47:58.817428 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 12 23:47:58.817435 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Mar 12 23:47:58.817443 kernel: TCP: Hash tables configured (established 131072 bind 65536) Mar 12 23:47:58.817451 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Mar 12 23:47:58.817458 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Mar 12 23:47:58.817467 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 12 23:47:58.817533 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Mar 12 23:47:58.817543 
kernel: PCI: CLS 0 bytes, default 64 Mar 12 23:47:58.817551 kernel: kvm [1]: HYP mode not available Mar 12 23:47:58.817558 kernel: Initialise system trusted keyrings Mar 12 23:47:58.817565 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Mar 12 23:47:58.817573 kernel: Key type asymmetric registered Mar 12 23:47:58.817580 kernel: Asymmetric key parser 'x509' registered Mar 12 23:47:58.817587 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 12 23:47:58.817597 kernel: io scheduler mq-deadline registered Mar 12 23:47:58.817604 kernel: io scheduler kyber registered Mar 12 23:47:58.817611 kernel: io scheduler bfq registered Mar 12 23:47:58.817619 kernel: ACPI: \_SB_.L001: Enabled at IRQ 36 Mar 12 23:47:58.817680 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Mar 12 23:47:58.817740 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Mar 12 23:47:58.817810 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.817873 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Mar 12 23:47:58.817935 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Mar 12 23:47:58.817992 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.818053 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Mar 12 23:47:58.818111 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Mar 12 23:47:58.818170 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.818230 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Mar 12 23:47:58.818288 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Mar 12 23:47:58.818365 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.818431 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Mar 12 23:47:58.818490 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Mar 12 23:47:58.818548 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.818608 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Mar 12 23:47:58.818666 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Mar 12 23:47:58.818725 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.818786 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Mar 12 23:47:58.818846 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Mar 12 23:47:58.818916 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.818978 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Mar 12 23:47:58.819036 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Mar 12 23:47:58.819095 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.819105 kernel: ACPI: \_SB_.L002: Enabled at IRQ 37 Mar 12 23:47:58.819164 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Mar 12 23:47:58.819223 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Mar 12 23:47:58.819282 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.819370 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Mar 12 23:47:58.819430 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Mar 12 23:47:58.819488 kernel: pcieport 0000:00:02.1: 
pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.819551 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Mar 12 23:47:58.819610 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Mar 12 23:47:58.819668 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.819728 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Mar 12 23:47:58.819789 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Mar 12 23:47:58.819847 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.819908 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Mar 12 23:47:58.819966 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Mar 12 23:47:58.820024 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.820084 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Mar 12 23:47:58.820143 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Mar 12 23:47:58.820201 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.820263 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Mar 12 23:47:58.820334 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Mar 12 23:47:58.820394 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.820466 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Mar 12 23:47:58.820529 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Mar 12 23:47:58.820601 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 
AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.820612 kernel: ACPI: \_SB_.L003: Enabled at IRQ 38 Mar 12 23:47:58.820675 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Mar 12 23:47:58.820738 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Mar 12 23:47:58.820797 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.820857 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Mar 12 23:47:58.820921 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Mar 12 23:47:58.820981 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.821043 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Mar 12 23:47:58.821101 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Mar 12 23:47:58.821159 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.821228 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Mar 12 23:47:58.821289 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Mar 12 23:47:58.821358 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.821420 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Mar 12 23:47:58.821479 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Mar 12 23:47:58.821538 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:47:58.821598 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Mar 12 23:47:58.821659 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Mar 12 23:47:58.821718 
kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.821779 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72
Mar 12 23:47:58.821837 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72
Mar 12 23:47:58.821895 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.821956 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73
Mar 12 23:47:58.822015 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73
Mar 12 23:47:58.822076 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.822088 kernel: ACPI: \_SB_.L000: Enabled at IRQ 35
Mar 12 23:47:58.822148 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74
Mar 12 23:47:58.822206 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74
Mar 12 23:47:58.822264 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.822335 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75
Mar 12 23:47:58.822397 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75
Mar 12 23:47:58.822455 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.822516 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76
Mar 12 23:47:58.822578 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76
Mar 12 23:47:58.822636 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.822697 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77
Mar 12 23:47:58.822755 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77
Mar 12 23:47:58.822813 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.822874 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78
Mar 12 23:47:58.822933 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78
Mar 12 23:47:58.822991 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.823053 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79
Mar 12 23:47:58.823112 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79
Mar 12 23:47:58.823170 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.823231 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80
Mar 12 23:47:58.823290 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80
Mar 12 23:47:58.823359 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.823421 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81
Mar 12 23:47:58.823482 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81
Mar 12 23:47:58.823540 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.823601 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82
Mar 12 23:47:58.823659 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82
Mar 12 23:47:58.823717 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 12 23:47:58.823727 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Mar 12 23:47:58.823735 kernel: ACPI: button: Power Button [PWRB]
Mar 12 23:47:58.823798 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002)
Mar 12 23:47:58.823864 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Mar 12 23:47:58.823875 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 12 23:47:58.823882 kernel: thunder_xcv, ver 1.0
Mar 12 23:47:58.823890 kernel: thunder_bgx, ver 1.0
Mar 12 23:47:58.823897 kernel: nicpf, ver 1.0
Mar 12 23:47:58.823904 kernel: nicvf, ver 1.0
Mar 12 23:47:58.823975 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 12 23:47:58.824032 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-12T23:47:58 UTC (1773359278)
Mar 12 23:47:58.824055 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 12 23:47:58.824064 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Mar 12 23:47:58.824072 kernel: watchdog: NMI not fully supported
Mar 12 23:47:58.824080 kernel: watchdog: Hard watchdog permanently disabled
Mar 12 23:47:58.824088 kernel: NET: Registered PF_INET6 protocol family
Mar 12 23:47:58.824096 kernel: Segment Routing with IPv6
Mar 12 23:47:58.824103 kernel: In-situ OAM (IOAM) with IPv6
Mar 12 23:47:58.824111 kernel: NET: Registered PF_PACKET protocol family
Mar 12 23:47:58.824119 kernel: Key type dns_resolver registered
Mar 12 23:47:58.824127 kernel: registered taskstats version 1
Mar 12 23:47:58.824135 kernel: Loading compiled-in X.509 certificates
Mar 12 23:47:58.824143 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 653709f5ad64856a37b70c07139630123477ee1c'
Mar 12 23:47:58.824151 kernel: Demotion targets for Node 0: null
Mar 12 23:47:58.824159 kernel: Key type .fscrypt registered
Mar 12 23:47:58.824168 kernel: Key type fscrypt-provisioning registered
Mar 12 23:47:58.824175 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 12 23:47:58.824183 kernel: ima: Allocated hash algorithm: sha1
Mar 12 23:47:58.824191 kernel: ima: No architecture policies found
Mar 12 23:47:58.824200 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 12 23:47:58.824208 kernel: clk: Disabling unused clocks
Mar 12 23:47:58.824216 kernel: PM: genpd: Disabling unused power domains
Mar 12 23:47:58.824223 kernel: Warning: unable to open an initial console.
Mar 12 23:47:58.824231 kernel: Freeing unused kernel memory: 39552K
Mar 12 23:47:58.824239 kernel: Run /init as init process
Mar 12 23:47:58.824247 kernel: with arguments:
Mar 12 23:47:58.824255 kernel: /init
Mar 12 23:47:58.824262 kernel: with environment:
Mar 12 23:47:58.824271 kernel: HOME=/
Mar 12 23:47:58.824279 kernel: TERM=linux
Mar 12 23:47:58.824288 systemd[1]: Successfully made /usr/ read-only.
Mar 12 23:47:58.824311 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 12 23:47:58.824319 systemd[1]: Detected virtualization kvm.
Mar 12 23:47:58.824327 systemd[1]: Detected architecture arm64.
Mar 12 23:47:58.824335 systemd[1]: Running in initrd.
Mar 12 23:47:58.824344 systemd[1]: No hostname configured, using default hostname.
Mar 12 23:47:58.824352 systemd[1]: Hostname set to .
Mar 12 23:47:58.824360 systemd[1]: Initializing machine ID from VM UUID.
Mar 12 23:47:58.824368 systemd[1]: Queued start job for default target initrd.target.
Mar 12 23:47:58.824377 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 12 23:47:58.824385 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 23:47:58.824393 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 12 23:47:58.824401 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 12 23:47:58.824411 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 12 23:47:58.824419 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 12 23:47:58.824428 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 12 23:47:58.824436 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 12 23:47:58.824444 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 12 23:47:58.824453 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 23:47:58.824461 systemd[1]: Reached target paths.target - Path Units.
Mar 12 23:47:58.824470 systemd[1]: Reached target slices.target - Slice Units.
Mar 12 23:47:58.824478 systemd[1]: Reached target swap.target - Swaps.
Mar 12 23:47:58.824486 systemd[1]: Reached target timers.target - Timer Units.
Mar 12 23:47:58.824494 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 12 23:47:58.824502 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 12 23:47:58.824510 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 12 23:47:58.824518 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 12 23:47:58.824526 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 23:47:58.824535 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 12 23:47:58.824544 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 23:47:58.824552 systemd[1]: Reached target sockets.target - Socket Units.
Mar 12 23:47:58.824560 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 12 23:47:58.824568 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 23:47:58.824587 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 12 23:47:58.824597 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 12 23:47:58.824606 systemd[1]: Starting systemd-fsck-usr.service...
Mar 12 23:47:58.824616 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 23:47:58.824625 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 23:47:58.824633 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:47:58.824642 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 12 23:47:58.824650 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 23:47:58.824658 systemd[1]: Finished systemd-fsck-usr.service.
Mar 12 23:47:58.824668 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 12 23:47:58.824701 systemd-journald[310]: Collecting audit messages is disabled.
Mar 12 23:47:58.824721 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:47:58.824731 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 12 23:47:58.824739 kernel: Bridge firewalling registered
Mar 12 23:47:58.824747 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 12 23:47:58.824755 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 23:47:58.824764 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 23:47:58.824772 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 12 23:47:58.824781 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 23:47:58.824789 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 23:47:58.824798 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 23:47:58.824807 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 23:47:58.824815 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 12 23:47:58.824825 systemd-journald[310]: Journal started
Mar 12 23:47:58.824843 systemd-journald[310]: Runtime Journal (/run/log/journal/fc66c3c669d143f09d1941ddea271ad6) is 8M, max 319.5M, 311.5M free.
Mar 12 23:47:58.764421 systemd-modules-load[312]: Inserted module 'overlay'
Mar 12 23:47:58.826572 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 23:47:58.779213 systemd-modules-load[312]: Inserted module 'br_netfilter'
Mar 12 23:47:58.828497 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 23:47:58.831666 dracut-cmdline[342]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d
Mar 12 23:47:58.840832 systemd-tmpfiles[353]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 12 23:47:58.843937 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 23:47:58.847018 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 23:47:58.884895 systemd-resolved[386]: Positive Trust Anchors:
Mar 12 23:47:58.884915 systemd-resolved[386]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 23:47:58.884945 systemd-resolved[386]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 23:47:58.890413 systemd-resolved[386]: Defaulting to hostname 'linux'.
Mar 12 23:47:58.891638 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 23:47:58.893947 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 23:47:58.908315 kernel: SCSI subsystem initialized
Mar 12 23:47:58.913308 kernel: Loading iSCSI transport class v2.0-870.
Mar 12 23:47:58.921340 kernel: iscsi: registered transport (tcp)
Mar 12 23:47:58.933548 kernel: iscsi: registered transport (qla4xxx)
Mar 12 23:47:58.933631 kernel: QLogic iSCSI HBA Driver
Mar 12 23:47:58.949742 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 12 23:47:58.971181 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 23:47:58.972778 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 12 23:47:59.017889 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 12 23:47:59.019777 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 12 23:47:59.077354 kernel: raid6: neonx8 gen() 15722 MB/s
Mar 12 23:47:59.094356 kernel: raid6: neonx4 gen() 15763 MB/s
Mar 12 23:47:59.111345 kernel: raid6: neonx2 gen() 13167 MB/s
Mar 12 23:47:59.128345 kernel: raid6: neonx1 gen() 10422 MB/s
Mar 12 23:47:59.145343 kernel: raid6: int64x8 gen() 6878 MB/s
Mar 12 23:47:59.162343 kernel: raid6: int64x4 gen() 7312 MB/s
Mar 12 23:47:59.179343 kernel: raid6: int64x2 gen() 6087 MB/s
Mar 12 23:47:59.196392 kernel: raid6: int64x1 gen() 5017 MB/s
Mar 12 23:47:59.196409 kernel: raid6: using algorithm neonx4 gen() 15763 MB/s
Mar 12 23:47:59.214357 kernel: raid6: .... xor() 12309 MB/s, rmw enabled
Mar 12 23:47:59.214408 kernel: raid6: using neon recovery algorithm
Mar 12 23:47:59.219317 kernel: xor: measuring software checksum speed
Mar 12 23:47:59.219335 kernel: 8regs : 18995 MB/sec
Mar 12 23:47:59.220425 kernel: 32regs : 21687 MB/sec
Mar 12 23:47:59.221600 kernel: arm64_neon : 28109 MB/sec
Mar 12 23:47:59.221616 kernel: xor: using function: arm64_neon (28109 MB/sec)
Mar 12 23:47:59.274334 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 12 23:47:59.280401 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 12 23:47:59.282765 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 23:47:59.310363 systemd-udevd[566]: Using default interface naming scheme 'v255'.
Mar 12 23:47:59.314436 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 23:47:59.317607 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 12 23:47:59.347802 dracut-pre-trigger[578]: rd.md=0: removing MD RAID activation
Mar 12 23:47:59.369044 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 12 23:47:59.371364 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 23:47:59.455167 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 23:47:59.459198 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 12 23:47:59.498325 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Mar 12 23:47:59.501432 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Mar 12 23:47:59.510698 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 12 23:47:59.510738 kernel: GPT:17805311 != 104857599
Mar 12 23:47:59.510762 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 12 23:47:59.512653 kernel: GPT:17805311 != 104857599
Mar 12 23:47:59.514313 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 12 23:47:59.515311 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 23:47:59.526833 kernel: ACPI: bus type USB registered
Mar 12 23:47:59.526869 kernel: usbcore: registered new interface driver usbfs
Mar 12 23:47:59.526880 kernel: usbcore: registered new interface driver hub
Mar 12 23:47:59.526890 kernel: usbcore: registered new device driver usb
Mar 12 23:47:59.545322 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 12 23:47:59.545518 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Mar 12 23:47:59.545599 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 12 23:47:59.547869 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 23:47:59.552922 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 12 23:47:59.553078 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Mar 12 23:47:59.553155 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Mar 12 23:47:59.553236 kernel: hub 1-0:1.0: USB hub found
Mar 12 23:47:59.553364 kernel: hub 1-0:1.0: 4 ports detected
Mar 12 23:47:59.553439 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 12 23:47:59.553521 kernel: hub 2-0:1.0: USB hub found
Mar 12 23:47:59.553599 kernel: hub 2-0:1.0: 4 ports detected
Mar 12 23:47:59.548046 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:47:59.552947 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:47:59.555607 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 23:47:59.577386 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 23:47:59.613356 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 12 23:47:59.614732 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 12 23:47:59.623167 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 12 23:47:59.631736 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 12 23:47:59.638396 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 12 23:47:59.639495 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 12 23:47:59.641592 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 12 23:47:59.644341 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 12 23:47:59.646447 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 12 23:47:59.649053 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 12 23:47:59.650767 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 12 23:47:59.671042 disk-uuid[665]: Primary Header is updated.
Mar 12 23:47:59.671042 disk-uuid[665]: Secondary Entries is updated.
Mar 12 23:47:59.671042 disk-uuid[665]: Secondary Header is updated.
Mar 12 23:47:59.675360 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 12 23:47:59.679317 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 23:47:59.795344 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 12 23:47:59.924316 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Mar 12 23:47:59.926762 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 12 23:47:59.926946 kernel: usbcore: registered new interface driver usbhid
Mar 12 23:47:59.926957 kernel: usbhid: USB HID core driver
Mar 12 23:48:00.032339 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Mar 12 23:48:00.158328 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Mar 12 23:48:00.211361 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Mar 12 23:48:00.695315 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 23:48:00.696103 disk-uuid[670]: The operation has completed successfully.
Mar 12 23:48:00.755311 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 12 23:48:00.755418 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 12 23:48:00.781234 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 12 23:48:00.804247 sh[688]: Success
Mar 12 23:48:00.819088 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 12 23:48:00.819132 kernel: device-mapper: uevent: version 1.0.3
Mar 12 23:48:00.819143 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 12 23:48:00.826314 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Mar 12 23:48:00.907042 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 12 23:48:00.908804 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 12 23:48:00.920686 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 12 23:48:00.943371 kernel: BTRFS: device fsid fcbb17b2-5053-44fc-82f0-b24e4919d6d8 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (700)
Mar 12 23:48:00.946841 kernel: BTRFS info (device dm-0): first mount of filesystem fcbb17b2-5053-44fc-82f0-b24e4919d6d8
Mar 12 23:48:00.946912 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:48:00.970542 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 12 23:48:00.970623 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 12 23:48:00.974199 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 12 23:48:00.975529 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 12 23:48:00.976792 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 12 23:48:00.977570 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 12 23:48:00.978993 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 12 23:48:01.018367 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (729)
Mar 12 23:48:01.021346 kernel: BTRFS info (device vda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:48:01.021382 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:48:01.029595 kernel: BTRFS info (device vda6): turning on async discard
Mar 12 23:48:01.029631 kernel: BTRFS info (device vda6): enabling free space tree
Mar 12 23:48:01.034387 kernel: BTRFS info (device vda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:48:01.034481 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 12 23:48:01.036761 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 12 23:48:01.076152 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 12 23:48:01.080681 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 23:48:01.120066 systemd-networkd[872]: lo: Link UP
Mar 12 23:48:01.120079 systemd-networkd[872]: lo: Gained carrier
Mar 12 23:48:01.121054 systemd-networkd[872]: Enumeration completed
Mar 12 23:48:01.121162 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 23:48:01.121826 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:48:01.121830 systemd-networkd[872]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 23:48:01.122318 systemd[1]: Reached target network.target - Network.
Mar 12 23:48:01.123013 systemd-networkd[872]: eth0: Link UP
Mar 12 23:48:01.123112 systemd-networkd[872]: eth0: Gained carrier
Mar 12 23:48:01.123121 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 23:48:01.151374 systemd-networkd[872]: eth0: DHCPv4 address 10.0.8.7/25, gateway 10.0.8.1 acquired from 10.0.8.1
Mar 12 23:48:01.212429 ignition[810]: Ignition 2.22.0
Mar 12 23:48:01.213226 ignition[810]: Stage: fetch-offline
Mar 12 23:48:01.213269 ignition[810]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:48:01.213276 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 23:48:01.215257 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 12 23:48:01.213385 ignition[810]: parsed url from cmdline: ""
Mar 12 23:48:01.218541 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 12 23:48:01.213388 ignition[810]: no config URL provided
Mar 12 23:48:01.213393 ignition[810]: reading system config file "/usr/lib/ignition/user.ign"
Mar 12 23:48:01.213401 ignition[810]: no config at "/usr/lib/ignition/user.ign"
Mar 12 23:48:01.213405 ignition[810]: failed to fetch config: resource requires networking
Mar 12 23:48:01.213568 ignition[810]: Ignition finished successfully
Mar 12 23:48:01.249494 ignition[886]: Ignition 2.22.0
Mar 12 23:48:01.249513 ignition[886]: Stage: fetch
Mar 12 23:48:01.249640 ignition[886]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:48:01.249648 ignition[886]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 23:48:01.249722 ignition[886]: parsed url from cmdline: ""
Mar 12 23:48:01.249726 ignition[886]: no config URL provided
Mar 12 23:48:01.249730 ignition[886]: reading system config file "/usr/lib/ignition/user.ign"
Mar 12 23:48:01.249736 ignition[886]: no config at "/usr/lib/ignition/user.ign"
Mar 12 23:48:01.250150 ignition[886]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 12 23:48:01.250365 ignition[886]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 12 23:48:01.250547 ignition[886]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 12 23:48:02.112890 ignition[886]: GET result: OK
Mar 12 23:48:02.113140 ignition[886]: parsing config with SHA512: af0022feafe05c2cbbb635eaa4a10a4045b77775ae2832e01b91eda31adf288f90790c367a77578b49e1d770263a56cd1696d45107c428e869a4896d50cfb3fb
Mar 12 23:48:02.118621 unknown[886]: fetched base config from "system"
Mar 12 23:48:02.118638 unknown[886]: fetched base config from "system"
Mar 12 23:48:02.119024 ignition[886]: fetch: fetch complete
Mar 12 23:48:02.118643 unknown[886]: fetched user config from "openstack"
Mar 12 23:48:02.119029 ignition[886]: fetch: fetch passed
Mar 12 23:48:02.121073 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 12 23:48:02.119079 ignition[886]: Ignition finished successfully
Mar 12 23:48:02.123655 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 12 23:48:02.159541 ignition[894]: Ignition 2.22.0
Mar 12 23:48:02.159561 ignition[894]: Stage: kargs
Mar 12 23:48:02.159682 ignition[894]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:48:02.159691 ignition[894]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 23:48:02.160375 ignition[894]: kargs: kargs passed
Mar 12 23:48:02.160415 ignition[894]: Ignition finished successfully
Mar 12 23:48:02.164353 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 12 23:48:02.166846 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 12 23:48:02.195768 ignition[902]: Ignition 2.22.0
Mar 12 23:48:02.195788 ignition[902]: Stage: disks
Mar 12 23:48:02.195914 ignition[902]: no configs at "/usr/lib/ignition/base.d"
Mar 12 23:48:02.195923 ignition[902]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 23:48:02.196672 ignition[902]: disks: disks passed
Mar 12 23:48:02.199370 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 12 23:48:02.196716 ignition[902]: Ignition finished successfully
Mar 12 23:48:02.201511 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 12 23:48:02.203017 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 12 23:48:02.205039 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 12 23:48:02.206713 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 12 23:48:02.208616 systemd[1]: Reached target basic.target - Basic System.
Mar 12 23:48:02.211350 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 12 23:48:02.251313 systemd-fsck[912]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 12 23:48:02.255419 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 12 23:48:02.257778 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 12 23:48:02.403333 kernel: EXT4-fs (vda9): mounted filesystem 4b09db19-3beb-48c2-8dcb-3eec5602206c r/w with ordered data mode. Quota mode: none.
Mar 12 23:48:02.403725 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 12 23:48:02.405045 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 12 23:48:02.408639 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 12 23:48:02.410519 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 12 23:48:02.411434 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 12 23:48:02.411982 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 12 23:48:02.414666 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 12 23:48:02.414693 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 12 23:48:02.427115 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 12 23:48:02.429262 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 12 23:48:02.449331 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (920)
Mar 12 23:48:02.452953 kernel: BTRFS info (device vda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:48:02.452971 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:48:02.460395 kernel: BTRFS info (device vda6): turning on async discard
Mar 12 23:48:02.460456 kernel: BTRFS info (device vda6): enabling free space tree
Mar 12 23:48:02.461831 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 12 23:48:02.499339 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:02.504456 initrd-setup-root[948]: cut: /sysroot/etc/passwd: No such file or directory
Mar 12 23:48:02.512521 initrd-setup-root[955]: cut: /sysroot/etc/group: No such file or directory
Mar 12 23:48:02.517336 initrd-setup-root[962]: cut: /sysroot/etc/shadow: No such file or directory
Mar 12 23:48:02.521198 initrd-setup-root[969]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 12 23:48:02.614825 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 12 23:48:02.617167 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 12 23:48:02.618726 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 12 23:48:02.632019 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 12 23:48:02.634324 kernel: BTRFS info (device vda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:48:02.655507 ignition[1038]: INFO : Ignition 2.22.0
Mar 12 23:48:02.655507 ignition[1038]: INFO : Stage: mount
Mar 12 23:48:02.657254 ignition[1038]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 12 23:48:02.657254 ignition[1038]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 23:48:02.657254 ignition[1038]: INFO : mount: mount passed
Mar 12 23:48:02.657254 ignition[1038]: INFO : Ignition finished successfully
Mar 12 23:48:02.657542 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 12 23:48:02.664202 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 12 23:48:02.791566 systemd-networkd[872]: eth0: Gained IPv6LL
Mar 12 23:48:03.544320 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:05.552317 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:09.562319 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:09.569752 coreos-metadata[922]: Mar 12 23:48:09.569 WARN failed to locate config-drive, using the metadata service API instead
Mar 12 23:48:09.588631 coreos-metadata[922]: Mar 12 23:48:09.588 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 12 23:48:10.288702 coreos-metadata[922]: Mar 12 23:48:10.288 INFO Fetch successful
Mar 12 23:48:10.289781 coreos-metadata[922]: Mar 12 23:48:10.288 INFO wrote hostname ci-4459-2-4-n-9e79e0a9ae to /sysroot/etc/hostname
Mar 12 23:48:10.291723 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 12 23:48:10.291821 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 12 23:48:10.293956 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 12 23:48:10.312000 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 12 23:48:10.337313 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1057)
Mar 12 23:48:10.342593 kernel: BTRFS info (device vda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b
Mar 12 23:48:10.342675 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 12 23:48:10.350804 kernel: BTRFS info (device vda6): turning on async discard
Mar 12 23:48:10.350865 kernel: BTRFS info (device vda6): enabling free space tree
Mar 12 23:48:10.352527 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 12 23:48:10.389850 ignition[1075]: INFO : Ignition 2.22.0
Mar 12 23:48:10.389850 ignition[1075]: INFO : Stage: files
Mar 12 23:48:10.391510 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 12 23:48:10.391510 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 23:48:10.391510 ignition[1075]: DEBUG : files: compiled without relabeling support, skipping
Mar 12 23:48:10.394617 ignition[1075]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 12 23:48:10.394617 ignition[1075]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 12 23:48:10.398184 ignition[1075]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 12 23:48:10.399504 ignition[1075]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 12 23:48:10.399504 ignition[1075]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 12 23:48:10.398802 unknown[1075]: wrote ssh authorized keys file for user: core
Mar 12 23:48:10.405238 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Mar 12 23:48:10.407044 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt
#1 Mar 12 23:48:10.459095 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 12 23:48:10.578365 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 12 23:48:10.578365 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 23:48:10.581952 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:10.597721 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:10.597721 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:10.597721 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1 Mar 12 23:48:10.874482 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 12 23:48:11.420466 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw" Mar 12 23:48:11.420466 ignition[1075]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 12 23:48:11.424740 ignition[1075]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: createResultFile: createFiles: op(e): 
[finished] writing file "/sysroot/etc/.ignition-result.json" Mar 12 23:48:11.427418 ignition[1075]: INFO : files: files passed Mar 12 23:48:11.427418 ignition[1075]: INFO : Ignition finished successfully Mar 12 23:48:11.428142 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 12 23:48:11.431256 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 12 23:48:11.432874 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 12 23:48:11.448491 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 12 23:48:11.448582 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 12 23:48:11.453926 initrd-setup-root-after-ignition[1106]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:48:11.453926 initrd-setup-root-after-ignition[1106]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:48:11.456870 initrd-setup-root-after-ignition[1110]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:48:11.455796 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 23:48:11.459565 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 12 23:48:11.462508 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 12 23:48:11.506051 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 12 23:48:11.506179 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 12 23:48:11.508578 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 12 23:48:11.510220 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 12 23:48:11.511990 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. 
Mar 12 23:48:11.512804 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 12 23:48:11.550508 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 23:48:11.552867 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 12 23:48:11.576064 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:48:11.577270 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:48:11.579239 systemd[1]: Stopped target timers.target - Timer Units. Mar 12 23:48:11.580938 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 12 23:48:11.581050 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 23:48:11.583303 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 12 23:48:11.585256 systemd[1]: Stopped target basic.target - Basic System. Mar 12 23:48:11.586811 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 12 23:48:11.588343 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 23:48:11.590206 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 12 23:48:11.592054 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 12 23:48:11.593838 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 12 23:48:11.595471 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 23:48:11.597262 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 12 23:48:11.599119 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 12 23:48:11.600725 systemd[1]: Stopped target swap.target - Swaps. Mar 12 23:48:11.602064 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 12 23:48:11.602187 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Mar 12 23:48:11.604244 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:48:11.606111 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 23:48:11.607897 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 12 23:48:11.608782 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:48:11.609956 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 12 23:48:11.610071 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 12 23:48:11.612666 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 12 23:48:11.612793 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 23:48:11.614485 systemd[1]: ignition-files.service: Deactivated successfully. Mar 12 23:48:11.614579 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 12 23:48:11.616966 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 12 23:48:11.619418 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 12 23:48:11.620198 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 12 23:48:11.620352 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 23:48:11.622091 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 12 23:48:11.622184 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 23:48:11.626777 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 12 23:48:11.627440 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 12 23:48:11.639958 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Mar 12 23:48:11.643995 ignition[1130]: INFO : Ignition 2.22.0 Mar 12 23:48:11.643995 ignition[1130]: INFO : Stage: umount Mar 12 23:48:11.648118 ignition[1130]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:48:11.648118 ignition[1130]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:48:11.648118 ignition[1130]: INFO : umount: umount passed Mar 12 23:48:11.648118 ignition[1130]: INFO : Ignition finished successfully Mar 12 23:48:11.646976 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 12 23:48:11.648333 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 12 23:48:11.651561 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 12 23:48:11.651669 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 12 23:48:11.656312 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 12 23:48:11.656418 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 12 23:48:11.658081 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 12 23:48:11.658124 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 12 23:48:11.659973 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 12 23:48:11.660015 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 12 23:48:11.664268 systemd[1]: Stopped target network.target - Network. Mar 12 23:48:11.665772 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 12 23:48:11.665832 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 23:48:11.667526 systemd[1]: Stopped target paths.target - Path Units. Mar 12 23:48:11.669121 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 12 23:48:11.673377 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 23:48:11.676102 systemd[1]: Stopped target slices.target - Slice Units. 
Mar 12 23:48:11.677712 systemd[1]: Stopped target sockets.target - Socket Units. Mar 12 23:48:11.679517 systemd[1]: iscsid.socket: Deactivated successfully. Mar 12 23:48:11.679561 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 23:48:11.681000 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 12 23:48:11.681033 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 23:48:11.682672 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 12 23:48:11.682729 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 12 23:48:11.684678 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 12 23:48:11.684719 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 12 23:48:11.686244 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 12 23:48:11.686291 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 12 23:48:11.688008 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 12 23:48:11.689462 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 12 23:48:11.699340 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 12 23:48:11.699462 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 12 23:48:11.704148 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 12 23:48:11.704572 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 12 23:48:11.704627 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 23:48:11.707867 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 12 23:48:11.708043 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 12 23:48:11.708128 systemd[1]: Stopped systemd-networkd.service - Network Configuration. 
Mar 12 23:48:11.711034 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 12 23:48:11.711384 systemd[1]: Stopped target network-pre.target - Preparation for Network. Mar 12 23:48:11.713136 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 12 23:48:11.713190 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 12 23:48:11.715898 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 12 23:48:11.716786 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 12 23:48:11.716845 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 23:48:11.717974 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 12 23:48:11.718017 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 12 23:48:11.720449 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 12 23:48:11.720489 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 12 23:48:11.722281 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 23:48:11.724748 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 12 23:48:11.733606 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 12 23:48:11.733742 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 12 23:48:11.737311 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 12 23:48:11.737453 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 23:48:11.739718 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 12 23:48:11.739756 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 12 23:48:11.741434 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. 
Mar 12 23:48:11.741465 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 23:48:11.743192 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 12 23:48:11.743250 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 12 23:48:11.745810 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 12 23:48:11.745866 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 12 23:48:11.748356 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 12 23:48:11.748407 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 23:48:11.751852 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 12 23:48:11.752903 systemd[1]: systemd-network-generator.service: Deactivated successfully. Mar 12 23:48:11.752957 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Mar 12 23:48:11.755818 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 12 23:48:11.755860 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 23:48:11.758741 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 12 23:48:11.758781 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 12 23:48:11.761949 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 12 23:48:11.761987 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:48:11.764193 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 12 23:48:11.764235 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:48:11.768457 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Mar 12 23:48:11.768560 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 12 23:48:11.770724 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 12 23:48:11.772918 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 12 23:48:11.791264 systemd[1]: Switching root. Mar 12 23:48:11.829173 systemd-journald[310]: Journal stopped Mar 12 23:48:12.870218 systemd-journald[310]: Received SIGTERM from PID 1 (systemd). Mar 12 23:48:12.871365 kernel: SELinux: policy capability network_peer_controls=1 Mar 12 23:48:12.871392 kernel: SELinux: policy capability open_perms=1 Mar 12 23:48:12.871409 kernel: SELinux: policy capability extended_socket_class=1 Mar 12 23:48:12.871423 kernel: SELinux: policy capability always_check_network=0 Mar 12 23:48:12.871432 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 12 23:48:12.871441 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 12 23:48:12.871450 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 12 23:48:12.871462 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 12 23:48:12.871475 kernel: SELinux: policy capability userspace_initial_context=0 Mar 12 23:48:12.871485 systemd[1]: Successfully loaded SELinux policy in 78.664ms. Mar 12 23:48:12.871507 kernel: audit: type=1403 audit(1773359292.029:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 12 23:48:12.871522 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.524ms. Mar 12 23:48:12.871533 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 12 23:48:12.871544 systemd[1]: Detected virtualization kvm. Mar 12 23:48:12.871553 systemd[1]: Detected architecture arm64. 
Mar 12 23:48:12.871563 systemd[1]: Detected first boot. Mar 12 23:48:12.871573 systemd[1]: Hostname set to . Mar 12 23:48:12.871583 systemd[1]: Initializing machine ID from VM UUID. Mar 12 23:48:12.871592 zram_generator::config[1179]: No configuration found. Mar 12 23:48:12.871604 kernel: NET: Registered PF_VSOCK protocol family Mar 12 23:48:12.871614 systemd[1]: Populated /etc with preset unit settings. Mar 12 23:48:12.871624 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 12 23:48:12.871634 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 12 23:48:12.871646 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 12 23:48:12.871656 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 12 23:48:12.871665 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 12 23:48:12.871675 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 12 23:48:12.871688 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 12 23:48:12.871697 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 12 23:48:12.871707 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 12 23:48:12.871717 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 12 23:48:12.871730 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 12 23:48:12.871748 systemd[1]: Created slice user.slice - User and Session Slice. Mar 12 23:48:12.871762 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:48:12.871773 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 12 23:48:12.871783 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 12 23:48:12.871798 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 12 23:48:12.871808 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 12 23:48:12.871818 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 23:48:12.871828 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 12 23:48:12.871841 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 23:48:12.871851 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:48:12.871862 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 12 23:48:12.871872 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 12 23:48:12.871884 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 12 23:48:12.871897 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 12 23:48:12.871907 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:48:12.871918 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 23:48:12.871928 systemd[1]: Reached target slices.target - Slice Units. Mar 12 23:48:12.871938 systemd[1]: Reached target swap.target - Swaps. Mar 12 23:48:12.871948 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 12 23:48:12.871959 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 12 23:48:12.871969 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 12 23:48:12.871979 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Mar 12 23:48:12.871989 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 12 23:48:12.871999 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 23:48:12.872010 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 12 23:48:12.872019 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 12 23:48:12.872029 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 12 23:48:12.872039 systemd[1]: Mounting media.mount - External Media Directory... Mar 12 23:48:12.872050 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 12 23:48:12.872060 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 12 23:48:12.872069 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 12 23:48:12.872079 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 12 23:48:12.872089 systemd[1]: Reached target machines.target - Containers. Mar 12 23:48:12.872100 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 12 23:48:12.872110 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 12 23:48:12.872119 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 12 23:48:12.872131 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 12 23:48:12.872141 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 12 23:48:12.872150 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 12 23:48:12.872160 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Mar 12 23:48:12.872170 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 12 23:48:12.872180 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 12 23:48:12.872190 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 12 23:48:12.872201 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 12 23:48:12.872211 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 12 23:48:12.872222 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 12 23:48:12.872232 systemd[1]: Stopped systemd-fsck-usr.service. Mar 12 23:48:12.872254 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 12 23:48:12.872265 kernel: loop: module loaded Mar 12 23:48:12.872276 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 12 23:48:12.872286 kernel: fuse: init (API version 7.41) Mar 12 23:48:12.872305 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 12 23:48:12.872320 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 12 23:48:12.872330 kernel: ACPI: bus type drm_connector registered Mar 12 23:48:12.872341 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 12 23:48:12.872350 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 12 23:48:12.872388 systemd-journald[1247]: Collecting audit messages is disabled. Mar 12 23:48:12.872415 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 12 23:48:12.872425 systemd[1]: verity-setup.service: Deactivated successfully. 
Mar 12 23:48:12.872436 systemd[1]: Stopped verity-setup.service. Mar 12 23:48:12.872448 systemd-journald[1247]: Journal started Mar 12 23:48:12.872469 systemd-journald[1247]: Runtime Journal (/run/log/journal/fc66c3c669d143f09d1941ddea271ad6) is 8M, max 319.5M, 311.5M free. Mar 12 23:48:12.654249 systemd[1]: Queued start job for default target multi-user.target. Mar 12 23:48:12.667358 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 12 23:48:12.667754 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 12 23:48:12.877865 systemd[1]: Started systemd-journald.service - Journal Service. Mar 12 23:48:12.878559 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 12 23:48:12.879721 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 12 23:48:12.880990 systemd[1]: Mounted media.mount - External Media Directory. Mar 12 23:48:12.882065 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 12 23:48:12.883311 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 12 23:48:12.884489 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 12 23:48:12.887321 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 12 23:48:12.888706 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:48:12.890140 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 12 23:48:12.890333 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 12 23:48:12.891624 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 12 23:48:12.891780 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 12 23:48:12.893067 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 12 23:48:12.893223 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Mar 12 23:48:12.894527 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 12 23:48:12.894705 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 12 23:48:12.896076 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 12 23:48:12.896240 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 12 23:48:12.897642 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 12 23:48:12.897803 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 12 23:48:12.899128 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 12 23:48:12.900540 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 12 23:48:12.902094 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 12 23:48:12.903598 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 12 23:48:12.915617 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 12 23:48:12.917860 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 12 23:48:12.919966 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 12 23:48:12.921201 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 12 23:48:12.921230 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 12 23:48:12.923067 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 12 23:48:12.936457 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 12 23:48:12.937550 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 12 23:48:12.939598 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 12 23:48:12.944452 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 12 23:48:12.945732 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 12 23:48:12.947258 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 12 23:48:12.948415 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 12 23:48:12.950142 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 12 23:48:12.953712 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 12 23:48:12.958458 systemd-journald[1247]: Time spent on flushing to /var/log/journal/fc66c3c669d143f09d1941ddea271ad6 is 25.543ms for 1721 entries. Mar 12 23:48:12.958458 systemd-journald[1247]: System Journal (/var/log/journal/fc66c3c669d143f09d1941ddea271ad6) is 8M, max 584.8M, 576.8M free. Mar 12 23:48:13.006807 systemd-journald[1247]: Received client request to flush runtime journal. Mar 12 23:48:13.006863 kernel: loop0: detected capacity change from 0 to 100632 Mar 12 23:48:12.958480 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 12 23:48:12.963310 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 23:48:12.965141 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 12 23:48:12.968101 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 12 23:48:12.971329 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 12 23:48:12.975827 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
Mar 12 23:48:12.978262 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 12 23:48:12.991092 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Mar 12 23:48:12.991103 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Mar 12 23:48:12.993476 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 12 23:48:12.995057 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 12 23:48:12.999866 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 12 23:48:13.010958 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 12 23:48:13.020347 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 12 23:48:13.029376 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 12 23:48:13.039343 kernel: loop1: detected capacity change from 0 to 119840 Mar 12 23:48:13.051254 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 12 23:48:13.055573 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 12 23:48:13.084411 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Mar 12 23:48:13.084717 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Mar 12 23:48:13.085331 kernel: loop2: detected capacity change from 0 to 1632 Mar 12 23:48:13.088962 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 12 23:48:13.134341 kernel: loop3: detected capacity change from 0 to 200864 Mar 12 23:48:13.185335 kernel: loop4: detected capacity change from 0 to 100632 Mar 12 23:48:13.213343 kernel: loop5: detected capacity change from 0 to 119840 Mar 12 23:48:13.234345 kernel: loop6: detected capacity change from 0 to 1632 Mar 12 23:48:13.242335 kernel: loop7: detected capacity change from 0 to 200864 Mar 12 23:48:13.269053 (sd-merge)[1329]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Mar 12 23:48:13.269498 (sd-merge)[1329]: Merged extensions into '/usr'. Mar 12 23:48:13.273086 systemd[1]: Reload requested from client PID 1298 ('systemd-sysext') (unit systemd-sysext.service)... Mar 12 23:48:13.273105 systemd[1]: Reloading... Mar 12 23:48:13.331328 zram_generator::config[1354]: No configuration found. Mar 12 23:48:13.503041 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 12 23:48:13.503948 systemd[1]: Reloading finished in 230 ms. Mar 12 23:48:13.531412 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 12 23:48:13.533208 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 12 23:48:13.544356 systemd[1]: Starting ensure-sysext.service... Mar 12 23:48:13.546159 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 12 23:48:13.548681 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 23:48:13.555562 systemd[1]: Reload requested from client PID 1392 ('systemctl') (unit ensure-sysext.service)... Mar 12 23:48:13.555574 systemd[1]: Reloading... Mar 12 23:48:13.561028 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Mar 12 23:48:13.562916 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
Mar 12 23:48:13.563255 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 12 23:48:13.563592 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 12 23:48:13.564268 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 12 23:48:13.564583 systemd-tmpfiles[1393]: ACLs are not supported, ignoring. Mar 12 23:48:13.564723 systemd-tmpfiles[1393]: ACLs are not supported, ignoring. Mar 12 23:48:13.567779 systemd-tmpfiles[1393]: Detected autofs mount point /boot during canonicalization of boot. Mar 12 23:48:13.567882 systemd-tmpfiles[1393]: Skipping /boot Mar 12 23:48:13.573451 systemd-tmpfiles[1393]: Detected autofs mount point /boot during canonicalization of boot. Mar 12 23:48:13.573528 systemd-tmpfiles[1393]: Skipping /boot Mar 12 23:48:13.581755 systemd-udevd[1394]: Using default interface naming scheme 'v255'. Mar 12 23:48:13.613326 zram_generator::config[1427]: No configuration found. Mar 12 23:48:13.661802 ldconfig[1293]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 12 23:48:13.774324 kernel: mousedev: PS/2 mouse device common for all mice Mar 12 23:48:13.780797 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 12 23:48:13.782758 systemd[1]: Reloading finished in 226 ms. Mar 12 23:48:13.795282 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 23:48:13.797275 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 12 23:48:13.803498 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 23:48:13.825726 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Mar 12 23:48:13.830360 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Mar 12 23:48:13.832020 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 12 23:48:13.832087 kernel: [drm] features: -context_init Mar 12 23:48:13.832865 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 12 23:48:13.854007 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 12 23:48:13.855643 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 12 23:48:13.862610 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 12 23:48:13.873365 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 12 23:48:13.883780 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 12 23:48:13.885186 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 12 23:48:13.888335 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 12 23:48:13.893989 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 12 23:48:13.900344 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 12 23:48:13.904822 kernel: [drm] number of scanouts: 1 Mar 12 23:48:13.904958 kernel: [drm] number of cap sets: 0 Mar 12 23:48:13.905501 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 12 23:48:13.910324 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Mar 12 23:48:13.910718 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Mar 12 23:48:13.915186 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 12 23:48:13.915324 kernel: Console: switching to colour frame buffer device 160x50 Mar 12 23:48:13.916311 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 12 23:48:13.934224 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 12 23:48:13.935442 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 12 23:48:13.937091 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 12 23:48:13.937253 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 12 23:48:13.939671 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 12 23:48:13.939853 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 12 23:48:13.941534 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 12 23:48:13.951385 augenrules[1552]: No rules Mar 12 23:48:13.953329 systemd[1]: audit-rules.service: Deactivated successfully. Mar 12 23:48:13.953534 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 12 23:48:13.960997 systemd[1]: Finished ensure-sysext.service. Mar 12 23:48:13.962469 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 12 23:48:13.964286 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 12 23:48:13.974653 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 12 23:48:13.975762 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 12 23:48:13.977830 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 12 23:48:13.990957 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Mar 12 23:48:13.993095 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 12 23:48:13.995118 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Mar 12 23:48:13.996374 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 12 23:48:13.996423 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 12 23:48:13.996480 systemd[1]: Reached target time-set.target - System Time Set. Mar 12 23:48:13.998506 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 12 23:48:14.000925 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 12 23:48:14.003241 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:48:14.005043 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 12 23:48:14.005230 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 12 23:48:14.006795 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 12 23:48:14.006960 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 12 23:48:14.010445 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 12 23:48:14.010564 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 12 23:48:14.010039 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 12 23:48:14.010185 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 12 23:48:14.011898 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 12 23:48:14.012056 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 12 23:48:14.013655 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Mar 12 23:48:14.017404 kernel: PTP clock support registered Mar 12 23:48:14.020378 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 12 23:48:14.020480 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 12 23:48:14.021864 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Mar 12 23:48:14.022091 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Mar 12 23:48:14.048707 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 12 23:48:14.063206 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 12 23:48:14.068730 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 12 23:48:14.111365 systemd-resolved[1541]: Positive Trust Anchors: Mar 12 23:48:14.111379 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:48:14.111379 systemd-resolved[1541]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 12 23:48:14.111412 systemd-resolved[1541]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 12 23:48:14.116682 systemd-resolved[1541]: Using system hostname 'ci-4459-2-4-n-9e79e0a9ae'. Mar 12 23:48:14.117740 systemd-networkd[1540]: lo: Link UP Mar 12 23:48:14.117748 systemd-networkd[1540]: lo: Gained carrier Mar 12 23:48:14.118586 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 12 23:48:14.118759 systemd-networkd[1540]: Enumeration completed Mar 12 23:48:14.119205 systemd-networkd[1540]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:48:14.119213 systemd-networkd[1540]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 12 23:48:14.120034 systemd-networkd[1540]: eth0: Link UP Mar 12 23:48:14.120150 systemd-networkd[1540]: eth0: Gained carrier Mar 12 23:48:14.120163 systemd-networkd[1540]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:48:14.120196 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 12 23:48:14.121572 systemd[1]: Reached target network.target - Network. Mar 12 23:48:14.122426 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:48:14.123520 systemd[1]: Reached target sysinit.target - System Initialization. 
Mar 12 23:48:14.124612 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 12 23:48:14.125750 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 12 23:48:14.127048 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 12 23:48:14.128159 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 12 23:48:14.129395 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 12 23:48:14.130488 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 12 23:48:14.130523 systemd[1]: Reached target paths.target - Path Units. Mar 12 23:48:14.131325 systemd[1]: Reached target timers.target - Timer Units. Mar 12 23:48:14.133352 systemd-networkd[1540]: eth0: DHCPv4 address 10.0.8.7/25, gateway 10.0.8.1 acquired from 10.0.8.1 Mar 12 23:48:14.133955 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 12 23:48:14.136268 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 12 23:48:14.138943 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 12 23:48:14.140339 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 12 23:48:14.141467 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 12 23:48:14.145336 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 12 23:48:14.146584 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 12 23:48:14.148892 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Mar 12 23:48:14.150937 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 12 23:48:14.152532 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 12 23:48:14.153745 systemd[1]: Reached target sockets.target - Socket Units. Mar 12 23:48:14.154612 systemd[1]: Reached target basic.target - Basic System. Mar 12 23:48:14.155556 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 12 23:48:14.155583 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 12 23:48:14.158356 systemd[1]: Starting chronyd.service - NTP client/server... Mar 12 23:48:14.161409 systemd[1]: Starting containerd.service - containerd container runtime... Mar 12 23:48:14.163964 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 12 23:48:14.166022 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 12 23:48:14.170467 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 12 23:48:14.171326 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:48:14.173054 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 12 23:48:14.175165 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 12 23:48:14.176230 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 12 23:48:14.180546 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 12 23:48:14.182472 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 12 23:48:14.184255 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 12 23:48:14.186505 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Mar 12 23:48:14.191517 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 12 23:48:14.193290 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 12 23:48:14.193709 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 12 23:48:14.196433 jq[1600]: false Mar 12 23:48:14.197008 systemd[1]: Starting update-engine.service - Update Engine... Mar 12 23:48:14.198962 extend-filesystems[1601]: Found /dev/vda6 Mar 12 23:48:14.199513 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 12 23:48:14.205202 extend-filesystems[1601]: Found /dev/vda9 Mar 12 23:48:14.205202 extend-filesystems[1601]: Checking size of /dev/vda9 Mar 12 23:48:14.203341 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 12 23:48:14.205557 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 12 23:48:14.207255 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 12 23:48:14.207442 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 12 23:48:14.209680 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 12 23:48:14.209881 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 12 23:48:14.212814 systemd[1]: motdgen.service: Deactivated successfully. Mar 12 23:48:14.215918 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Mar 12 23:48:14.222064 jq[1614]: true Mar 12 23:48:14.232028 chronyd[1593]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Mar 12 23:48:14.232970 extend-filesystems[1601]: Resized partition /dev/vda9 Mar 12 23:48:14.234156 chronyd[1593]: Loaded seccomp filter (level 2) Mar 12 23:48:14.234247 systemd[1]: Started chronyd.service - NTP client/server. Mar 12 23:48:14.237711 (ntainerd)[1628]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 12 23:48:14.238771 tar[1624]: linux-arm64/LICENSE Mar 12 23:48:14.239120 tar[1624]: linux-arm64/helm Mar 12 23:48:14.242088 extend-filesystems[1641]: resize2fs 1.47.3 (8-Jul-2025) Mar 12 23:48:14.253116 jq[1635]: true Mar 12 23:48:14.253593 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Mar 12 23:48:14.257656 update_engine[1612]: I20260312 23:48:14.257387 1612 main.cc:92] Flatcar Update Engine starting Mar 12 23:48:14.285209 systemd-logind[1610]: New seat seat0. Mar 12 23:48:14.287161 systemd-logind[1610]: Watching system buttons on /dev/input/event0 (Power Button) Mar 12 23:48:14.287183 systemd-logind[1610]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Mar 12 23:48:14.287437 systemd[1]: Started systemd-logind.service - User Login Management. Mar 12 23:48:14.296944 dbus-daemon[1596]: [system] SELinux support is enabled Mar 12 23:48:14.297988 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 12 23:48:14.300961 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 12 23:48:14.301004 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 12 23:48:14.303440 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 12 23:48:14.303464 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 12 23:48:14.305963 update_engine[1612]: I20260312 23:48:14.304497 1612 update_check_scheduler.cc:74] Next update check in 2m12s Mar 12 23:48:14.305371 systemd[1]: Started update-engine.service - Update Engine. Mar 12 23:48:14.306158 dbus-daemon[1596]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 12 23:48:14.307750 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 12 23:48:14.356472 locksmithd[1656]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 12 23:48:14.438102 containerd[1628]: time="2026-03-12T23:48:14Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 12 23:48:14.439877 containerd[1628]: time="2026-03-12T23:48:14.439822640Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 12 23:48:14.450569 containerd[1628]: time="2026-03-12T23:48:14.450526360Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.92µs" Mar 12 23:48:14.450569 containerd[1628]: time="2026-03-12T23:48:14.450562960Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 12 23:48:14.457410 containerd[1628]: time="2026-03-12T23:48:14.450581440Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 12 23:48:14.459183 containerd[1628]: time="2026-03-12T23:48:14.459148160Z" level=info msg="loading plugin" 
id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 12 23:48:14.459183 containerd[1628]: time="2026-03-12T23:48:14.459183800Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 12 23:48:14.459271 containerd[1628]: time="2026-03-12T23:48:14.459210600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 12 23:48:14.459312 containerd[1628]: time="2026-03-12T23:48:14.459268680Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 12 23:48:14.459312 containerd[1628]: time="2026-03-12T23:48:14.459282000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 12 23:48:14.459565 containerd[1628]: time="2026-03-12T23:48:14.459529400Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 12 23:48:14.459565 containerd[1628]: time="2026-03-12T23:48:14.459551840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 12 23:48:14.459565 containerd[1628]: time="2026-03-12T23:48:14.459563760Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 12 23:48:14.459796 containerd[1628]: time="2026-03-12T23:48:14.459571960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 12 23:48:14.463974 containerd[1628]: time="2026-03-12T23:48:14.463924000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 
12 23:48:14.465221 containerd[1628]: time="2026-03-12T23:48:14.465173400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 12 23:48:14.465221 containerd[1628]: time="2026-03-12T23:48:14.465223120Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 12 23:48:14.465383 containerd[1628]: time="2026-03-12T23:48:14.465236080Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 12 23:48:14.465383 containerd[1628]: time="2026-03-12T23:48:14.465273080Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 12 23:48:14.465652 containerd[1628]: time="2026-03-12T23:48:14.465625480Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 12 23:48:14.465715 containerd[1628]: time="2026-03-12T23:48:14.465694480Z" level=info msg="metadata content store policy set" policy=shared Mar 12 23:48:14.527880 bash[1660]: Updated "/home/core/.ssh/authorized_keys" Mar 12 23:48:14.531340 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 12 23:48:14.534311 systemd[1]: Starting sshkeys.service... Mar 12 23:48:14.559728 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 12 23:48:14.562197 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Mar 12 23:48:14.573421 containerd[1628]: time="2026-03-12T23:48:14.573365120Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 12 23:48:14.573534 containerd[1628]: time="2026-03-12T23:48:14.573512880Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 12 23:48:14.573561 containerd[1628]: time="2026-03-12T23:48:14.573535640Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 12 23:48:14.573561 containerd[1628]: time="2026-03-12T23:48:14.573552120Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 12 23:48:14.573610 containerd[1628]: time="2026-03-12T23:48:14.573565320Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 12 23:48:14.573610 containerd[1628]: time="2026-03-12T23:48:14.573585400Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 12 23:48:14.573610 containerd[1628]: time="2026-03-12T23:48:14.573598400Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 12 23:48:14.573659 containerd[1628]: time="2026-03-12T23:48:14.573610840Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 12 23:48:14.573659 containerd[1628]: time="2026-03-12T23:48:14.573624000Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 12 23:48:14.573659 containerd[1628]: time="2026-03-12T23:48:14.573634240Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 12 23:48:14.573659 containerd[1628]: time="2026-03-12T23:48:14.573643760Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 12 23:48:14.573726 containerd[1628]: time="2026-03-12T23:48:14.573663760Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573832840Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573859480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573875120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573885600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573908360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573920200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573935800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573946400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573958360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 12 23:48:14.573977 containerd[1628]: time="2026-03-12T23:48:14.573983360Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 12 23:48:14.574171 containerd[1628]: time="2026-03-12T23:48:14.573997280Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 12 23:48:14.574343 containerd[1628]: time="2026-03-12T23:48:14.574219120Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 12 23:48:14.574343 containerd[1628]: time="2026-03-12T23:48:14.574241120Z" level=info msg="Start snapshots syncer"
Mar 12 23:48:14.574343 containerd[1628]: time="2026-03-12T23:48:14.574265520Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 12 23:48:14.574807 containerd[1628]: time="2026-03-12T23:48:14.574645920Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 12 23:48:14.574807 containerd[1628]: time="2026-03-12T23:48:14.574702280Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.574767560Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575064760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575090280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575103040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575125120Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575138760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575149480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575160560Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575201560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575215200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575226160Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575257080Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575280360Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 12 23:48:14.575855 containerd[1628]: time="2026-03-12T23:48:14.575290000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575359800Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575369520Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575389240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575403120Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575509520Z" level=info msg="runtime interface created"
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575515760Z" level=info msg="created NRI interface"
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575523960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575535680Z" level=info msg="Connect containerd service"
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.575556040Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 12 23:48:14.577257 containerd[1628]: time="2026-03-12T23:48:14.576330080Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 12 23:48:14.577442 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:14.666472 containerd[1628]: time="2026-03-12T23:48:14.666336720Z" level=info msg="Start subscribing containerd event"
Mar 12 23:48:14.666472 containerd[1628]: time="2026-03-12T23:48:14.666437400Z" level=info msg="Start recovering state"
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666524640Z" level=info msg="Start event monitor"
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666537040Z" level=info msg="Start cni network conf syncer for default"
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666543400Z" level=info msg="Start streaming server"
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666553600Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666560720Z" level=info msg="runtime interface starting up..."
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666565880Z" level=info msg="starting plugins..."
Mar 12 23:48:14.666601 containerd[1628]: time="2026-03-12T23:48:14.666578400Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 12 23:48:14.667099 containerd[1628]: time="2026-03-12T23:48:14.667076200Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 12 23:48:14.667164 containerd[1628]: time="2026-03-12T23:48:14.667141280Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 12 23:48:14.668086 containerd[1628]: time="2026-03-12T23:48:14.668059680Z" level=info msg="containerd successfully booted in 0.230395s"
Mar 12 23:48:14.668159 systemd[1]: Started containerd.service - containerd container runtime.
Mar 12 23:48:14.798020 tar[1624]: linux-arm64/README.md
Mar 12 23:48:14.805387 kernel: EXT4-fs (vda9): resized filesystem to 12499963
Mar 12 23:48:14.830324 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 12 23:48:14.835979 extend-filesystems[1641]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 12 23:48:14.835979 extend-filesystems[1641]: old_desc_blocks = 1, new_desc_blocks = 6
Mar 12 23:48:14.835979 extend-filesystems[1641]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long.
Mar 12 23:48:14.840524 extend-filesystems[1601]: Resized filesystem in /dev/vda9
Mar 12 23:48:14.837586 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 12 23:48:14.837801 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 12 23:48:15.182524 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:15.207396 systemd-networkd[1540]: eth0: Gained IPv6LL
Mar 12 23:48:15.209980 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 12 23:48:15.211950 systemd[1]: Reached target network-online.target - Network is Online.
Mar 12 23:48:15.214471 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:48:15.217630 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 12 23:48:15.258506 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 12 23:48:15.485911 sshd_keygen[1626]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 12 23:48:15.505736 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 12 23:48:15.508828 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 12 23:48:15.526692 systemd[1]: issuegen.service: Deactivated successfully.
Mar 12 23:48:15.526920 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 12 23:48:15.530619 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 12 23:48:15.547110 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 12 23:48:15.550892 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 12 23:48:15.553519 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Mar 12 23:48:15.555149 systemd[1]: Reached target getty.target - Login Prompts.
Mar 12 23:48:15.587337 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:16.265353 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:48:16.269920 (kubelet)[1732]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:48:16.926227 kubelet[1732]: E0312 23:48:16.926127 1732 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:48:16.928662 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:48:16.928804 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:48:16.929147 systemd[1]: kubelet.service: Consumed 797ms CPU time, 248.9M memory peak.
Mar 12 23:48:17.193320 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:17.598335 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:21.201321 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:21.210831 coreos-metadata[1595]: Mar 12 23:48:21.210 WARN failed to locate config-drive, using the metadata service API instead
Mar 12 23:48:21.226893 coreos-metadata[1595]: Mar 12 23:48:21.226 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Mar 12 23:48:21.610317 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 12 23:48:21.616336 coreos-metadata[1675]: Mar 12 23:48:21.616 WARN failed to locate config-drive, using the metadata service API instead
Mar 12 23:48:21.628971 coreos-metadata[1675]: Mar 12 23:48:21.628 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Mar 12 23:48:22.797514 coreos-metadata[1675]: Mar 12 23:48:22.797 INFO Fetch successful
Mar 12 23:48:22.797514 coreos-metadata[1675]: Mar 12 23:48:22.797 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Mar 12 23:48:23.487485 coreos-metadata[1675]: Mar 12 23:48:23.487 INFO Fetch successful
Mar 12 23:48:23.489999 unknown[1675]: wrote ssh authorized keys file for user: core
Mar 12 23:48:23.522326 update-ssh-keys[1751]: Updated "/home/core/.ssh/authorized_keys"
Mar 12 23:48:23.523284 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 12 23:48:23.526332 systemd[1]: Finished sshkeys.service.
Mar 12 23:48:25.133991 coreos-metadata[1595]: Mar 12 23:48:25.133 INFO Fetch successful
Mar 12 23:48:25.133991 coreos-metadata[1595]: Mar 12 23:48:25.133 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 12 23:48:25.825314 coreos-metadata[1595]: Mar 12 23:48:25.825 INFO Fetch successful
Mar 12 23:48:25.825314 coreos-metadata[1595]: Mar 12 23:48:25.825 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Mar 12 23:48:26.513154 coreos-metadata[1595]: Mar 12 23:48:26.513 INFO Fetch successful
Mar 12 23:48:26.513154 coreos-metadata[1595]: Mar 12 23:48:26.513 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Mar 12 23:48:27.074966 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 12 23:48:27.076421 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:48:27.196229 coreos-metadata[1595]: Mar 12 23:48:27.196 INFO Fetch successful
Mar 12 23:48:27.196229 coreos-metadata[1595]: Mar 12 23:48:27.196 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Mar 12 23:48:27.206687 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:48:27.210394 (kubelet)[1763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:48:27.251063 kubelet[1763]: E0312 23:48:27.251013 1763 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:48:27.253874 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:48:27.254002 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:48:27.255383 systemd[1]: kubelet.service: Consumed 139ms CPU time, 108.1M memory peak.
Mar 12 23:48:27.884871 coreos-metadata[1595]: Mar 12 23:48:27.884 INFO Fetch successful
Mar 12 23:48:27.884871 coreos-metadata[1595]: Mar 12 23:48:27.884 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Mar 12 23:48:29.769262 coreos-metadata[1595]: Mar 12 23:48:29.769 INFO Fetch successful
Mar 12 23:48:29.803153 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 12 23:48:29.803609 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 12 23:48:29.803733 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 12 23:48:29.803857 systemd[1]: Startup finished in 3.025s (kernel) + 13.396s (initrd) + 17.853s (userspace) = 34.275s.
Mar 12 23:48:37.324755 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 12 23:48:37.326038 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:48:37.444778 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:48:37.448192 (kubelet)[1785]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:48:37.477613 kubelet[1785]: E0312 23:48:37.477573 1785 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:48:37.479910 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:48:37.480035 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:48:37.480553 systemd[1]: kubelet.service: Consumed 132ms CPU time, 107.4M memory peak.
Mar 12 23:48:38.018685 chronyd[1593]: Selected source PHC0
Mar 12 23:48:42.805241 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 12 23:48:42.806545 systemd[1]: Started sshd@0-10.0.8.7:22-20.161.92.111:59532.service - OpenSSH per-connection server daemon (20.161.92.111:59532).
Mar 12 23:48:43.345209 sshd[1795]: Accepted publickey for core from 20.161.92.111 port 59532 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:43.350165 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:43.356126 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 12 23:48:43.356953 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 12 23:48:43.362102 systemd-logind[1610]: New session 1 of user core.
Mar 12 23:48:43.379229 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 12 23:48:43.381428 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 12 23:48:43.401410 (systemd)[1800]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 12 23:48:43.403523 systemd-logind[1610]: New session c1 of user core.
Mar 12 23:48:43.534245 systemd[1800]: Queued start job for default target default.target.
Mar 12 23:48:43.544542 systemd[1800]: Created slice app.slice - User Application Slice.
Mar 12 23:48:43.544572 systemd[1800]: Reached target paths.target - Paths.
Mar 12 23:48:43.544608 systemd[1800]: Reached target timers.target - Timers.
Mar 12 23:48:43.545733 systemd[1800]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 12 23:48:43.555058 systemd[1800]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 12 23:48:43.555120 systemd[1800]: Reached target sockets.target - Sockets.
Mar 12 23:48:43.555155 systemd[1800]: Reached target basic.target - Basic System.
Mar 12 23:48:43.555181 systemd[1800]: Reached target default.target - Main User Target.
Mar 12 23:48:43.555207 systemd[1800]: Startup finished in 146ms.
Mar 12 23:48:43.555482 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 12 23:48:43.564454 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 12 23:48:43.863578 systemd[1]: Started sshd@1-10.0.8.7:22-20.161.92.111:59548.service - OpenSSH per-connection server daemon (20.161.92.111:59548).
Mar 12 23:48:44.373345 sshd[1811]: Accepted publickey for core from 20.161.92.111 port 59548 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:44.374491 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:44.378686 systemd-logind[1610]: New session 2 of user core.
Mar 12 23:48:44.383616 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 12 23:48:44.661718 sshd[1814]: Connection closed by 20.161.92.111 port 59548
Mar 12 23:48:44.662285 sshd-session[1811]: pam_unix(sshd:session): session closed for user core
Mar 12 23:48:44.665779 systemd[1]: sshd@1-10.0.8.7:22-20.161.92.111:59548.service: Deactivated successfully.
Mar 12 23:48:44.667192 systemd[1]: session-2.scope: Deactivated successfully.
Mar 12 23:48:44.667926 systemd-logind[1610]: Session 2 logged out. Waiting for processes to exit.
Mar 12 23:48:44.669022 systemd-logind[1610]: Removed session 2.
Mar 12 23:48:44.778626 systemd[1]: Started sshd@2-10.0.8.7:22-20.161.92.111:59552.service - OpenSSH per-connection server daemon (20.161.92.111:59552).
Mar 12 23:48:45.283288 sshd[1820]: Accepted publickey for core from 20.161.92.111 port 59552 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:45.284718 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:45.288937 systemd-logind[1610]: New session 3 of user core.
Mar 12 23:48:45.302652 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 12 23:48:45.564637 sshd[1823]: Connection closed by 20.161.92.111 port 59552
Mar 12 23:48:45.565072 sshd-session[1820]: pam_unix(sshd:session): session closed for user core
Mar 12 23:48:45.568482 systemd[1]: sshd@2-10.0.8.7:22-20.161.92.111:59552.service: Deactivated successfully.
Mar 12 23:48:45.570686 systemd[1]: session-3.scope: Deactivated successfully.
Mar 12 23:48:45.571434 systemd-logind[1610]: Session 3 logged out. Waiting for processes to exit.
Mar 12 23:48:45.574015 systemd-logind[1610]: Removed session 3.
Mar 12 23:48:45.673523 systemd[1]: Started sshd@3-10.0.8.7:22-20.161.92.111:59554.service - OpenSSH per-connection server daemon (20.161.92.111:59554).
Mar 12 23:48:46.201498 sshd[1829]: Accepted publickey for core from 20.161.92.111 port 59554 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:46.202722 sshd-session[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:46.207549 systemd-logind[1610]: New session 4 of user core.
Mar 12 23:48:46.216599 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 12 23:48:46.489225 sshd[1832]: Connection closed by 20.161.92.111 port 59554
Mar 12 23:48:46.489954 sshd-session[1829]: pam_unix(sshd:session): session closed for user core
Mar 12 23:48:46.493029 systemd[1]: sshd@3-10.0.8.7:22-20.161.92.111:59554.service: Deactivated successfully.
Mar 12 23:48:46.494560 systemd[1]: session-4.scope: Deactivated successfully.
Mar 12 23:48:46.496974 systemd-logind[1610]: Session 4 logged out. Waiting for processes to exit.
Mar 12 23:48:46.498059 systemd-logind[1610]: Removed session 4.
Mar 12 23:48:46.597406 systemd[1]: Started sshd@4-10.0.8.7:22-20.161.92.111:59568.service - OpenSSH per-connection server daemon (20.161.92.111:59568).
Mar 12 23:48:47.130339 sshd[1838]: Accepted publickey for core from 20.161.92.111 port 59568 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:47.130804 sshd-session[1838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:47.134336 systemd-logind[1610]: New session 5 of user core.
Mar 12 23:48:47.142424 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 12 23:48:47.348819 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 12 23:48:47.349080 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:48:47.363426 sudo[1842]: pam_unix(sudo:session): session closed for user root
Mar 12 23:48:47.459448 sshd[1841]: Connection closed by 20.161.92.111 port 59568
Mar 12 23:48:47.459336 sshd-session[1838]: pam_unix(sshd:session): session closed for user core
Mar 12 23:48:47.463108 systemd[1]: sshd@4-10.0.8.7:22-20.161.92.111:59568.service: Deactivated successfully.
Mar 12 23:48:47.464558 systemd[1]: session-5.scope: Deactivated successfully.
Mar 12 23:48:47.465194 systemd-logind[1610]: Session 5 logged out. Waiting for processes to exit.
Mar 12 23:48:47.466532 systemd-logind[1610]: Removed session 5.
Mar 12 23:48:47.566821 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 12 23:48:47.568280 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:48:47.569164 systemd[1]: Started sshd@5-10.0.8.7:22-20.161.92.111:59570.service - OpenSSH per-connection server daemon (20.161.92.111:59570).
Mar 12 23:48:47.706890 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:48:47.710432 (kubelet)[1859]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:48:47.741384 kubelet[1859]: E0312 23:48:47.741330 1859 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:48:47.743664 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:48:47.743816 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:48:47.744109 systemd[1]: kubelet.service: Consumed 132ms CPU time, 107.5M memory peak.
Mar 12 23:48:48.092329 sshd[1849]: Accepted publickey for core from 20.161.92.111 port 59570 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:48.093622 sshd-session[1849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:48.098025 systemd-logind[1610]: New session 6 of user core.
Mar 12 23:48:48.105494 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 12 23:48:48.287183 sudo[1869]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 12 23:48:48.287468 sudo[1869]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:48:48.292420 sudo[1869]: pam_unix(sudo:session): session closed for user root
Mar 12 23:48:48.296878 sudo[1868]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 12 23:48:48.297125 sudo[1868]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:48:48.305643 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 12 23:48:48.348642 augenrules[1891]: No rules
Mar 12 23:48:48.349673 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 12 23:48:48.349867 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 12 23:48:48.351818 sudo[1868]: pam_unix(sudo:session): session closed for user root
Mar 12 23:48:48.447367 sshd[1867]: Connection closed by 20.161.92.111 port 59570
Mar 12 23:48:48.447245 sshd-session[1849]: pam_unix(sshd:session): session closed for user core
Mar 12 23:48:48.451570 systemd[1]: sshd@5-10.0.8.7:22-20.161.92.111:59570.service: Deactivated successfully.
Mar 12 23:48:48.452983 systemd[1]: session-6.scope: Deactivated successfully.
Mar 12 23:48:48.453706 systemd-logind[1610]: Session 6 logged out. Waiting for processes to exit.
Mar 12 23:48:48.454783 systemd-logind[1610]: Removed session 6.
Mar 12 23:48:48.551580 systemd[1]: Started sshd@6-10.0.8.7:22-20.161.92.111:59580.service - OpenSSH per-connection server daemon (20.161.92.111:59580).
Mar 12 23:48:49.079043 sshd[1900]: Accepted publickey for core from 20.161.92.111 port 59580 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:48:49.080695 sshd-session[1900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:48:49.084395 systemd-logind[1610]: New session 7 of user core.
Mar 12 23:48:49.092431 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 12 23:48:49.273374 sudo[1904]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 12 23:48:49.273629 sudo[1904]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 23:48:49.661778 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 12 23:48:49.678937 (dockerd)[1924]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 12 23:48:49.926963 dockerd[1924]: time="2026-03-12T23:48:49.926841833Z" level=info msg="Starting up"
Mar 12 23:48:49.927829 dockerd[1924]: time="2026-03-12T23:48:49.927805111Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 12 23:48:49.937464 dockerd[1924]: time="2026-03-12T23:48:49.937431375Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 12 23:48:49.956016 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1869382613-merged.mount: Deactivated successfully.
Mar 12 23:48:49.984822 systemd[1]: var-lib-docker-metacopy\x2dcheck1429009364-merged.mount: Deactivated successfully.
Mar 12 23:48:49.995998 dockerd[1924]: time="2026-03-12T23:48:49.995939674Z" level=info msg="Loading containers: start."
Mar 12 23:48:50.008345 kernel: Initializing XFRM netlink socket
Mar 12 23:48:50.250177 systemd-networkd[1540]: docker0: Link UP
Mar 12 23:48:50.257193 dockerd[1924]: time="2026-03-12T23:48:50.257146384Z" level=info msg="Loading containers: done."
Mar 12 23:48:50.272749 dockerd[1924]: time="2026-03-12T23:48:50.272694077Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 12 23:48:50.272885 dockerd[1924]: time="2026-03-12T23:48:50.272786477Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 12 23:48:50.272885 dockerd[1924]: time="2026-03-12T23:48:50.272877237Z" level=info msg="Initializing buildkit"
Mar 12 23:48:50.300544 dockerd[1924]: time="2026-03-12T23:48:50.300506509Z" level=info msg="Completed buildkit initialization"
Mar 12 23:48:50.305100 dockerd[1924]: time="2026-03-12T23:48:50.305074141Z" level=info msg="Daemon has completed initialization"
Mar 12 23:48:50.305571 dockerd[1924]: time="2026-03-12T23:48:50.305162941Z" level=info msg="API listen on /run/docker.sock"
Mar 12 23:48:50.305363 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 12 23:48:50.947104 containerd[1628]: time="2026-03-12T23:48:50.947064836Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\""
Mar 12 23:48:51.547868 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1717293659.mount: Deactivated successfully.
Mar 12 23:48:52.459926 containerd[1628]: time="2026-03-12T23:48:52.459863750Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:52.461339 containerd[1628]: time="2026-03-12T23:48:52.461304347Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=24583350"
Mar 12 23:48:52.463090 containerd[1628]: time="2026-03-12T23:48:52.463050944Z" level=info msg="ImageCreate event name:\"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:52.466286 containerd[1628]: time="2026-03-12T23:48:52.466230859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:52.467395 containerd[1628]: time="2026-03-12T23:48:52.467345937Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"24579851\" in 1.520239701s"
Mar 12 23:48:52.467395 containerd[1628]: time="2026-03-12T23:48:52.467384297Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:3299c3f36446e899e7d38f97cdbd93a12ace0457ebca8f6d94ab33d86f9740bd\""
Mar 12 23:48:52.468089 containerd[1628]: time="2026-03-12T23:48:52.468057615Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 12 23:48:53.548783 containerd[1628]: time="2026-03-12T23:48:53.548389755Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:53.549870 containerd[1628]: time="2026-03-12T23:48:53.549844792Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=19139661"
Mar 12 23:48:53.551618 containerd[1628]: time="2026-03-12T23:48:53.551575669Z" level=info msg="ImageCreate event name:\"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:53.555373 containerd[1628]: time="2026-03-12T23:48:53.555347303Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:53.556380 containerd[1628]: time="2026-03-12T23:48:53.556356541Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"20724045\" in 1.088270246s"
Mar 12 23:48:53.556505 containerd[1628]: time="2026-03-12T23:48:53.556466901Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:be20fbe989d9e759458cc8dbbc6e6c4a17e5d6f9db86b2a6cf4e3dfba0fe86e5\""
Mar 12 23:48:53.556951 containerd[1628]: time="2026-03-12T23:48:53.556924540Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 12 23:48:54.488057 containerd[1628]: time="2026-03-12T23:48:54.487971065Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:54.490334 containerd[1628]: time="2026-03-12T23:48:54.490158061Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=14195564"
Mar 12 23:48:54.492492 containerd[1628]: time="2026-03-12T23:48:54.492428777Z" level=info msg="ImageCreate event name:\"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:54.495928 containerd[1628]: time="2026-03-12T23:48:54.495878611Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:54.496818 containerd[1628]: time="2026-03-12T23:48:54.496789970Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"15779966\" in 939.83367ms"
Mar 12 23:48:54.496861 containerd[1628]: time="2026-03-12T23:48:54.496824530Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:4addcfb720a81f20ddfad093c4a397bb9f3d99b798f610f0ecc83cafd7f0a3bd\""
Mar 12 23:48:54.497410 containerd[1628]: time="2026-03-12T23:48:54.497387689Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 12 23:48:55.432785 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount894917535.mount: Deactivated successfully.
Mar 12 23:48:55.604689 containerd[1628]: time="2026-03-12T23:48:55.604623885Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:55.606915 containerd[1628]: time="2026-03-12T23:48:55.606885361Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=22697114"
Mar 12 23:48:55.608664 containerd[1628]: time="2026-03-12T23:48:55.608640838Z" level=info msg="ImageCreate event name:\"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:55.611360 containerd[1628]: time="2026-03-12T23:48:55.611333474Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:55.611912 containerd[1628]: time="2026-03-12T23:48:55.611864673Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"22696107\" in 1.114448424s"
Mar 12 23:48:55.611912 containerd[1628]: time="2026-03-12T23:48:55.611901073Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:8167398c8957d56adceac5bd6436d6ac238c546a5f5c92e450a1c380c0aa7d5d\""
Mar 12 23:48:55.612343 containerd[1628]: time="2026-03-12T23:48:55.612320952Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 12 23:48:56.244783 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1845528643.mount: Deactivated successfully.
Mar 12 23:48:56.994175 containerd[1628]: time="2026-03-12T23:48:56.993323242Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:56.995987 containerd[1628]: time="2026-03-12T23:48:56.995955918Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498"
Mar 12 23:48:56.997567 containerd[1628]: time="2026-03-12T23:48:56.997517155Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:57.001561 containerd[1628]: time="2026-03-12T23:48:57.001535868Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:57.003420 containerd[1628]: time="2026-03-12T23:48:57.003367345Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.391018153s"
Mar 12 23:48:57.003420 containerd[1628]: time="2026-03-12T23:48:57.003411225Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\""
Mar 12 23:48:57.004154 containerd[1628]: time="2026-03-12T23:48:57.004114704Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 12 23:48:57.560013 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2569120702.mount: Deactivated successfully.
Mar 12 23:48:57.570029 containerd[1628]: time="2026-03-12T23:48:57.569977818Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:57.571431 containerd[1628]: time="2026-03-12T23:48:57.571403455Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729"
Mar 12 23:48:57.572872 containerd[1628]: time="2026-03-12T23:48:57.572819133Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:57.575397 containerd[1628]: time="2026-03-12T23:48:57.575354808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:57.576066 containerd[1628]: time="2026-03-12T23:48:57.576027767Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 571.882343ms"
Mar 12 23:48:57.576066 containerd[1628]: time="2026-03-12T23:48:57.576056847Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Mar 12 23:48:57.576515 containerd[1628]: time="2026-03-12T23:48:57.576492966Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 12 23:48:57.824538 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 12 23:48:57.825832 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:48:57.972216 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:48:57.975377 (kubelet)[2279]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:48:58.006491 kubelet[2279]: E0312 23:48:58.006439 2279 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:48:58.008754 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:48:58.008886 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:48:58.009191 systemd[1]: kubelet.service: Consumed 133ms CPU time, 106.8M memory peak.
Mar 12 23:48:58.215653 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3711147419.mount: Deactivated successfully.
Mar 12 23:48:58.867438 containerd[1628]: time="2026-03-12T23:48:58.867241023Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:58.868916 containerd[1628]: time="2026-03-12T23:48:58.868884940Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21125601"
Mar 12 23:48:58.870862 containerd[1628]: time="2026-03-12T23:48:58.870826217Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:58.875152 containerd[1628]: time="2026-03-12T23:48:58.875092049Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:48:58.876557 containerd[1628]: time="2026-03-12T23:48:58.876523287Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 1.299999361s"
Mar 12 23:48:58.876592 containerd[1628]: time="2026-03-12T23:48:58.876559447Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
Mar 12 23:48:59.537987 update_engine[1612]: I20260312 23:48:59.537895 1612 update_attempter.cc:509] Updating boot flags...
Mar 12 23:49:04.200321 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:04.200453 systemd[1]: kubelet.service: Consumed 133ms CPU time, 106.8M memory peak.
Mar 12 23:49:04.202225 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:04.227712 systemd[1]: Reload requested from client PID 2396 ('systemctl') (unit session-7.scope)...
Mar 12 23:49:04.227740 systemd[1]: Reloading...
Mar 12 23:49:04.300329 zram_generator::config[2439]: No configuration found.
Mar 12 23:49:04.466365 systemd[1]: Reloading finished in 238 ms.
Mar 12 23:49:04.524833 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 12 23:49:04.524917 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 12 23:49:04.525410 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:04.525461 systemd[1]: kubelet.service: Consumed 88ms CPU time, 95M memory peak.
Mar 12 23:49:04.526998 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:04.645339 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:04.649637 (kubelet)[2487]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 23:49:04.683447 kubelet[2487]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 23:49:04.683447 kubelet[2487]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:49:04.684600 kubelet[2487]: I0312 23:49:04.684534 2487 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 23:49:05.444726 kubelet[2487]: I0312 23:49:05.444008 2487 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 12 23:49:05.444726 kubelet[2487]: I0312 23:49:05.444041 2487 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 23:49:05.447037 kubelet[2487]: I0312 23:49:05.447002 2487 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 12 23:49:05.447087 kubelet[2487]: I0312 23:49:05.447042 2487 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 23:49:05.447536 kubelet[2487]: I0312 23:49:05.447495 2487 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 23:49:05.455760 kubelet[2487]: E0312 23:49:05.455694 2487 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.8.7:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 23:49:05.456795 kubelet[2487]: I0312 23:49:05.456761 2487 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 23:49:05.460102 kubelet[2487]: I0312 23:49:05.460086 2487 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 23:49:05.462798 kubelet[2487]: I0312 23:49:05.462774 2487 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 12 23:49:05.463096 kubelet[2487]: I0312 23:49:05.463069 2487 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 23:49:05.463317 kubelet[2487]: I0312 23:49:05.463153 2487 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-9e79e0a9ae","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 23:49:05.463457 kubelet[2487]: I0312 23:49:05.463443 2487 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 23:49:05.463508 kubelet[2487]: I0312 23:49:05.463500 2487 container_manager_linux.go:306] "Creating device plugin manager"
Mar 12 23:49:05.463655 kubelet[2487]: I0312 23:49:05.463640 2487 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 12 23:49:05.466995 kubelet[2487]: I0312 23:49:05.466971 2487 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:49:05.469827 kubelet[2487]: I0312 23:49:05.469800 2487 kubelet.go:475] "Attempting to sync node with API server"
Mar 12 23:49:05.469921 kubelet[2487]: I0312 23:49:05.469910 2487 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 23:49:05.470507 kubelet[2487]: E0312 23:49:05.470457 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.8.7:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-9e79e0a9ae&limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 23:49:05.472120 kubelet[2487]: I0312 23:49:05.471092 2487 kubelet.go:387] "Adding apiserver pod source"
Mar 12 23:49:05.472120 kubelet[2487]: I0312 23:49:05.471125 2487 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 23:49:05.472120 kubelet[2487]: E0312 23:49:05.471586 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.8.7:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 23:49:05.472471 kubelet[2487]: I0312 23:49:05.472453 2487 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 12 23:49:05.473196 kubelet[2487]: I0312 23:49:05.473172 2487 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 23:49:05.473289 kubelet[2487]: I0312 23:49:05.473280 2487 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 12 23:49:05.473421 kubelet[2487]: W0312 23:49:05.473408 2487 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 12 23:49:05.476286 kubelet[2487]: I0312 23:49:05.476264 2487 server.go:1262] "Started kubelet"
Mar 12 23:49:05.478371 kubelet[2487]: I0312 23:49:05.478314 2487 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 23:49:05.478429 kubelet[2487]: I0312 23:49:05.478383 2487 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 12 23:49:05.478716 kubelet[2487]: I0312 23:49:05.478689 2487 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 23:49:05.478865 kubelet[2487]: I0312 23:49:05.478840 2487 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 23:49:05.479675 kubelet[2487]: I0312 23:49:05.479650 2487 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 23:49:05.482229 kubelet[2487]: I0312 23:49:05.481670 2487 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 12 23:49:05.482229 kubelet[2487]: E0312 23:49:05.481786 2487 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found"
Mar 12 23:49:05.482229 kubelet[2487]: I0312 23:49:05.482068 2487 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 12 23:49:05.482229 kubelet[2487]: I0312 23:49:05.482139 2487 reconciler.go:29] "Reconciler: start to sync state"
Mar 12 23:49:05.483051 kubelet[2487]: I0312 23:49:05.482588 2487 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 23:49:05.483688 kubelet[2487]: I0312 23:49:05.483642 2487 server.go:310] "Adding debug handlers to kubelet server"
Mar 12 23:49:05.488290 kubelet[2487]: E0312 23:49:05.487744 2487 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.7:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-9e79e0a9ae?timeout=10s\": dial tcp 10.0.8.7:6443: connect: connection refused" interval="200ms"
Mar 12 23:49:05.488290 kubelet[2487]: E0312 23:49:05.487879 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.8.7:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 23:49:05.488776 kubelet[2487]: I0312 23:49:05.488752 2487 factory.go:223] Registration of the systemd container factory successfully
Mar 12 23:49:05.488925 kubelet[2487]: I0312 23:49:05.488851 2487 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 23:49:05.489733 kubelet[2487]: E0312 23:49:05.487970 2487 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.8.7:6443/api/v1/namespaces/default/events\": dial tcp 10.0.8.7:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-9e79e0a9ae.189c3cedd63ad8c7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-9e79e0a9ae,UID:ci-4459-2-4-n-9e79e0a9ae,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-9e79e0a9ae,},FirstTimestamp:2026-03-12 23:49:05.476229319 +0000 UTC m=+0.823697302,LastTimestamp:2026-03-12 23:49:05.476229319 +0000 UTC m=+0.823697302,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-9e79e0a9ae,}"
Mar 12 23:49:05.490117 kubelet[2487]: E0312 23:49:05.490096 2487 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 23:49:05.490213 kubelet[2487]: I0312 23:49:05.490185 2487 factory.go:223] Registration of the containerd container factory successfully
Mar 12 23:49:05.498085 kubelet[2487]: I0312 23:49:05.498051 2487 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 23:49:05.498085 kubelet[2487]: I0312 23:49:05.498071 2487 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 23:49:05.498085 kubelet[2487]: I0312 23:49:05.498089 2487 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:49:05.500168 kubelet[2487]: I0312 23:49:05.500137 2487 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 12 23:49:05.501174 kubelet[2487]: I0312 23:49:05.501143 2487 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 12 23:49:05.501174 kubelet[2487]: I0312 23:49:05.501165 2487 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 12 23:49:05.501262 kubelet[2487]: I0312 23:49:05.501187 2487 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 12 23:49:05.501262 kubelet[2487]: E0312 23:49:05.501223 2487 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 23:49:05.502439 kubelet[2487]: I0312 23:49:05.502412 2487 policy_none.go:49] "None policy: Start"
Mar 12 23:49:05.502439 kubelet[2487]: I0312 23:49:05.502436 2487 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 12 23:49:05.502498 kubelet[2487]: I0312 23:49:05.502447 2487 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 12 23:49:05.504676 kubelet[2487]: I0312 23:49:05.504630 2487 policy_none.go:47] "Start"
Mar 12 23:49:05.506189 kubelet[2487]: E0312 23:49:05.506147 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.8.7:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 23:49:05.509709 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 12 23:49:05.524681 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 12 23:49:05.528159 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 12 23:49:05.541821 kubelet[2487]: E0312 23:49:05.541339 2487 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 23:49:05.541821 kubelet[2487]: I0312 23:49:05.541542 2487 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 23:49:05.541821 kubelet[2487]: I0312 23:49:05.541554 2487 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 23:49:05.541821 kubelet[2487]: I0312 23:49:05.541754 2487 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 23:49:05.543023 kubelet[2487]: E0312 23:49:05.542994 2487 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 12 23:49:05.543076 kubelet[2487]: E0312 23:49:05.543040 2487 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-9e79e0a9ae\" not found"
Mar 12 23:49:05.613491 systemd[1]: Created slice kubepods-burstable-pod4f4fae850342de969b026666a272e751.slice - libcontainer container kubepods-burstable-pod4f4fae850342de969b026666a272e751.slice.
Mar 12 23:49:05.627179 kubelet[2487]: E0312 23:49:05.627119 2487 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.630111 systemd[1]: Created slice kubepods-burstable-pod85e36aedfabe9a02ea0843015d24cd54.slice - libcontainer container kubepods-burstable-pod85e36aedfabe9a02ea0843015d24cd54.slice.
Mar 12 23:49:05.632155 kubelet[2487]: E0312 23:49:05.632125 2487 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.642021 systemd[1]: Created slice kubepods-burstable-pod7871e1fa007a80c16e20b18b7e032c7b.slice - libcontainer container kubepods-burstable-pod7871e1fa007a80c16e20b18b7e032c7b.slice.
Mar 12 23:49:05.644175 kubelet[2487]: E0312 23:49:05.644128 2487 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.644405 kubelet[2487]: I0312 23:49:05.644388 2487 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.644961 kubelet[2487]: E0312 23:49:05.644931 2487 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.7:6443/api/v1/nodes\": dial tcp 10.0.8.7:6443: connect: connection refused" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.689350 kubelet[2487]: E0312 23:49:05.688952 2487 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.7:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-9e79e0a9ae?timeout=10s\": dial tcp 10.0.8.7:6443: connect: connection refused" interval="400ms"
Mar 12 23:49:05.785416 kubelet[2487]: I0312 23:49:05.785181 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.785416 kubelet[2487]: I0312 23:49:05.785395 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7871e1fa007a80c16e20b18b7e032c7b-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"7871e1fa007a80c16e20b18b7e032c7b\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.785659 kubelet[2487]: I0312 23:49:05.785471 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f4fae850342de969b026666a272e751-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"4f4fae850342de969b026666a272e751\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.785659 kubelet[2487]: I0312 23:49:05.785545 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.785659 kubelet[2487]: I0312 23:49:05.785586 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f4fae850342de969b026666a272e751-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"4f4fae850342de969b026666a272e751\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.785659 kubelet[2487]: I0312 23:49:05.785600 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f4fae850342de969b026666a272e751-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"4f4fae850342de969b026666a272e751\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.785659 kubelet[2487]: I0312 23:49:05.785625 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.786382 kubelet[2487]: I0312 23:49:05.785639 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.786382 kubelet[2487]: I0312 23:49:05.785655 2487 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.847325 kubelet[2487]: I0312 23:49:05.847109 2487 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.847509 kubelet[2487]: E0312 23:49:05.847479 2487 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.7:6443/api/v1/nodes\": dial tcp 10.0.8.7:6443: connect: connection refused" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:05.933321 containerd[1628]: time="2026-03-12T23:49:05.933262092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-9e79e0a9ae,Uid:4f4fae850342de969b026666a272e751,Namespace:kube-system,Attempt:0,}"
Mar 12 23:49:05.935773 containerd[1628]: time="2026-03-12T23:49:05.935522368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae,Uid:85e36aedfabe9a02ea0843015d24cd54,Namespace:kube-system,Attempt:0,}"
Mar 12 23:49:05.948773 containerd[1628]: time="2026-03-12T23:49:05.948729705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-9e79e0a9ae,Uid:7871e1fa007a80c16e20b18b7e032c7b,Namespace:kube-system,Attempt:0,}"
Mar 12 23:49:06.090321 kubelet[2487]: E0312 23:49:06.090212 2487 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.7:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-9e79e0a9ae?timeout=10s\": dial tcp 10.0.8.7:6443: connect: connection refused" interval="800ms"
Mar 12 23:49:06.249335 kubelet[2487]: I0312 23:49:06.249275 2487 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:06.249649 kubelet[2487]: E0312 23:49:06.249613 2487 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.7:6443/api/v1/nodes\": dial tcp 10.0.8.7:6443: connect: connection refused" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:06.506434 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3153319215.mount: Deactivated successfully.
Mar 12 23:49:06.516813 containerd[1628]: time="2026-03-12T23:49:06.516756087Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:49:06.521406 containerd[1628]: time="2026-03-12T23:49:06.521339559Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Mar 12 23:49:06.524195 containerd[1628]: time="2026-03-12T23:49:06.524140794Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:49:06.524662 kubelet[2487]: E0312 23:49:06.524623 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.8.7:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-9e79e0a9ae&limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 23:49:06.528351 containerd[1628]: time="2026-03-12T23:49:06.528315547Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:49:06.530034 containerd[1628]: time="2026-03-12T23:49:06.529996784Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 12 23:49:06.531744 containerd[1628]: time="2026-03-12T23:49:06.531683581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:49:06.532649 containerd[1628]: time="2026-03-12T23:49:06.532175660Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 595.743334ms"
Mar 12 23:49:06.533542 containerd[1628]: time="2026-03-12T23:49:06.533487098Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:49:06.538211 containerd[1628]: time="2026-03-12T23:49:06.537162651Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 12 23:49:06.542580 containerd[1628]: time="2026-03-12T23:49:06.542545482Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 604.173919ms"
Mar 12 23:49:06.549452 containerd[1628]: time="2026-03-12T23:49:06.549416790Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 598.806968ms"
Mar 12 23:49:06.553405 kubelet[2487]: E0312 23:49:06.553366 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.8.7:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 23:49:06.570197 containerd[1628]: time="2026-03-12T23:49:06.570114075Z" level=info msg="connecting to shim c38cdbade925f134fda08186d411871e359fd6350052f1aeef382d74b30b60b0" address="unix:///run/containerd/s/2b16c03d39595cf2bee2005dd234a251c4020d0da7afe422a4cf746f6101f967" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:06.581492 containerd[1628]: time="2026-03-12T23:49:06.581435015Z" level=info msg="connecting to shim ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989" address="unix:///run/containerd/s/427792e458f9adbf24c8a4cecd1ad88828d167e3c416c5dc12d755628b0df4cc" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:06.589280 containerd[1628]: time="2026-03-12T23:49:06.589237242Z" level=info msg="connecting to shim aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699" address="unix:///run/containerd/s/6bab9b77ae75a443330889a4b31f7aad2789f26b15d3547b213b089b2d980b66" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:06.603513 systemd[1]: Started cri-containerd-ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989.scope - libcontainer container ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989.
Mar 12 23:49:06.605037 systemd[1]: Started cri-containerd-c38cdbade925f134fda08186d411871e359fd6350052f1aeef382d74b30b60b0.scope - libcontainer container c38cdbade925f134fda08186d411871e359fd6350052f1aeef382d74b30b60b0.
Mar 12 23:49:06.617325 systemd[1]: Started cri-containerd-aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699.scope - libcontainer container aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699.
Mar 12 23:49:06.652619 containerd[1628]: time="2026-03-12T23:49:06.652509373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae,Uid:85e36aedfabe9a02ea0843015d24cd54,Namespace:kube-system,Attempt:0,} returns sandbox id \"ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989\""
Mar 12 23:49:06.656221 containerd[1628]: time="2026-03-12T23:49:06.656181126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-9e79e0a9ae,Uid:4f4fae850342de969b026666a272e751,Namespace:kube-system,Attempt:0,} returns sandbox id \"c38cdbade925f134fda08186d411871e359fd6350052f1aeef382d74b30b60b0\""
Mar 12 23:49:06.661761 containerd[1628]: time="2026-03-12T23:49:06.661724037Z" level=info msg="CreateContainer within sandbox \"ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 12 23:49:06.662893 containerd[1628]: time="2026-03-12T23:49:06.662859875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-9e79e0a9ae,Uid:7871e1fa007a80c16e20b18b7e032c7b,Namespace:kube-system,Attempt:0,} returns sandbox id \"aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699\""
Mar 12 23:49:06.664420 containerd[1628]: time="2026-03-12T23:49:06.663995593Z" level=info msg="CreateContainer within sandbox \"c38cdbade925f134fda08186d411871e359fd6350052f1aeef382d74b30b60b0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 12 23:49:06.669373 containerd[1628]: time="2026-03-12T23:49:06.669332664Z" level=info msg="CreateContainer within sandbox \"aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 12 23:49:06.681854 containerd[1628]: time="2026-03-12T23:49:06.681797642Z" level=info msg="Container c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:06.687566 containerd[1628]: time="2026-03-12T23:49:06.687485593Z" level=info msg="Container 089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:06.691505 kubelet[2487]: E0312 23:49:06.691448 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.8.7:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 23:49:06.692908 containerd[1628]: time="2026-03-12T23:49:06.692818663Z" level=info msg="Container ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:06.723691 containerd[1628]: time="2026-03-12T23:49:06.723629610Z" level=info msg="CreateContainer within sandbox \"c38cdbade925f134fda08186d411871e359fd6350052f1aeef382d74b30b60b0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024\""
Mar 12 23:49:06.724305 containerd[1628]: time="2026-03-12T23:49:06.724263449Z" level=info msg="StartContainer for \"089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024\""
Mar 12 23:49:06.725365 containerd[1628]: time="2026-03-12T23:49:06.725339007Z" level=info msg="connecting to shim 089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024" address="unix:///run/containerd/s/2b16c03d39595cf2bee2005dd234a251c4020d0da7afe422a4cf746f6101f967" protocol=ttrpc version=3
Mar 12 23:49:06.736047 containerd[1628]: time="2026-03-12T23:49:06.736000589Z" level=info msg="CreateContainer within sandbox \"ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7\""
Mar 12 23:49:06.737236 containerd[1628]: time="2026-03-12T23:49:06.737179187Z" level=info msg="CreateContainer within sandbox \"aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6\""
Mar 12 23:49:06.738358 containerd[1628]: time="2026-03-12T23:49:06.737551306Z" level=info msg="StartContainer for \"c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7\""
Mar 12 23:49:06.738561 containerd[1628]: time="2026-03-12T23:49:06.738537025Z" level=info msg="StartContainer for \"ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6\""
Mar 12 23:49:06.738760 containerd[1628]: time="2026-03-12T23:49:06.738727824Z" level=info msg="connecting to shim c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7" address="unix:///run/containerd/s/427792e458f9adbf24c8a4cecd1ad88828d167e3c416c5dc12d755628b0df4cc" protocol=ttrpc version=3
Mar 12 23:49:06.740470 containerd[1628]: time="2026-03-12T23:49:06.740427661Z" level=info msg="connecting to shim ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6" address="unix:///run/containerd/s/6bab9b77ae75a443330889a4b31f7aad2789f26b15d3547b213b089b2d980b66" protocol=ttrpc version=3
Mar 12 23:49:06.751486 systemd[1]: Started cri-containerd-089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024.scope - libcontainer container 089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024.
Mar 12 23:49:06.768333 kubelet[2487]: E0312 23:49:06.768224 2487 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.8.7:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.8.7:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 23:49:06.769690 systemd[1]: Started cri-containerd-c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7.scope - libcontainer container c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7.
Mar 12 23:49:06.772421 systemd[1]: Started cri-containerd-ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6.scope - libcontainer container ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6.
Mar 12 23:49:06.810779 containerd[1628]: time="2026-03-12T23:49:06.810667740Z" level=info msg="StartContainer for \"089448d727d47efb66617f9e70c13fb1ccb903b5b5d9b6859d402f2d259b4024\" returns successfully"
Mar 12 23:49:06.820442 containerd[1628]: time="2026-03-12T23:49:06.820399364Z" level=info msg="StartContainer for \"ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6\" returns successfully"
Mar 12 23:49:06.821876 containerd[1628]: time="2026-03-12T23:49:06.821846601Z" level=info msg="StartContainer for \"c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7\" returns successfully"
Mar 12 23:49:06.891234 kubelet[2487]: E0312 23:49:06.891190 2487 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.7:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-9e79e0a9ae?timeout=10s\": dial tcp 10.0.8.7:6443: connect: connection refused" interval="1.6s"
Mar 12 23:49:07.052379 kubelet[2487]: I0312 23:49:07.051698 2487 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:07.515377 kubelet[2487]: E0312 23:49:07.515006 2487 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:07.517665 kubelet[2487]: E0312 23:49:07.517641 2487 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:07.521078 kubelet[2487]: E0312 23:49:07.521050 2487 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.085112 kubelet[2487]: I0312 23:49:08.085060 2487 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.085112 kubelet[2487]: E0312 23:49:08.085105 2487 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-4-n-9e79e0a9ae\": node \"ci-4459-2-4-n-9e79e0a9ae\" not found"
Mar 12 23:49:08.101551 kubelet[2487]: E0312 23:49:08.101515 2487 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found"
Mar 12 23:49:08.202107 kubelet[2487]: E0312 23:49:08.202070 2487 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found"
Mar 12 23:49:08.302773 kubelet[2487]: E0312 23:49:08.302740 2487 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-9e79e0a9ae\" not found"
Mar 12 23:49:08.382653 kubelet[2487]: I0312 23:49:08.382461 2487 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.389373 kubelet[2487]: E0312 23:49:08.389341 2487 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.389373 kubelet[2487]: I0312 23:49:08.389371 2487 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.390957 kubelet[2487]: E0312 23:49:08.390931 2487 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.390957 kubelet[2487]: I0312 23:49:08.390953 2487 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.392439 kubelet[2487]: E0312 23:49:08.392409 2487 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-9e79e0a9ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.473392 kubelet[2487]: I0312 23:49:08.473320 2487 apiserver.go:52] "Watching apiserver"
Mar 12 23:49:08.483134 kubelet[2487]: I0312 23:49:08.483104 2487 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 12 23:49:08.522463 kubelet[2487]: I0312 23:49:08.521725 2487 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.522463 kubelet[2487]: I0312 23:49:08.521843 2487 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.523838 kubelet[2487]: E0312 23:49:08.523813 2487 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-9e79e0a9ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:08.524050 kubelet[2487]: E0312 23:49:08.523880 2487 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:09.524435 kubelet[2487]: I0312 23:49:09.524333 2487 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.115797 systemd[1]: Reload requested from client PID 2775 ('systemctl') (unit session-7.scope)...
Mar 12 23:49:10.115813 systemd[1]: Reloading...
Mar 12 23:49:10.186392 zram_generator::config[2824]: No configuration found.
Mar 12 23:49:10.355128 systemd[1]: Reloading finished in 239 ms.
Mar 12 23:49:10.385704 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:10.413400 systemd[1]: kubelet.service: Deactivated successfully.
Mar 12 23:49:10.413630 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:10.413692 systemd[1]: kubelet.service: Consumed 1.164s CPU time, 124.5M memory peak.
Mar 12 23:49:10.415341 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:49:10.573658 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:49:10.577873 (kubelet)[2863]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 23:49:10.618324 kubelet[2863]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 23:49:10.618324 kubelet[2863]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:49:10.618619 kubelet[2863]: I0312 23:49:10.618375 2863 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 23:49:10.626961 kubelet[2863]: I0312 23:49:10.626900 2863 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 12 23:49:10.626961 kubelet[2863]: I0312 23:49:10.626934 2863 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 23:49:10.626961 kubelet[2863]: I0312 23:49:10.626963 2863 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 12 23:49:10.626961 kubelet[2863]: I0312 23:49:10.626969 2863 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 23:49:10.627235 kubelet[2863]: I0312 23:49:10.627216 2863 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 23:49:10.628487 kubelet[2863]: I0312 23:49:10.628464 2863 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 12 23:49:10.630564 kubelet[2863]: I0312 23:49:10.630537 2863 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 23:49:10.633712 kubelet[2863]: I0312 23:49:10.633690 2863 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 23:49:10.636497 kubelet[2863]: I0312 23:49:10.636376 2863 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 12 23:49:10.636697 kubelet[2863]: I0312 23:49:10.636657 2863 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 23:49:10.636957 kubelet[2863]: I0312 23:49:10.636691 2863 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-9e79e0a9ae","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 23:49:10.637033 kubelet[2863]: I0312 23:49:10.636964 2863 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 23:49:10.637033 kubelet[2863]: I0312 23:49:10.636980 2863 container_manager_linux.go:306] "Creating device plugin manager"
Mar 12 23:49:10.637033 kubelet[2863]: I0312 23:49:10.637009 2863 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 12 23:49:10.637236 kubelet[2863]: I0312 23:49:10.637222 2863 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:49:10.637420 kubelet[2863]: I0312 23:49:10.637403 2863 kubelet.go:475] "Attempting to sync node with API server"
Mar 12 23:49:10.637831 kubelet[2863]: I0312 23:49:10.637460 2863 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 23:49:10.637831 kubelet[2863]: I0312 23:49:10.637484 2863 kubelet.go:387] "Adding apiserver pod source"
Mar 12 23:49:10.637831 kubelet[2863]: I0312 23:49:10.637499 2863 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 23:49:10.639263 kubelet[2863]: I0312 23:49:10.639192 2863 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 12 23:49:10.639879 kubelet[2863]: I0312 23:49:10.639862 2863 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 23:49:10.639910 kubelet[2863]: I0312 23:49:10.639895 2863 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 12 23:49:10.647625 kubelet[2863]: I0312 23:49:10.647581 2863 server.go:1262] "Started kubelet"
Mar 12 23:49:10.649311 kubelet[2863]: I0312 23:49:10.649157 2863 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 23:49:10.649311 kubelet[2863]: I0312 23:49:10.649220 2863 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 12 23:49:10.649469 kubelet[2863]: I0312 23:49:10.649443 2863 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 23:49:10.649533 kubelet[2863]: I0312 23:49:10.649499 2863 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 23:49:10.650133 kubelet[2863]: I0312 23:49:10.650113 2863 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 23:49:10.660621 kubelet[2863]: I0312 23:49:10.660506 2863 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 23:49:10.661627 kubelet[2863]: E0312 23:49:10.661601 2863 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 23:49:10.662128 kubelet[2863]: I0312 23:49:10.661952 2863 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 12 23:49:10.662128 kubelet[2863]: I0312 23:49:10.662070 2863 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 12 23:49:10.662215 kubelet[2863]: I0312 23:49:10.662196 2863 reconciler.go:29] "Reconciler: start to sync state"
Mar 12 23:49:10.662847 kubelet[2863]: I0312 23:49:10.650274 2863 server.go:310] "Adding debug handlers to kubelet server"
Mar 12 23:49:10.663175 kubelet[2863]: I0312 23:49:10.663154 2863 factory.go:223] Registration of the systemd container factory successfully
Mar 12 23:49:10.663362 kubelet[2863]: I0312 23:49:10.663341 2863 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 23:49:10.664506 kubelet[2863]: I0312 23:49:10.664489 2863 factory.go:223] Registration of the containerd container factory successfully
Mar 12 23:49:10.667886 kubelet[2863]: I0312 23:49:10.667858 2863 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 12 23:49:10.670376 kubelet[2863]: I0312 23:49:10.670354 2863 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 12 23:49:10.670451 kubelet[2863]: I0312 23:49:10.670442 2863 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 12 23:49:10.670511 kubelet[2863]: I0312 23:49:10.670503 2863 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 12 23:49:10.670591 kubelet[2863]: E0312 23:49:10.670575 2863 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 23:49:10.700373 kubelet[2863]: I0312 23:49:10.700346 2863 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 23:49:10.700373 kubelet[2863]: I0312 23:49:10.700364 2863 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 23:49:10.700373 kubelet[2863]: I0312 23:49:10.700384 2863 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:49:10.700734 kubelet[2863]: I0312 23:49:10.700695 2863 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 12 23:49:10.700734 kubelet[2863]: I0312 23:49:10.700713 2863 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 12 23:49:10.700734 kubelet[2863]: I0312 23:49:10.700731 2863 policy_none.go:49] "None policy: Start"
Mar 12 23:49:10.700734 kubelet[2863]: I0312 23:49:10.700739 2863 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 12 23:49:10.700825 kubelet[2863]: I0312 23:49:10.700749 2863 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 12 23:49:10.700878 kubelet[2863]: I0312 23:49:10.700863 2863 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 12 23:49:10.700910 kubelet[2863]: I0312 23:49:10.700891 2863 policy_none.go:47] "Start"
Mar 12 23:49:10.705102 kubelet[2863]: E0312 23:49:10.705053 2863 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 23:49:10.705222 kubelet[2863]: I0312 23:49:10.705206 2863 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 23:49:10.705249 kubelet[2863]: I0312 23:49:10.705221 2863 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 23:49:10.706474 kubelet[2863]: I0312 23:49:10.706288 2863 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 23:49:10.708736 kubelet[2863]: E0312 23:49:10.708699 2863 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 12 23:49:10.772480 kubelet[2863]: I0312 23:49:10.772190 2863 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.772480 kubelet[2863]: I0312 23:49:10.772262 2863 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.772640 kubelet[2863]: I0312 23:49:10.772618 2863 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.781402 kubelet[2863]: E0312 23:49:10.781337 2863 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-9e79e0a9ae\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.810280 kubelet[2863]: I0312 23:49:10.810257 2863 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.818529 kubelet[2863]: I0312 23:49:10.818494 2863 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.818616 kubelet[2863]: I0312 23:49:10.818570 2863 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.862909 kubelet[2863]: I0312 23:49:10.862833 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7871e1fa007a80c16e20b18b7e032c7b-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"7871e1fa007a80c16e20b18b7e032c7b\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.863379 kubelet[2863]: I0312 23:49:10.863351 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f4fae850342de969b026666a272e751-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"4f4fae850342de969b026666a272e751\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.863438 kubelet[2863]: I0312 23:49:10.863398 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f4fae850342de969b026666a272e751-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"4f4fae850342de969b026666a272e751\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.863438 kubelet[2863]: I0312 23:49:10.863416 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f4fae850342de969b026666a272e751-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"4f4fae850342de969b026666a272e751\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:10.964442 kubelet[2863]: I0312 23:49:10.964193 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-usr-share-ca-certificates\") pod
\"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:10.964442 kubelet[2863]: I0312 23:49:10.964418 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:10.964442 kubelet[2863]: I0312 23:49:10.964437 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:10.965269 kubelet[2863]: I0312 23:49:10.964473 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:10.965269 kubelet[2863]: I0312 23:49:10.965092 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/85e36aedfabe9a02ea0843015d24cd54-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" (UID: \"85e36aedfabe9a02ea0843015d24cd54\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:11.638884 kubelet[2863]: I0312 23:49:11.638789 2863 apiserver.go:52] "Watching apiserver" Mar 
12 23:49:11.662836 kubelet[2863]: I0312 23:49:11.662801 2863 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 23:49:11.687925 kubelet[2863]: I0312 23:49:11.687823 2863 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:11.688022 kubelet[2863]: I0312 23:49:11.687894 2863 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:11.695078 kubelet[2863]: E0312 23:49:11.695042 2863 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-9e79e0a9ae\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:11.696892 kubelet[2863]: E0312 23:49:11.696869 2863 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:11.717611 kubelet[2863]: I0312 23:49:11.717480 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-9e79e0a9ae" podStartSLOduration=1.717465529 podStartE2EDuration="1.717465529s" podCreationTimestamp="2026-03-12 23:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:11.704473351 +0000 UTC m=+1.123432346" watchObservedRunningTime="2026-03-12 23:49:11.717465529 +0000 UTC m=+1.136424524" Mar 12 23:49:11.719315 kubelet[2863]: I0312 23:49:11.717852 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-9e79e0a9ae" podStartSLOduration=1.717842968 podStartE2EDuration="1.717842968s" podCreationTimestamp="2026-03-12 23:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:11.71723305 +0000 UTC m=+1.136192045" watchObservedRunningTime="2026-03-12 23:49:11.717842968 +0000 UTC m=+1.136802043" Mar 12 23:49:11.746043 kubelet[2863]: I0312 23:49:11.745820 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-9e79e0a9ae" podStartSLOduration=2.74580344 podStartE2EDuration="2.74580344s" podCreationTimestamp="2026-03-12 23:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:11.727468992 +0000 UTC m=+1.146427987" watchObservedRunningTime="2026-03-12 23:49:11.74580344 +0000 UTC m=+1.164762435" Mar 12 23:49:15.556715 kubelet[2863]: I0312 23:49:15.556665 2863 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 12 23:49:15.557074 containerd[1628]: time="2026-03-12T23:49:15.556974716Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 12 23:49:15.557262 kubelet[2863]: I0312 23:49:15.557227 2863 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 12 23:49:16.151169 systemd[1]: Created slice kubepods-besteffort-pod54926772_ab7e_445d_a1ae_52f9bea0d78a.slice - libcontainer container kubepods-besteffort-pod54926772_ab7e_445d_a1ae_52f9bea0d78a.slice. 
Mar 12 23:49:16.199390 kubelet[2863]: I0312 23:49:16.199344 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54926772-ab7e-445d-a1ae-52f9bea0d78a-lib-modules\") pod \"kube-proxy-8kn8z\" (UID: \"54926772-ab7e-445d-a1ae-52f9bea0d78a\") " pod="kube-system/kube-proxy-8kn8z" Mar 12 23:49:16.199390 kubelet[2863]: I0312 23:49:16.199391 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/54926772-ab7e-445d-a1ae-52f9bea0d78a-kube-proxy\") pod \"kube-proxy-8kn8z\" (UID: \"54926772-ab7e-445d-a1ae-52f9bea0d78a\") " pod="kube-system/kube-proxy-8kn8z" Mar 12 23:49:16.199645 kubelet[2863]: I0312 23:49:16.199407 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/54926772-ab7e-445d-a1ae-52f9bea0d78a-xtables-lock\") pod \"kube-proxy-8kn8z\" (UID: \"54926772-ab7e-445d-a1ae-52f9bea0d78a\") " pod="kube-system/kube-proxy-8kn8z" Mar 12 23:49:16.199645 kubelet[2863]: I0312 23:49:16.199422 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmkk\" (UniqueName: \"kubernetes.io/projected/54926772-ab7e-445d-a1ae-52f9bea0d78a-kube-api-access-kdmkk\") pod \"kube-proxy-8kn8z\" (UID: \"54926772-ab7e-445d-a1ae-52f9bea0d78a\") " pod="kube-system/kube-proxy-8kn8z" Mar 12 23:49:16.467262 containerd[1628]: time="2026-03-12T23:49:16.467168349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8kn8z,Uid:54926772-ab7e-445d-a1ae-52f9bea0d78a,Namespace:kube-system,Attempt:0,}" Mar 12 23:49:16.491741 containerd[1628]: time="2026-03-12T23:49:16.491688587Z" level=info msg="connecting to shim a3327401d271b2901eaf6b3b55ddd4b3b2bb7027c1ce2dc4eb967d48bbd62f52" 
address="unix:///run/containerd/s/865000907e0ba52fbfa38430c95fa46c43222db8e6b063f8024d11b41869ddbe" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:16.516467 systemd[1]: Started cri-containerd-a3327401d271b2901eaf6b3b55ddd4b3b2bb7027c1ce2dc4eb967d48bbd62f52.scope - libcontainer container a3327401d271b2901eaf6b3b55ddd4b3b2bb7027c1ce2dc4eb967d48bbd62f52. Mar 12 23:49:16.550075 containerd[1628]: time="2026-03-12T23:49:16.549986846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8kn8z,Uid:54926772-ab7e-445d-a1ae-52f9bea0d78a,Namespace:kube-system,Attempt:0,} returns sandbox id \"a3327401d271b2901eaf6b3b55ddd4b3b2bb7027c1ce2dc4eb967d48bbd62f52\"" Mar 12 23:49:16.556836 containerd[1628]: time="2026-03-12T23:49:16.556804354Z" level=info msg="CreateContainer within sandbox \"a3327401d271b2901eaf6b3b55ddd4b3b2bb7027c1ce2dc4eb967d48bbd62f52\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 12 23:49:16.573312 containerd[1628]: time="2026-03-12T23:49:16.573089206Z" level=info msg="Container 9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:16.574977 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4279618855.mount: Deactivated successfully. 
Mar 12 23:49:16.584534 containerd[1628]: time="2026-03-12T23:49:16.584490667Z" level=info msg="CreateContainer within sandbox \"a3327401d271b2901eaf6b3b55ddd4b3b2bb7027c1ce2dc4eb967d48bbd62f52\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af\"" Mar 12 23:49:16.585312 containerd[1628]: time="2026-03-12T23:49:16.585153706Z" level=info msg="StartContainer for \"9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af\"" Mar 12 23:49:16.586725 containerd[1628]: time="2026-03-12T23:49:16.586676303Z" level=info msg="connecting to shim 9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af" address="unix:///run/containerd/s/865000907e0ba52fbfa38430c95fa46c43222db8e6b063f8024d11b41869ddbe" protocol=ttrpc version=3 Mar 12 23:49:16.602442 systemd[1]: Started cri-containerd-9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af.scope - libcontainer container 9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af. 
Mar 12 23:49:16.670312 containerd[1628]: time="2026-03-12T23:49:16.670257999Z" level=info msg="StartContainer for \"9105589d6de6c7d4ccdcb255f43cfacbfec32687dde5c48a9c074b380c7d58af\" returns successfully" Mar 12 23:49:16.716060 kubelet[2863]: I0312 23:49:16.715940 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8kn8z" podStartSLOduration=0.71592776 podStartE2EDuration="715.92776ms" podCreationTimestamp="2026-03-12 23:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:16.71589092 +0000 UTC m=+6.134849915" watchObservedRunningTime="2026-03-12 23:49:16.71592776 +0000 UTC m=+6.134886755" Mar 12 23:49:16.724755 systemd[1]: Created slice kubepods-besteffort-pod688aa2c8_8aa6_461c_b728_f40bb078323b.slice - libcontainer container kubepods-besteffort-pod688aa2c8_8aa6_461c_b728_f40bb078323b.slice. Mar 12 23:49:16.802327 kubelet[2863]: I0312 23:49:16.802198 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/688aa2c8-8aa6-461c-b728-f40bb078323b-var-lib-calico\") pod \"tigera-operator-5588576f44-62l88\" (UID: \"688aa2c8-8aa6-461c-b728-f40bb078323b\") " pod="tigera-operator/tigera-operator-5588576f44-62l88" Mar 12 23:49:16.802327 kubelet[2863]: I0312 23:49:16.802260 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fll\" (UniqueName: \"kubernetes.io/projected/688aa2c8-8aa6-461c-b728-f40bb078323b-kube-api-access-z5fll\") pod \"tigera-operator-5588576f44-62l88\" (UID: \"688aa2c8-8aa6-461c-b728-f40bb078323b\") " pod="tigera-operator/tigera-operator-5588576f44-62l88" Mar 12 23:49:17.032668 containerd[1628]: time="2026-03-12T23:49:17.032550655Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-5588576f44-62l88,Uid:688aa2c8-8aa6-461c-b728-f40bb078323b,Namespace:tigera-operator,Attempt:0,}" Mar 12 23:49:17.057441 containerd[1628]: time="2026-03-12T23:49:17.057393772Z" level=info msg="connecting to shim 7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea" address="unix:///run/containerd/s/199c938dc207ff6d8d26c40cc5656f5ba3754744d06b9e3c2283abadcd26ac4f" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:17.079158 systemd[1]: Started cri-containerd-7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea.scope - libcontainer container 7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea. Mar 12 23:49:17.115392 containerd[1628]: time="2026-03-12T23:49:17.115211193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-62l88,Uid:688aa2c8-8aa6-461c-b728-f40bb078323b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea\"" Mar 12 23:49:17.117339 containerd[1628]: time="2026-03-12T23:49:17.117277989Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 12 23:49:18.938666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1759357979.mount: Deactivated successfully. 
Mar 12 23:49:19.262003 containerd[1628]: time="2026-03-12T23:49:19.261888815Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:19.263114 containerd[1628]: time="2026-03-12T23:49:19.263076293Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Mar 12 23:49:19.264660 containerd[1628]: time="2026-03-12T23:49:19.264617211Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:19.267739 containerd[1628]: time="2026-03-12T23:49:19.267702405Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:19.268519 containerd[1628]: time="2026-03-12T23:49:19.268484684Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.151164935s" Mar 12 23:49:19.268519 containerd[1628]: time="2026-03-12T23:49:19.268518284Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Mar 12 23:49:19.274996 containerd[1628]: time="2026-03-12T23:49:19.274965633Z" level=info msg="CreateContainer within sandbox \"7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 12 23:49:19.286477 containerd[1628]: time="2026-03-12T23:49:19.285891894Z" level=info msg="Container 
076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:19.288050 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2949938731.mount: Deactivated successfully. Mar 12 23:49:19.294431 containerd[1628]: time="2026-03-12T23:49:19.294395359Z" level=info msg="CreateContainer within sandbox \"7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\"" Mar 12 23:49:19.295150 containerd[1628]: time="2026-03-12T23:49:19.295126238Z" level=info msg="StartContainer for \"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\"" Mar 12 23:49:19.295915 containerd[1628]: time="2026-03-12T23:49:19.295886957Z" level=info msg="connecting to shim 076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2" address="unix:///run/containerd/s/199c938dc207ff6d8d26c40cc5656f5ba3754744d06b9e3c2283abadcd26ac4f" protocol=ttrpc version=3 Mar 12 23:49:19.317467 systemd[1]: Started cri-containerd-076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2.scope - libcontainer container 076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2. 
Mar 12 23:49:19.343895 containerd[1628]: time="2026-03-12T23:49:19.343858674Z" level=info msg="StartContainer for \"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\" returns successfully" Mar 12 23:49:19.835900 kubelet[2863]: I0312 23:49:19.835832 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-62l88" podStartSLOduration=1.682972335 podStartE2EDuration="3.835814627s" podCreationTimestamp="2026-03-12 23:49:16 +0000 UTC" firstStartedPulling="2026-03-12 23:49:17.116377991 +0000 UTC m=+6.535336986" lastFinishedPulling="2026-03-12 23:49:19.269220323 +0000 UTC m=+8.688179278" observedRunningTime="2026-03-12 23:49:19.719187388 +0000 UTC m=+9.138146423" watchObservedRunningTime="2026-03-12 23:49:19.835814627 +0000 UTC m=+9.254773622" Mar 12 23:49:24.453991 sudo[1904]: pam_unix(sudo:session): session closed for user root Mar 12 23:49:24.548636 sshd[1903]: Connection closed by 20.161.92.111 port 59580 Mar 12 23:49:24.550183 sshd-session[1900]: pam_unix(sshd:session): session closed for user core Mar 12 23:49:24.553978 systemd[1]: sshd@6-10.0.8.7:22-20.161.92.111:59580.service: Deactivated successfully. Mar 12 23:49:24.555668 systemd[1]: session-7.scope: Deactivated successfully. Mar 12 23:49:24.555839 systemd[1]: session-7.scope: Consumed 7.436s CPU time, 226.5M memory peak. Mar 12 23:49:24.559868 systemd-logind[1610]: Session 7 logged out. Waiting for processes to exit. Mar 12 23:49:24.562029 systemd-logind[1610]: Removed session 7. Mar 12 23:49:28.000014 systemd[1]: Created slice kubepods-besteffort-podd53083be_f660_4b11_953b_bd62a5d6c5be.slice - libcontainer container kubepods-besteffort-podd53083be_f660_4b11_953b_bd62a5d6c5be.slice. Mar 12 23:49:28.054225 systemd[1]: Created slice kubepods-besteffort-pod5328a226_de6c_405b_bb7b_c4d189524d10.slice - libcontainer container kubepods-besteffort-pod5328a226_de6c_405b_bb7b_c4d189524d10.slice. 
Mar 12 23:49:28.077003 kubelet[2863]: I0312 23:49:28.076939 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6cs\" (UniqueName: \"kubernetes.io/projected/5328a226-de6c-405b-bb7b-c4d189524d10-kube-api-access-pw6cs\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077503 kubelet[2863]: I0312 23:49:28.077105 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27t78\" (UniqueName: \"kubernetes.io/projected/d53083be-f660-4b11-953b-bd62a5d6c5be-kube-api-access-27t78\") pod \"calico-typha-595f7685fc-g65wt\" (UID: \"d53083be-f660-4b11-953b-bd62a5d6c5be\") " pod="calico-system/calico-typha-595f7685fc-g65wt" Mar 12 23:49:28.077503 kubelet[2863]: I0312 23:49:28.077134 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-nodeproc\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077503 kubelet[2863]: I0312 23:49:28.077153 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-sys-fs\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077503 kubelet[2863]: I0312 23:49:28.077169 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5328a226-de6c-405b-bb7b-c4d189524d10-node-certs\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077503 kubelet[2863]: I0312 
23:49:28.077222 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d53083be-f660-4b11-953b-bd62a5d6c5be-typha-certs\") pod \"calico-typha-595f7685fc-g65wt\" (UID: \"d53083be-f660-4b11-953b-bd62a5d6c5be\") " pod="calico-system/calico-typha-595f7685fc-g65wt" Mar 12 23:49:28.077614 kubelet[2863]: I0312 23:49:28.077250 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-cni-net-dir\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077614 kubelet[2863]: I0312 23:49:28.077273 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-lib-modules\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077614 kubelet[2863]: I0312 23:49:28.077335 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-var-lib-calico\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077614 kubelet[2863]: I0312 23:49:28.077353 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d53083be-f660-4b11-953b-bd62a5d6c5be-tigera-ca-bundle\") pod \"calico-typha-595f7685fc-g65wt\" (UID: \"d53083be-f660-4b11-953b-bd62a5d6c5be\") " pod="calico-system/calico-typha-595f7685fc-g65wt" Mar 12 23:49:28.077740 kubelet[2863]: I0312 23:49:28.077685 2863 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-bpffs\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077764 kubelet[2863]: I0312 23:49:28.077737 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-cni-bin-dir\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077872 kubelet[2863]: I0312 23:49:28.077829 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-cni-log-dir\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077872 kubelet[2863]: I0312 23:49:28.077886 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-flexvol-driver-host\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077952 kubelet[2863]: I0312 23:49:28.077917 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-policysync\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.077952 kubelet[2863]: I0312 23:49:28.077941 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5328a226-de6c-405b-bb7b-c4d189524d10-tigera-ca-bundle\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.078039 kubelet[2863]: I0312 23:49:28.077958 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-var-run-calico\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.078039 kubelet[2863]: I0312 23:49:28.077989 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5328a226-de6c-405b-bb7b-c4d189524d10-xtables-lock\") pod \"calico-node-k2x2l\" (UID: \"5328a226-de6c-405b-bb7b-c4d189524d10\") " pod="calico-system/calico-node-k2x2l" Mar 12 23:49:28.159818 kubelet[2863]: E0312 23:49:28.159773 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23" Mar 12 23:49:28.179242 kubelet[2863]: I0312 23:49:28.179189 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8dg7\" (UniqueName: \"kubernetes.io/projected/a240f634-9da5-444e-bd18-80be2ab75c23-kube-api-access-h8dg7\") pod \"csi-node-driver-9szck\" (UID: \"a240f634-9da5-444e-bd18-80be2ab75c23\") " pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:28.179581 kubelet[2863]: I0312 23:49:28.179451 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a240f634-9da5-444e-bd18-80be2ab75c23-kubelet-dir\") pod \"csi-node-driver-9szck\" (UID: \"a240f634-9da5-444e-bd18-80be2ab75c23\") " pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:28.179581 kubelet[2863]: I0312 23:49:28.179506 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a240f634-9da5-444e-bd18-80be2ab75c23-varrun\") pod \"csi-node-driver-9szck\" (UID: \"a240f634-9da5-444e-bd18-80be2ab75c23\") " pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:28.179663 kubelet[2863]: I0312 23:49:28.179552 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a240f634-9da5-444e-bd18-80be2ab75c23-socket-dir\") pod \"csi-node-driver-9szck\" (UID: \"a240f634-9da5-444e-bd18-80be2ab75c23\") " pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:28.180718 kubelet[2863]: I0312 23:49:28.180206 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a240f634-9da5-444e-bd18-80be2ab75c23-registration-dir\") pod \"csi-node-driver-9szck\" (UID: \"a240f634-9da5-444e-bd18-80be2ab75c23\") " pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:28.181415 kubelet[2863]: E0312 23:49:28.181364 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.181415 kubelet[2863]: W0312 23:49:28.181392 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.181415 kubelet[2863]: E0312 23:49:28.181417 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.182043 kubelet[2863]: E0312 23:49:28.182011 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.182043 kubelet[2863]: W0312 23:49:28.182033 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.182043 kubelet[2863]: E0312 23:49:28.182046 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.182731 kubelet[2863]: E0312 23:49:28.182485 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.182731 kubelet[2863]: W0312 23:49:28.182627 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.182731 kubelet[2863]: E0312 23:49:28.182641 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.187640 kubelet[2863]: E0312 23:49:28.186374 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.187640 kubelet[2863]: W0312 23:49:28.186404 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.187640 kubelet[2863]: E0312 23:49:28.186423 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.187640 kubelet[2863]: E0312 23:49:28.186732 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.187640 kubelet[2863]: W0312 23:49:28.186742 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.187640 kubelet[2863]: E0312 23:49:28.186753 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.188074 kubelet[2863]: E0312 23:49:28.188033 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.188074 kubelet[2863]: W0312 23:49:28.188058 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.188074 kubelet[2863]: E0312 23:49:28.188071 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.188356 kubelet[2863]: E0312 23:49:28.188250 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.188356 kubelet[2863]: W0312 23:49:28.188276 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.188356 kubelet[2863]: E0312 23:49:28.188286 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.188458 kubelet[2863]: E0312 23:49:28.188436 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.188458 kubelet[2863]: W0312 23:49:28.188451 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.188545 kubelet[2863]: E0312 23:49:28.188463 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.188611 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191091 kubelet[2863]: W0312 23:49:28.188624 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.188633 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.188768 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191091 kubelet[2863]: W0312 23:49:28.188775 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.188782 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.189033 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191091 kubelet[2863]: W0312 23:49:28.189041 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.189050 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.191091 kubelet[2863]: E0312 23:49:28.189735 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191349 kubelet[2863]: W0312 23:49:28.189743 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191349 kubelet[2863]: E0312 23:49:28.189751 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.191349 kubelet[2863]: E0312 23:49:28.189912 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191349 kubelet[2863]: W0312 23:49:28.189919 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191349 kubelet[2863]: E0312 23:49:28.189927 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.191349 kubelet[2863]: E0312 23:49:28.190370 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191349 kubelet[2863]: W0312 23:49:28.190378 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191349 kubelet[2863]: E0312 23:49:28.190387 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.191349 kubelet[2863]: E0312 23:49:28.190549 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191349 kubelet[2863]: W0312 23:49:28.190557 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191609 kubelet[2863]: E0312 23:49:28.190565 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.191609 kubelet[2863]: E0312 23:49:28.190870 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.191609 kubelet[2863]: W0312 23:49:28.190878 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.191609 kubelet[2863]: E0312 23:49:28.190888 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.192957 kubelet[2863]: E0312 23:49:28.192936 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.192957 kubelet[2863]: W0312 23:49:28.192957 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.193029 kubelet[2863]: E0312 23:49:28.192968 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.193496 kubelet[2863]: E0312 23:49:28.193456 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.193496 kubelet[2863]: W0312 23:49:28.193471 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.193496 kubelet[2863]: E0312 23:49:28.193482 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.193671 kubelet[2863]: E0312 23:49:28.193652 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.193671 kubelet[2863]: W0312 23:49:28.193662 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.193671 kubelet[2863]: E0312 23:49:28.193671 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.194470 kubelet[2863]: E0312 23:49:28.194447 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.194470 kubelet[2863]: W0312 23:49:28.194460 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.194470 kubelet[2863]: E0312 23:49:28.194470 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.195318 kubelet[2863]: E0312 23:49:28.194597 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.195318 kubelet[2863]: W0312 23:49:28.194609 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.195318 kubelet[2863]: E0312 23:49:28.194617 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.195318 kubelet[2863]: E0312 23:49:28.194725 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.195318 kubelet[2863]: W0312 23:49:28.194731 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.195318 kubelet[2863]: E0312 23:49:28.194738 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.195318 kubelet[2863]: E0312 23:49:28.194834 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.195318 kubelet[2863]: W0312 23:49:28.194839 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.195318 kubelet[2863]: E0312 23:49:28.194846 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.195721 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198595 kubelet[2863]: W0312 23:49:28.195755 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.195765 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.196241 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198595 kubelet[2863]: W0312 23:49:28.196250 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.196260 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.196469 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198595 kubelet[2863]: W0312 23:49:28.196478 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.196487 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.198595 kubelet[2863]: E0312 23:49:28.196635 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198946 kubelet[2863]: W0312 23:49:28.196642 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.198946 kubelet[2863]: E0312 23:49:28.196650 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.198946 kubelet[2863]: E0312 23:49:28.197017 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198946 kubelet[2863]: W0312 23:49:28.197025 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.198946 kubelet[2863]: E0312 23:49:28.197033 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.198946 kubelet[2863]: E0312 23:49:28.198376 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198946 kubelet[2863]: W0312 23:49:28.198386 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.198946 kubelet[2863]: E0312 23:49:28.198399 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.198946 kubelet[2863]: E0312 23:49:28.198818 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.198946 kubelet[2863]: W0312 23:49:28.198828 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.199182 kubelet[2863]: E0312 23:49:28.198838 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.199182 kubelet[2863]: E0312 23:49:28.198987 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.199182 kubelet[2863]: W0312 23:49:28.198994 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.199182 kubelet[2863]: E0312 23:49:28.199002 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.199251 kubelet[2863]: E0312 23:49:28.199236 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.199251 kubelet[2863]: W0312 23:49:28.199245 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.199291 kubelet[2863]: E0312 23:49:28.199255 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.281049 kubelet[2863]: E0312 23:49:28.280943 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.281049 kubelet[2863]: W0312 23:49:28.280966 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.281049 kubelet[2863]: E0312 23:49:28.280986 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.281500 kubelet[2863]: E0312 23:49:28.281269 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.281500 kubelet[2863]: W0312 23:49:28.281281 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.281500 kubelet[2863]: E0312 23:49:28.281291 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.281623 kubelet[2863]: E0312 23:49:28.281539 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.281623 kubelet[2863]: W0312 23:49:28.281547 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.281623 kubelet[2863]: E0312 23:49:28.281556 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.281750 kubelet[2863]: E0312 23:49:28.281708 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.281750 kubelet[2863]: W0312 23:49:28.281719 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.281750 kubelet[2863]: E0312 23:49:28.281727 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.281964 kubelet[2863]: E0312 23:49:28.281856 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.281964 kubelet[2863]: W0312 23:49:28.281880 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.281964 kubelet[2863]: E0312 23:49:28.281888 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.282116 kubelet[2863]: E0312 23:49:28.282088 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.282116 kubelet[2863]: W0312 23:49:28.282096 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.282116 kubelet[2863]: E0312 23:49:28.282104 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.282281 kubelet[2863]: E0312 23:49:28.282238 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.282281 kubelet[2863]: W0312 23:49:28.282248 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.282281 kubelet[2863]: E0312 23:49:28.282256 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.282448 kubelet[2863]: E0312 23:49:28.282418 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.282448 kubelet[2863]: W0312 23:49:28.282428 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.282448 kubelet[2863]: E0312 23:49:28.282436 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.282609 kubelet[2863]: E0312 23:49:28.282603 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.282694 kubelet[2863]: W0312 23:49:28.282611 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.282694 kubelet[2863]: E0312 23:49:28.282619 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.282799 kubelet[2863]: E0312 23:49:28.282782 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.282799 kubelet[2863]: W0312 23:49:28.282790 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.282799 kubelet[2863]: E0312 23:49:28.282798 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.283166 kubelet[2863]: E0312 23:49:28.282989 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.283166 kubelet[2863]: W0312 23:49:28.282999 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.283166 kubelet[2863]: E0312 23:49:28.283007 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:49:28.283263 kubelet[2863]: E0312 23:49:28.283233 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.283263 kubelet[2863]: W0312 23:49:28.283242 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.283333 kubelet[2863]: E0312 23:49:28.283267 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:49:28.283490 kubelet[2863]: E0312 23:49:28.283447 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:49:28.283490 kubelet[2863]: W0312 23:49:28.283461 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:49:28.283490 kubelet[2863]: E0312 23:49:28.283476 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 12 23:49:28.283692 kubelet[2863]: E0312 23:49:28.283675 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:49:28.283692 kubelet[2863]: W0312 23:49:28.283687 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:49:28.283765 kubelet[2863]: E0312 23:49:28.283695 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:49:28.311902 containerd[1628]: time="2026-03-12T23:49:28.311862509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-595f7685fc-g65wt,Uid:d53083be-f660-4b11-953b-bd62a5d6c5be,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:28.337081 containerd[1628]: time="2026-03-12T23:49:28.337041785Z" level=info msg="connecting to shim bf1112ce11c6ae032b16277ad29d505f334925f0462461fd2ee08563bd900c75" address="unix:///run/containerd/s/b6c1541411c8e3876c45f8fa4900c8c8ca12ed70318da24fd699362841642161" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:28.358642 systemd[1]: Started cri-containerd-bf1112ce11c6ae032b16277ad29d505f334925f0462461fd2ee08563bd900c75.scope - libcontainer container bf1112ce11c6ae032b16277ad29d505f334925f0462461fd2ee08563bd900c75.
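The recurring kubelet triplet above shows the FlexVolume probe path: kubelet executes the driver at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument `init` and unmarshals the driver's stdout as JSON. Because the executable is absent, stdout is empty, and Go's encoding/json reports "unexpected end of JSON input". A minimal sketch of both halves of that failure, assuming only the documented FlexVolume response shape (`status`, `capabilities`):

```python
import json

# kubelet runs `<driver> init` and parses the driver's stdout as JSON.
# A missing executable yields empty output, which is not valid JSON --
# the Python analogue of Go's "unexpected end of JSON input":
try:
    json.loads("")
except json.JSONDecodeError as err:
    print(f"empty driver output fails to parse: {err.msg}")

# A present, working driver would instead answer the init call with a
# status object; "attach": False means it has no attach/detach phase.
init_response = {"status": "Success", "capabilities": {"attach": False}}
print(json.dumps(init_response))
```

Since the missing binary belongs to an optional node agent, kubelet logs the error on every plugin-directory probe and skips the driver; the errors are noisy but non-fatal.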
Mar 12 23:49:28.360591 containerd[1628]: time="2026-03-12T23:49:28.360549345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-k2x2l,Uid:5328a226-de6c-405b-bb7b-c4d189524d10,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:28.395859 containerd[1628]: time="2026-03-12T23:49:28.395815044Z" level=info msg="connecting to shim 2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab" address="unix:///run/containerd/s/eda9b717a1f7da61bcde1bb74f44a92360f6fc7e744366dcca0f5308cd667655" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:28.396091 containerd[1628]: time="2026-03-12T23:49:28.396062284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-595f7685fc-g65wt,Uid:d53083be-f660-4b11-953b-bd62a5d6c5be,Namespace:calico-system,Attempt:0,} returns sandbox id \"bf1112ce11c6ae032b16277ad29d505f334925f0462461fd2ee08563bd900c75\""
Mar 12 23:49:28.397743 containerd[1628]: time="2026-03-12T23:49:28.397713121Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 12 23:49:28.416461 systemd[1]: Started cri-containerd-2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab.scope - libcontainer container 2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab.
Mar 12 23:49:28.440431 containerd[1628]: time="2026-03-12T23:49:28.440382087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-k2x2l,Uid:5328a226-de6c-405b-bb7b-c4d189524d10,Namespace:calico-system,Attempt:0,} returns sandbox id \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\""
Mar 12 23:49:29.642569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4238523156.mount: Deactivated successfully.
Mar 12 23:49:29.671208 kubelet[2863]: E0312 23:49:29.671023 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23"
Mar 12 23:49:30.006551 containerd[1628]: time="2026-03-12T23:49:30.006353590Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:30.008123 containerd[1628]: time="2026-03-12T23:49:30.008075707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174"
Mar 12 23:49:30.010260 containerd[1628]: time="2026-03-12T23:49:30.010221584Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:30.013129 containerd[1628]: time="2026-03-12T23:49:30.013061019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:30.013930 containerd[1628]: time="2026-03-12T23:49:30.013887737Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.616136776s"
Mar 12 23:49:30.013930 containerd[1628]: time="2026-03-12T23:49:30.013923817Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\""
Mar 12 23:49:30.015277 containerd[1628]: time="2026-03-12T23:49:30.015253175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 12 23:49:30.026605 containerd[1628]: time="2026-03-12T23:49:30.026565036Z" level=info msg="CreateContainer within sandbox \"bf1112ce11c6ae032b16277ad29d505f334925f0462461fd2ee08563bd900c75\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 12 23:49:30.038767 containerd[1628]: time="2026-03-12T23:49:30.038681655Z" level=info msg="Container 5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:30.049109 containerd[1628]: time="2026-03-12T23:49:30.049065797Z" level=info msg="CreateContainer within sandbox \"bf1112ce11c6ae032b16277ad29d505f334925f0462461fd2ee08563bd900c75\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f\""
Mar 12 23:49:30.049629 containerd[1628]: time="2026-03-12T23:49:30.049539916Z" level=info msg="StartContainer for \"5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f\""
Mar 12 23:49:30.050706 containerd[1628]: time="2026-03-12T23:49:30.050685514Z" level=info msg="connecting to shim 5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f" address="unix:///run/containerd/s/b6c1541411c8e3876c45f8fa4900c8c8ca12ed70318da24fd699362841642161" protocol=ttrpc version=3
Mar 12 23:49:30.070519 systemd[1]: Started cri-containerd-5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f.scope - libcontainer container 5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f.
Mar 12 23:49:30.106905 containerd[1628]: time="2026-03-12T23:49:30.106872777Z" level=info msg="StartContainer for \"5428c22f9574112eb58ce971f07d0158e3384e7b6cfb08f1d884a597c4c4792f\" returns successfully"
Mar 12 23:49:30.783682 kubelet[2863]: E0312 23:49:30.783641 2863 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:49:30.783682 kubelet[2863]: W0312 23:49:30.783666 2863 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:49:30.783682 kubelet[2863]: E0312 23:49:30.783685 2863 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:49:31.271469 containerd[1628]: time="2026-03-12T23:49:31.271374852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:31.273376 containerd[1628]: time="2026-03-12T23:49:31.273335528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682"
Mar 12 23:49:31.275217 containerd[1628]: time="2026-03-12T23:49:31.275174285Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:31.278137 containerd[1628]: time="2026-03-12T23:49:31.278099160Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:31.278726 containerd[1628]: time="2026-03-12T23:49:31.278678879Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.263393704s" Mar 12 23:49:31.278726 containerd[1628]: time="2026-03-12T23:49:31.278718399Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 12 23:49:31.283955 containerd[1628]: time="2026-03-12T23:49:31.283909550Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 12 23:49:31.297083 containerd[1628]: time="2026-03-12T23:49:31.297031008Z" level=info msg="Container 5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:31.309681 containerd[1628]: time="2026-03-12T23:49:31.309626586Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65\"" Mar 12 23:49:31.310458 containerd[1628]: time="2026-03-12T23:49:31.310432705Z" level=info msg="StartContainer for \"5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65\"" Mar 12 23:49:31.312071 containerd[1628]: time="2026-03-12T23:49:31.311879862Z" level=info msg="connecting to shim 5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65" address="unix:///run/containerd/s/eda9b717a1f7da61bcde1bb74f44a92360f6fc7e744366dcca0f5308cd667655" protocol=ttrpc version=3 Mar 12 23:49:31.330457 systemd[1]: Started cri-containerd-5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65.scope - libcontainer container 
5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65. Mar 12 23:49:31.408071 containerd[1628]: time="2026-03-12T23:49:31.408032416Z" level=info msg="StartContainer for \"5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65\" returns successfully" Mar 12 23:49:31.420720 systemd[1]: cri-containerd-5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65.scope: Deactivated successfully. Mar 12 23:49:31.422620 containerd[1628]: time="2026-03-12T23:49:31.422481672Z" level=info msg="received container exit event container_id:\"5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65\" id:\"5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65\" pid:3536 exited_at:{seconds:1773359371 nanos:422125872}" Mar 12 23:49:31.440242 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5e58d5824ad659a087e3a946315b7b7aeff571191f5143d458b5707ce04fcf65-rootfs.mount: Deactivated successfully. Mar 12 23:49:31.671253 kubelet[2863]: E0312 23:49:31.671171 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23" Mar 12 23:49:31.737888 kubelet[2863]: I0312 23:49:31.737860 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:49:31.739695 containerd[1628]: time="2026-03-12T23:49:31.739654565Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 12 23:49:31.756111 kubelet[2863]: I0312 23:49:31.756050 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-595f7685fc-g65wt" podStartSLOduration=3.138435602 podStartE2EDuration="4.756033737s" podCreationTimestamp="2026-03-12 23:49:27 +0000 UTC" firstStartedPulling="2026-03-12 23:49:28.397324081 +0000 UTC m=+17.816283076" 
lastFinishedPulling="2026-03-12 23:49:30.014922216 +0000 UTC m=+19.433881211" observedRunningTime="2026-03-12 23:49:30.747622274 +0000 UTC m=+20.166581269" watchObservedRunningTime="2026-03-12 23:49:31.756033737 +0000 UTC m=+21.174992732" Mar 12 23:49:33.671345 kubelet[2863]: E0312 23:49:33.671259 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23" Mar 12 23:49:35.371235 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3957768102.mount: Deactivated successfully. Mar 12 23:49:35.401487 containerd[1628]: time="2026-03-12T23:49:35.401436459Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:35.402853 containerd[1628]: time="2026-03-12T23:49:35.402828136Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Mar 12 23:49:35.404567 containerd[1628]: time="2026-03-12T23:49:35.404543933Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:35.407538 containerd[1628]: time="2026-03-12T23:49:35.407494808Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:35.408103 containerd[1628]: time="2026-03-12T23:49:35.408064487Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 3.668363922s" Mar 12 23:49:35.408103 containerd[1628]: time="2026-03-12T23:49:35.408094847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Mar 12 23:49:35.413915 containerd[1628]: time="2026-03-12T23:49:35.413886077Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 12 23:49:35.426252 containerd[1628]: time="2026-03-12T23:49:35.426219776Z" level=info msg="Container 0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:35.438235 containerd[1628]: time="2026-03-12T23:49:35.438106476Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44\"" Mar 12 23:49:35.439384 containerd[1628]: time="2026-03-12T23:49:35.438681915Z" level=info msg="StartContainer for \"0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44\"" Mar 12 23:49:35.440129 containerd[1628]: time="2026-03-12T23:49:35.440098632Z" level=info msg="connecting to shim 0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44" address="unix:///run/containerd/s/eda9b717a1f7da61bcde1bb74f44a92360f6fc7e744366dcca0f5308cd667655" protocol=ttrpc version=3 Mar 12 23:49:35.460572 systemd[1]: Started cri-containerd-0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44.scope - libcontainer container 0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44. 
Mar 12 23:49:35.533606 containerd[1628]: time="2026-03-12T23:49:35.533509511Z" level=info msg="StartContainer for \"0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44\" returns successfully" Mar 12 23:49:35.631374 systemd[1]: cri-containerd-0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44.scope: Deactivated successfully. Mar 12 23:49:35.633568 containerd[1628]: time="2026-03-12T23:49:35.633529299Z" level=info msg="received container exit event container_id:\"0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44\" id:\"0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44\" pid:3593 exited_at:{seconds:1773359375 nanos:633356899}" Mar 12 23:49:35.671804 kubelet[2863]: E0312 23:49:35.671747 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23" Mar 12 23:49:36.371312 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0c96915dce055423dc670479b6344b861197fe1135671f8e667b73a81c671b44-rootfs.mount: Deactivated successfully. 
Mar 12 23:49:36.762644 containerd[1628]: time="2026-03-12T23:49:36.762390354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 12 23:49:37.671328 kubelet[2863]: E0312 23:49:37.671027 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23" Mar 12 23:49:38.383581 kubelet[2863]: I0312 23:49:38.383539 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:49:38.930091 containerd[1628]: time="2026-03-12T23:49:38.930038061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:38.932916 containerd[1628]: time="2026-03-12T23:49:38.932887576Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Mar 12 23:49:38.934405 containerd[1628]: time="2026-03-12T23:49:38.934354493Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:38.937157 containerd[1628]: time="2026-03-12T23:49:38.937109689Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:38.937723 containerd[1628]: time="2026-03-12T23:49:38.937691328Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.175263174s" Mar 12 23:49:38.937723 containerd[1628]: time="2026-03-12T23:49:38.937720368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Mar 12 23:49:38.943792 containerd[1628]: time="2026-03-12T23:49:38.943760997Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 12 23:49:38.956032 containerd[1628]: time="2026-03-12T23:49:38.955984016Z" level=info msg="Container 263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:38.971842 containerd[1628]: time="2026-03-12T23:49:38.971792269Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143\"" Mar 12 23:49:38.972337 containerd[1628]: time="2026-03-12T23:49:38.972312708Z" level=info msg="StartContainer for \"263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143\"" Mar 12 23:49:38.974278 containerd[1628]: time="2026-03-12T23:49:38.974243705Z" level=info msg="connecting to shim 263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143" address="unix:///run/containerd/s/eda9b717a1f7da61bcde1bb74f44a92360f6fc7e744366dcca0f5308cd667655" protocol=ttrpc version=3 Mar 12 23:49:38.999668 systemd[1]: Started cri-containerd-263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143.scope - libcontainer container 263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143. 
Mar 12 23:49:39.101882 containerd[1628]: time="2026-03-12T23:49:39.101840445Z" level=info msg="StartContainer for \"263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143\" returns successfully" Mar 12 23:49:39.509242 containerd[1628]: time="2026-03-12T23:49:39.509184583Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 12 23:49:39.511680 systemd[1]: cri-containerd-263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143.scope: Deactivated successfully. Mar 12 23:49:39.512024 systemd[1]: cri-containerd-263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143.scope: Consumed 465ms CPU time, 193.3M memory peak, 171.3M written to disk. Mar 12 23:49:39.513012 containerd[1628]: time="2026-03-12T23:49:39.512863257Z" level=info msg="received container exit event container_id:\"263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143\" id:\"263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143\" pid:3658 exited_at:{seconds:1773359379 nanos:512575777}" Mar 12 23:49:39.532270 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-263deed711668a8e4a9988b6df72c1ef26d006814fb4bf156165c2dd60969143-rootfs.mount: Deactivated successfully. Mar 12 23:49:39.545844 kubelet[2863]: I0312 23:49:39.545798 2863 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Mar 12 23:49:39.610841 systemd[1]: Created slice kubepods-burstable-pod8aa4ec60_e1a1_4108_912f_bfa1021f9e1a.slice - libcontainer container kubepods-burstable-pod8aa4ec60_e1a1_4108_912f_bfa1021f9e1a.slice. Mar 12 23:49:39.622014 systemd[1]: Created slice kubepods-burstable-podb1fd0cfb_c997_44dc_9932_e2477c6dd925.slice - libcontainer container kubepods-burstable-podb1fd0cfb_c997_44dc_9932_e2477c6dd925.slice. 
Mar 12 23:49:39.632049 systemd[1]: Created slice kubepods-besteffort-pod377cbf76_95de_4a5f_af4f_7e8550f78380.slice - libcontainer container kubepods-besteffort-pod377cbf76_95de_4a5f_af4f_7e8550f78380.slice. Mar 12 23:49:39.638815 systemd[1]: Created slice kubepods-besteffort-pod19579544_0590_4a13_b454_9d848e5ef1c8.slice - libcontainer container kubepods-besteffort-pod19579544_0590_4a13_b454_9d848e5ef1c8.slice. Mar 12 23:49:39.645543 systemd[1]: Created slice kubepods-besteffort-pod5bcd708f_2a1c_437f_9c18_28373adbc8f2.slice - libcontainer container kubepods-besteffort-pod5bcd708f_2a1c_437f_9c18_28373adbc8f2.slice. Mar 12 23:49:39.654150 systemd[1]: Created slice kubepods-besteffort-pod5c52aa28_3fda_4026_9bd2_f6628c2ff87e.slice - libcontainer container kubepods-besteffort-pod5c52aa28_3fda_4026_9bd2_f6628c2ff87e.slice. Mar 12 23:49:39.660737 systemd[1]: Created slice kubepods-besteffort-pod4b3f69ac_4f6d_445c_a6b0_274c4dbc3c69.slice - libcontainer container kubepods-besteffort-pod4b3f69ac_4f6d_445c_a6b0_274c4dbc3c69.slice. 
Mar 12 23:49:39.661389 kubelet[2863]: I0312 23:49:39.661176 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-nginx-config\") pod \"whisker-85d4f8d679-tlvw9\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") " pod="calico-system/whisker-85d4f8d679-tlvw9" Mar 12 23:49:39.661557 kubelet[2863]: I0312 23:49:39.661535 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j7gg\" (UniqueName: \"kubernetes.io/projected/19579544-0590-4a13-b454-9d848e5ef1c8-kube-api-access-9j7gg\") pod \"goldmane-cccfbd5cf-9qpsq\" (UID: \"19579544-0590-4a13-b454-9d848e5ef1c8\") " pod="calico-system/goldmane-cccfbd5cf-9qpsq" Mar 12 23:49:39.661625 kubelet[2863]: I0312 23:49:39.661615 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xtx\" (UniqueName: \"kubernetes.io/projected/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-kube-api-access-n7xtx\") pod \"whisker-85d4f8d679-tlvw9\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") " pod="calico-system/whisker-85d4f8d679-tlvw9" Mar 12 23:49:39.661699 kubelet[2863]: I0312 23:49:39.661687 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/19579544-0590-4a13-b454-9d848e5ef1c8-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-9qpsq\" (UID: \"19579544-0590-4a13-b454-9d848e5ef1c8\") " pod="calico-system/goldmane-cccfbd5cf-9qpsq" Mar 12 23:49:39.661764 kubelet[2863]: I0312 23:49:39.661752 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8aa4ec60-e1a1-4108-912f-bfa1021f9e1a-config-volume\") pod \"coredns-66bc5c9577-6lcvm\" (UID: \"8aa4ec60-e1a1-4108-912f-bfa1021f9e1a\") " 
pod="kube-system/coredns-66bc5c9577-6lcvm" Mar 12 23:49:39.661878 kubelet[2863]: I0312 23:49:39.661842 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9n6k\" (UniqueName: \"kubernetes.io/projected/5c52aa28-3fda-4026-9bd2-f6628c2ff87e-kube-api-access-c9n6k\") pod \"calico-apiserver-64485f694f-pjqf8\" (UID: \"5c52aa28-3fda-4026-9bd2-f6628c2ff87e\") " pod="calico-system/calico-apiserver-64485f694f-pjqf8" Mar 12 23:49:39.661916 kubelet[2863]: I0312 23:49:39.661889 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzl6h\" (UniqueName: \"kubernetes.io/projected/8aa4ec60-e1a1-4108-912f-bfa1021f9e1a-kube-api-access-nzl6h\") pod \"coredns-66bc5c9577-6lcvm\" (UID: \"8aa4ec60-e1a1-4108-912f-bfa1021f9e1a\") " pod="kube-system/coredns-66bc5c9577-6lcvm" Mar 12 23:49:39.661916 kubelet[2863]: I0312 23:49:39.661906 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1fd0cfb-c997-44dc-9932-e2477c6dd925-config-volume\") pod \"coredns-66bc5c9577-zvbsm\" (UID: \"b1fd0cfb-c997-44dc-9932-e2477c6dd925\") " pod="kube-system/coredns-66bc5c9577-zvbsm" Mar 12 23:49:39.661966 kubelet[2863]: I0312 23:49:39.661922 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bcd708f-2a1c-437f-9c18-28373adbc8f2-tigera-ca-bundle\") pod \"calico-kube-controllers-86bf7fd9f7-thg7n\" (UID: \"5bcd708f-2a1c-437f-9c18-28373adbc8f2\") " pod="calico-system/calico-kube-controllers-86bf7fd9f7-thg7n" Mar 12 23:49:39.661966 kubelet[2863]: I0312 23:49:39.661938 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/377cbf76-95de-4a5f-af4f-7e8550f78380-calico-apiserver-certs\") pod \"calico-apiserver-64485f694f-jt869\" (UID: \"377cbf76-95de-4a5f-af4f-7e8550f78380\") " pod="calico-system/calico-apiserver-64485f694f-jt869" Mar 12 23:49:39.661966 kubelet[2863]: I0312 23:49:39.661953 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f285q\" (UniqueName: \"kubernetes.io/projected/377cbf76-95de-4a5f-af4f-7e8550f78380-kube-api-access-f285q\") pod \"calico-apiserver-64485f694f-jt869\" (UID: \"377cbf76-95de-4a5f-af4f-7e8550f78380\") " pod="calico-system/calico-apiserver-64485f694f-jt869" Mar 12 23:49:39.662158 kubelet[2863]: I0312 23:49:39.662005 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19579544-0590-4a13-b454-9d848e5ef1c8-config\") pod \"goldmane-cccfbd5cf-9qpsq\" (UID: \"19579544-0590-4a13-b454-9d848e5ef1c8\") " pod="calico-system/goldmane-cccfbd5cf-9qpsq" Mar 12 23:49:39.662158 kubelet[2863]: I0312 23:49:39.662130 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19579544-0590-4a13-b454-9d848e5ef1c8-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-9qpsq\" (UID: \"19579544-0590-4a13-b454-9d848e5ef1c8\") " pod="calico-system/goldmane-cccfbd5cf-9qpsq" Mar 12 23:49:39.662263 kubelet[2863]: I0312 23:49:39.662164 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlwh\" (UniqueName: \"kubernetes.io/projected/5bcd708f-2a1c-437f-9c18-28373adbc8f2-kube-api-access-lrlwh\") pod \"calico-kube-controllers-86bf7fd9f7-thg7n\" (UID: \"5bcd708f-2a1c-437f-9c18-28373adbc8f2\") " pod="calico-system/calico-kube-controllers-86bf7fd9f7-thg7n" Mar 12 23:49:39.662358 kubelet[2863]: I0312 23:49:39.662337 2863 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-ca-bundle\") pod \"whisker-85d4f8d679-tlvw9\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") " pod="calico-system/whisker-85d4f8d679-tlvw9" Mar 12 23:49:39.662456 kubelet[2863]: I0312 23:49:39.662434 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-backend-key-pair\") pod \"whisker-85d4f8d679-tlvw9\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") " pod="calico-system/whisker-85d4f8d679-tlvw9" Mar 12 23:49:39.662491 kubelet[2863]: I0312 23:49:39.662461 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5c52aa28-3fda-4026-9bd2-f6628c2ff87e-calico-apiserver-certs\") pod \"calico-apiserver-64485f694f-pjqf8\" (UID: \"5c52aa28-3fda-4026-9bd2-f6628c2ff87e\") " pod="calico-system/calico-apiserver-64485f694f-pjqf8" Mar 12 23:49:39.662522 kubelet[2863]: I0312 23:49:39.662489 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnj7\" (UniqueName: \"kubernetes.io/projected/b1fd0cfb-c997-44dc-9932-e2477c6dd925-kube-api-access-lwnj7\") pod \"coredns-66bc5c9577-zvbsm\" (UID: \"b1fd0cfb-c997-44dc-9932-e2477c6dd925\") " pod="kube-system/coredns-66bc5c9577-zvbsm" Mar 12 23:49:39.679503 systemd[1]: Created slice kubepods-besteffort-poda240f634_9da5_444e_bd18_80be2ab75c23.slice - libcontainer container kubepods-besteffort-poda240f634_9da5_444e_bd18_80be2ab75c23.slice. 
Mar 12 23:49:39.687464 containerd[1628]: time="2026-03-12T23:49:39.687415916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9szck,Uid:a240f634-9da5-444e-bd18-80be2ab75c23,Namespace:calico-system,Attempt:0,}" Mar 12 23:49:39.759434 containerd[1628]: time="2026-03-12T23:49:39.759278592Z" level=error msg="Failed to destroy network for sandbox \"0df4b1120d8edc6eae83c170cc7fab64636ddf3102d398fba97db29aa7888f12\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:49:39.761560 containerd[1628]: time="2026-03-12T23:49:39.761399589Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9szck,Uid:a240f634-9da5-444e-bd18-80be2ab75c23,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df4b1120d8edc6eae83c170cc7fab64636ddf3102d398fba97db29aa7888f12\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:49:39.762368 kubelet[2863]: E0312 23:49:39.761843 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df4b1120d8edc6eae83c170cc7fab64636ddf3102d398fba97db29aa7888f12\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:49:39.762368 kubelet[2863]: E0312 23:49:39.761907 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df4b1120d8edc6eae83c170cc7fab64636ddf3102d398fba97db29aa7888f12\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:39.762368 kubelet[2863]: E0312 23:49:39.761924 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df4b1120d8edc6eae83c170cc7fab64636ddf3102d398fba97db29aa7888f12\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9szck" Mar 12 23:49:39.762554 kubelet[2863]: E0312 23:49:39.761970 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9szck_calico-system(a240f634-9da5-444e-bd18-80be2ab75c23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9szck_calico-system(a240f634-9da5-444e-bd18-80be2ab75c23)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0df4b1120d8edc6eae83c170cc7fab64636ddf3102d398fba97db29aa7888f12\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9szck" podUID="a240f634-9da5-444e-bd18-80be2ab75c23" Mar 12 23:49:39.792587 containerd[1628]: time="2026-03-12T23:49:39.792529735Z" level=info msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 12 23:49:39.805602 containerd[1628]: time="2026-03-12T23:49:39.805562073Z" level=info msg="Container 3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:39.819478 containerd[1628]: time="2026-03-12T23:49:39.819426049Z" level=info 
msg="CreateContainer within sandbox \"2d3dacb07afd83b7476b59fa487f4d49c53625f8513a896a5bdbfc33d5d3c7ab\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f\""
Mar 12 23:49:39.820419 containerd[1628]: time="2026-03-12T23:49:39.820387967Z" level=info msg="StartContainer for \"3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f\""
Mar 12 23:49:39.823048 containerd[1628]: time="2026-03-12T23:49:39.823001563Z" level=info msg="connecting to shim 3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f" address="unix:///run/containerd/s/eda9b717a1f7da61bcde1bb74f44a92360f6fc7e744366dcca0f5308cd667655" protocol=ttrpc version=3
Mar 12 23:49:39.851448 systemd[1]: Started cri-containerd-3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f.scope - libcontainer container 3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f.
Mar 12 23:49:39.920045 containerd[1628]: time="2026-03-12T23:49:39.920005356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6lcvm,Uid:8aa4ec60-e1a1-4108-912f-bfa1021f9e1a,Namespace:kube-system,Attempt:0,}"
Mar 12 23:49:39.930114 containerd[1628]: time="2026-03-12T23:49:39.930025618Z" level=info msg="StartContainer for \"3d6c5f65d02567bc1d2e965fc3bf358a9e03226ec73068cf89636fe1d3631e8f\" returns successfully"
Mar 12 23:49:39.931581 containerd[1628]: time="2026-03-12T23:49:39.930789257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zvbsm,Uid:b1fd0cfb-c997-44dc-9932-e2477c6dd925,Namespace:kube-system,Attempt:0,}"
Mar 12 23:49:39.939028 containerd[1628]: time="2026-03-12T23:49:39.938942323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-jt869,Uid:377cbf76-95de-4a5f-af4f-7e8550f78380,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:39.947127 containerd[1628]: time="2026-03-12T23:49:39.947079789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-9qpsq,Uid:19579544-0590-4a13-b454-9d848e5ef1c8,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:39.957643 containerd[1628]: time="2026-03-12T23:49:39.957595131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bf7fd9f7-thg7n,Uid:5bcd708f-2a1c-437f-9c18-28373adbc8f2,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:39.965124 containerd[1628]: time="2026-03-12T23:49:39.965076558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-pjqf8,Uid:5c52aa28-3fda-4026-9bd2-f6628c2ff87e,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:39.968120 containerd[1628]: time="2026-03-12T23:49:39.967837993Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85d4f8d679-tlvw9,Uid:4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:39.975559 systemd[1]: run-netns-cni\x2dafce9bd6\x2d435d\x2d4d71\x2d0b84\x2d784a2235f177.mount: Deactivated successfully.
Mar 12 23:49:40.038368 containerd[1628]: time="2026-03-12T23:49:40.038220712Z" level=error msg="Failed to destroy network for sandbox \"96ded46e0ab9cf8357dc3dedd4fbe29743682ff44952bccc30c814c73fbf7f6e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.041112 systemd[1]: run-netns-cni\x2dc73f5401\x2dcbae\x2d6436\x2d05ec\x2dd206f69dc7ed.mount: Deactivated successfully.
Mar 12 23:49:40.044550 containerd[1628]: time="2026-03-12T23:49:40.044486061Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6lcvm,Uid:8aa4ec60-e1a1-4108-912f-bfa1021f9e1a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ded46e0ab9cf8357dc3dedd4fbe29743682ff44952bccc30c814c73fbf7f6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.045243 kubelet[2863]: E0312 23:49:40.044966 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ded46e0ab9cf8357dc3dedd4fbe29743682ff44952bccc30c814c73fbf7f6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.045243 kubelet[2863]: E0312 23:49:40.045056 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ded46e0ab9cf8357dc3dedd4fbe29743682ff44952bccc30c814c73fbf7f6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6lcvm"
Mar 12 23:49:40.045243 kubelet[2863]: E0312 23:49:40.045090 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96ded46e0ab9cf8357dc3dedd4fbe29743682ff44952bccc30c814c73fbf7f6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6lcvm"
Mar 12 23:49:40.045514 kubelet[2863]: E0312 23:49:40.045141 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-6lcvm_kube-system(8aa4ec60-e1a1-4108-912f-bfa1021f9e1a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-6lcvm_kube-system(8aa4ec60-e1a1-4108-912f-bfa1021f9e1a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"96ded46e0ab9cf8357dc3dedd4fbe29743682ff44952bccc30c814c73fbf7f6e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-6lcvm" podUID="8aa4ec60-e1a1-4108-912f-bfa1021f9e1a"
Mar 12 23:49:40.056497 containerd[1628]: time="2026-03-12T23:49:40.056418641Z" level=error msg="Failed to destroy network for sandbox \"3f7b127bd0527bacf046ea806eadf7b593faefe7c94e6670b2ffe0df647ceb26\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.059493 systemd[1]: run-netns-cni\x2d291de991\x2daa88\x2dd72a\x2dc627\x2dcb9eca6f137a.mount: Deactivated successfully.
Mar 12 23:49:40.061155 containerd[1628]: time="2026-03-12T23:49:40.061083913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zvbsm,Uid:b1fd0cfb-c997-44dc-9932-e2477c6dd925,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f7b127bd0527bacf046ea806eadf7b593faefe7c94e6670b2ffe0df647ceb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.062850 kubelet[2863]: E0312 23:49:40.061362 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f7b127bd0527bacf046ea806eadf7b593faefe7c94e6670b2ffe0df647ceb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.062850 kubelet[2863]: E0312 23:49:40.061410 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f7b127bd0527bacf046ea806eadf7b593faefe7c94e6670b2ffe0df647ceb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-zvbsm"
Mar 12 23:49:40.062850 kubelet[2863]: E0312 23:49:40.061441 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f7b127bd0527bacf046ea806eadf7b593faefe7c94e6670b2ffe0df647ceb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-zvbsm"
Mar 12 23:49:40.062948 kubelet[2863]: E0312 23:49:40.061495 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-zvbsm_kube-system(b1fd0cfb-c997-44dc-9932-e2477c6dd925)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-zvbsm_kube-system(b1fd0cfb-c997-44dc-9932-e2477c6dd925)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f7b127bd0527bacf046ea806eadf7b593faefe7c94e6670b2ffe0df647ceb26\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-zvbsm" podUID="b1fd0cfb-c997-44dc-9932-e2477c6dd925"
Mar 12 23:49:40.096519 containerd[1628]: time="2026-03-12T23:49:40.096456492Z" level=error msg="Failed to destroy network for sandbox \"a686b8dd106ed8e47f24b7840623e5af4782884d896d3eeaeb8fe8f97b642b60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.100062 containerd[1628]: time="2026-03-12T23:49:40.100009205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-9qpsq,Uid:19579544-0590-4a13-b454-9d848e5ef1c8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a686b8dd106ed8e47f24b7840623e5af4782884d896d3eeaeb8fe8f97b642b60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.100467 kubelet[2863]: E0312 23:49:40.100391 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a686b8dd106ed8e47f24b7840623e5af4782884d896d3eeaeb8fe8f97b642b60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.100843 kubelet[2863]: E0312 23:49:40.100448 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a686b8dd106ed8e47f24b7840623e5af4782884d896d3eeaeb8fe8f97b642b60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-9qpsq"
Mar 12 23:49:40.100843 kubelet[2863]: E0312 23:49:40.100592 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a686b8dd106ed8e47f24b7840623e5af4782884d896d3eeaeb8fe8f97b642b60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-9qpsq"
Mar 12 23:49:40.100843 kubelet[2863]: E0312 23:49:40.100642 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-9qpsq_calico-system(19579544-0590-4a13-b454-9d848e5ef1c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-9qpsq_calico-system(19579544-0590-4a13-b454-9d848e5ef1c8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a686b8dd106ed8e47f24b7840623e5af4782884d896d3eeaeb8fe8f97b642b60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-9qpsq" podUID="19579544-0590-4a13-b454-9d848e5ef1c8"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.179 [INFO][3974] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.180 [INFO][3974] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" iface="eth0" netns="/var/run/netns/cni-0e3b68a3-6dd0-0536-1050-4f8c985d6c80"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.180 [INFO][3974] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" iface="eth0" netns="/var/run/netns/cni-0e3b68a3-6dd0-0536-1050-4f8c985d6c80"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.180 [INFO][3974] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" iface="eth0" netns="/var/run/netns/cni-0e3b68a3-6dd0-0536-1050-4f8c985d6c80"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.180 [INFO][3974] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.180 [INFO][3974] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.228 [INFO][4022] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" HandleID="k8s-pod-network.193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0"
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.228 [INFO][4022] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 12 23:49:40.249861 containerd[1628]: 2026-03-12 23:49:40.228 [INFO][4022] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 12 23:49:40.250211 containerd[1628]: 2026-03-12 23:49:40.241 [WARNING][4022] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" HandleID="k8s-pod-network.193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0"
Mar 12 23:49:40.250211 containerd[1628]: 2026-03-12 23:49:40.241 [INFO][4022] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" HandleID="k8s-pod-network.193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0"
Mar 12 23:49:40.250211 containerd[1628]: 2026-03-12 23:49:40.243 [INFO][4022] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 12 23:49:40.250211 containerd[1628]: 2026-03-12 23:49:40.245 [INFO][3974] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4"
Mar 12 23:49:40.253807 containerd[1628]: time="2026-03-12T23:49:40.253751941Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-jt869,Uid:377cbf76-95de-4a5f-af4f-7e8550f78380,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.254140 kubelet[2863]: E0312 23:49:40.254090 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.254236 kubelet[2863]: E0312 23:49:40.254221 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-64485f694f-jt869"
Mar 12 23:49:40.254314 kubelet[2863]: E0312 23:49:40.254300 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-64485f694f-jt869"
Mar 12 23:49:40.254440 kubelet[2863]: E0312 23:49:40.254402 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64485f694f-jt869_calico-system(377cbf76-95de-4a5f-af4f-7e8550f78380)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64485f694f-jt869_calico-system(377cbf76-95de-4a5f-af4f-7e8550f78380)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"193836370cd63c7a251fa6f13d18a8318130088c163a1a8b17a2e6e4bfc727e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-64485f694f-jt869" podUID="377cbf76-95de-4a5f-af4f-7e8550f78380"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.160 [INFO][3959] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.160 [INFO][3959] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" iface="eth0" netns="/var/run/netns/cni-db7101e8-d103-785d-30da-92e5dd31cc11"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.162 [INFO][3959] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" iface="eth0" netns="/var/run/netns/cni-db7101e8-d103-785d-30da-92e5dd31cc11"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.163 [INFO][3959] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" iface="eth0" netns="/var/run/netns/cni-db7101e8-d103-785d-30da-92e5dd31cc11"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.163 [INFO][3959] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.163 [INFO][3959] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.228 [INFO][4014] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" HandleID="k8s-pod-network.c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0"
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.229 [INFO][4014] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 12 23:49:40.262020 containerd[1628]: 2026-03-12 23:49:40.243 [INFO][4014] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 12 23:49:40.263516 containerd[1628]: 2026-03-12 23:49:40.257 [WARNING][4014] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" HandleID="k8s-pod-network.c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0"
Mar 12 23:49:40.263516 containerd[1628]: 2026-03-12 23:49:40.257 [INFO][4014] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" HandleID="k8s-pod-network.c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0"
Mar 12 23:49:40.263516 containerd[1628]: 2026-03-12 23:49:40.259 [INFO][4014] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 12 23:49:40.263516 containerd[1628]: 2026-03-12 23:49:40.260 [INFO][3959] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135"
Mar 12 23:49:40.265245 containerd[1628]: time="2026-03-12T23:49:40.265190761Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-pjqf8,Uid:5c52aa28-3fda-4026-9bd2-f6628c2ff87e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.265451 kubelet[2863]: E0312 23:49:40.265413 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.265500 kubelet[2863]: E0312 23:49:40.265462 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-64485f694f-pjqf8"
Mar 12 23:49:40.265500 kubelet[2863]: E0312 23:49:40.265480 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-64485f694f-pjqf8"
Mar 12 23:49:40.265555 kubelet[2863]: E0312 23:49:40.265522 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64485f694f-pjqf8_calico-system(5c52aa28-3fda-4026-9bd2-f6628c2ff87e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64485f694f-pjqf8_calico-system(5c52aa28-3fda-4026-9bd2-f6628c2ff87e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c96d56afad06583de00aeff57b25501a5d8de88f7f1b7f205126bcdb57602135\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-64485f694f-pjqf8" podUID="5c52aa28-3fda-4026-9bd2-f6628c2ff87e"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.188 [INFO][3996] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.188 [INFO][3996] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" iface="eth0" netns="/var/run/netns/cni-99ece552-87f7-7d5a-40e4-33ff66a9a6c3"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.189 [INFO][3996] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" iface="eth0" netns="/var/run/netns/cni-99ece552-87f7-7d5a-40e4-33ff66a9a6c3"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.189 [INFO][3996] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" iface="eth0" netns="/var/run/netns/cni-99ece552-87f7-7d5a-40e4-33ff66a9a6c3"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.189 [INFO][3996] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.189 [INFO][3996] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.231 [INFO][4029] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" HandleID="k8s-pod-network.9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0"
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.231 [INFO][4029] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 12 23:49:40.276369 containerd[1628]: 2026-03-12 23:49:40.259 [INFO][4029] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 12 23:49:40.276605 containerd[1628]: 2026-03-12 23:49:40.270 [WARNING][4029] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" HandleID="k8s-pod-network.9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0"
Mar 12 23:49:40.276605 containerd[1628]: 2026-03-12 23:49:40.271 [INFO][4029] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" HandleID="k8s-pod-network.9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0"
Mar 12 23:49:40.276605 containerd[1628]: 2026-03-12 23:49:40.272 [INFO][4029] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 12 23:49:40.276605 containerd[1628]: 2026-03-12 23:49:40.274 [INFO][3996] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2"
Mar 12 23:49:40.278688 containerd[1628]: time="2026-03-12T23:49:40.278649298Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bf7fd9f7-thg7n,Uid:5bcd708f-2a1c-437f-9c18-28373adbc8f2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.278893 kubelet[2863]: E0312 23:49:40.278862 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.278940 kubelet[2863]: E0312 23:49:40.278908 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86bf7fd9f7-thg7n"
Mar 12 23:49:40.278940 kubelet[2863]: E0312 23:49:40.278930 2863 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86bf7fd9f7-thg7n"
Mar 12 23:49:40.279003 kubelet[2863]: E0312 23:49:40.278978 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86bf7fd9f7-thg7n_calico-system(5bcd708f-2a1c-437f-9c18-28373adbc8f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86bf7fd9f7-thg7n_calico-system(5bcd708f-2a1c-437f-9c18-28373adbc8f2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e8c7b71ccc1ba603cb5943bafe9487221bd5e9e9c75d4da947f02c9d901e9b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86bf7fd9f7-thg7n" podUID="5bcd708f-2a1c-437f-9c18-28373adbc8f2"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.197 [INFO][3948] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.197 [INFO][3948] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" iface="eth0" netns="/var/run/netns/cni-d89448c5-4d20-ebaf-ec22-1418e1c31865"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.197 [INFO][3948] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" iface="eth0" netns="/var/run/netns/cni-d89448c5-4d20-ebaf-ec22-1418e1c31865"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.199 [INFO][3948] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" iface="eth0" netns="/var/run/netns/cni-d89448c5-4d20-ebaf-ec22-1418e1c31865"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.199 [INFO][3948] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.199 [INFO][3948] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.232 [INFO][4035] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" HandleID="k8s-pod-network.1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--85d4f8d679--tlvw9-eth0"
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.232 [INFO][4035] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 12 23:49:40.289655 containerd[1628]: 2026-03-12 23:49:40.273 [INFO][4035] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 12 23:49:40.289877 containerd[1628]: 2026-03-12 23:49:40.283 [WARNING][4035] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" HandleID="k8s-pod-network.1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--85d4f8d679--tlvw9-eth0"
Mar 12 23:49:40.289877 containerd[1628]: 2026-03-12 23:49:40.283 [INFO][4035] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" HandleID="k8s-pod-network.1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--85d4f8d679--tlvw9-eth0"
Mar 12 23:49:40.289877 containerd[1628]: 2026-03-12 23:49:40.285 [INFO][4035] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 12 23:49:40.289877 containerd[1628]: 2026-03-12 23:49:40.287 [INFO][3948] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6"
Mar 12 23:49:40.290309 containerd[1628]: time="2026-03-12T23:49:40.290231518Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85d4f8d679-tlvw9,Uid:4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.290487 kubelet[2863]: E0312 23:49:40.290453 2863 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:49:40.290529 kubelet[2863]: E0312 23:49:40.290507 2863 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a3772be8006fff275ff69e91955c635458b989aaed56be716f8b1086a8b87d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-85d4f8d679-tlvw9"
Mar 12 23:49:40.791458 containerd[1628]: time="2026-03-12T23:49:40.791413055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-jt869,Uid:377cbf76-95de-4a5f-af4f-7e8550f78380,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:40.797341 containerd[1628]: time="2026-03-12T23:49:40.796710805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-pjqf8,Uid:5c52aa28-3fda-4026-9bd2-f6628c2ff87e,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:40.799542 containerd[1628]: time="2026-03-12T23:49:40.799502321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bf7fd9f7-thg7n,Uid:5bcd708f-2a1c-437f-9c18-28373adbc8f2,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:40.817841 kubelet[2863]: I0312 23:49:40.817262 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-k2x2l" podStartSLOduration=2.320343569 podStartE2EDuration="12.81724493s" podCreationTimestamp="2026-03-12 23:49:28 +0000 UTC" firstStartedPulling="2026-03-12 23:49:28.441617045 +0000 UTC m=+17.860576040" lastFinishedPulling="2026-03-12 23:49:38.938518446 +0000 UTC m=+28.357477401" observedRunningTime="2026-03-12 23:49:40.813887696 +0000 UTC m=+30.232846691" watchObservedRunningTime="2026-03-12 23:49:40.81724493 +0000 UTC m=+30.236203925"
Mar 12 23:49:40.872539 kubelet[2863]: I0312 23:49:40.872485 2863 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7xtx\" (UniqueName: \"kubernetes.io/projected/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-kube-api-access-n7xtx\") pod \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") "
Mar 12 23:49:40.872539 kubelet[2863]: I0312 23:49:40.872543 2863 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-ca-bundle\") pod \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") "
Mar 12 23:49:40.872684 kubelet[2863]: I0312 23:49:40.872561 2863 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-backend-key-pair\") pod
\"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") " Mar 12 23:49:40.872684 kubelet[2863]: I0312 23:49:40.872587 2863 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-nginx-config\") pod \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\" (UID: \"4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69\") " Mar 12 23:49:40.873562 kubelet[2863]: I0312 23:49:40.873487 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69" (UID: "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:49:40.876342 kubelet[2863]: I0312 23:49:40.876130 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69" (UID: "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 23:49:40.876426 kubelet[2863]: I0312 23:49:40.876412 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69" (UID: "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:49:40.877499 kubelet[2863]: I0312 23:49:40.877447 2863 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-kube-api-access-n7xtx" (OuterVolumeSpecName: "kube-api-access-n7xtx") pod "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69" (UID: "4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69"). InnerVolumeSpecName "kube-api-access-n7xtx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 23:49:40.958404 systemd[1]: run-netns-cni\x2dd89448c5\x2d4d20\x2debaf\x2dec22\x2d1418e1c31865.mount: Deactivated successfully. Mar 12 23:49:40.958497 systemd[1]: run-netns-cni\x2d99ece552\x2d87f7\x2d7d5a\x2d40e4\x2d33ff66a9a6c3.mount: Deactivated successfully. Mar 12 23:49:40.958543 systemd[1]: run-netns-cni\x2ddb7101e8\x2dd103\x2d785d\x2d30da\x2d92e5dd31cc11.mount: Deactivated successfully. Mar 12 23:49:40.958582 systemd[1]: run-netns-cni\x2da4fd9177\x2d2144\x2d0232\x2d8499\x2d57cdd2fe3b0c.mount: Deactivated successfully. Mar 12 23:49:40.958627 systemd[1]: run-netns-cni\x2d0e3b68a3\x2d6dd0\x2d0536\x2d1050\x2d4f8c985d6c80.mount: Deactivated successfully. Mar 12 23:49:40.958670 systemd[1]: var-lib-kubelet-pods-4b3f69ac\x2d4f6d\x2d445c\x2da6b0\x2d274c4dbc3c69-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dn7xtx.mount: Deactivated successfully. Mar 12 23:49:40.958721 systemd[1]: var-lib-kubelet-pods-4b3f69ac\x2d4f6d\x2d445c\x2da6b0\x2d274c4dbc3c69-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 12 23:49:40.967146 systemd-networkd[1540]: cali26b7cefa7aa: Link UP Mar 12 23:49:40.968461 systemd-networkd[1540]: cali26b7cefa7aa: Gained carrier Mar 12 23:49:40.973310 kubelet[2863]: I0312 23:49:40.973261 2863 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7xtx\" (UniqueName: \"kubernetes.io/projected/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-kube-api-access-n7xtx\") on node \"ci-4459-2-4-n-9e79e0a9ae\" DevicePath \"\"" Mar 12 23:49:40.973310 kubelet[2863]: I0312 23:49:40.973302 2863 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-ca-bundle\") on node \"ci-4459-2-4-n-9e79e0a9ae\" DevicePath \"\"" Mar 12 23:49:40.973310 kubelet[2863]: I0312 23:49:40.973313 2863 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-9e79e0a9ae\" DevicePath \"\"" Mar 12 23:49:40.973467 kubelet[2863]: I0312 23:49:40.973324 2863 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69-nginx-config\") on node \"ci-4459-2-4-n-9e79e0a9ae\" DevicePath \"\"" Mar 12 23:49:40.980836 containerd[1628]: 2026-03-12 23:49:40.834 [ERROR][4074] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 23:49:40.980836 containerd[1628]: 2026-03-12 23:49:40.857 [INFO][4074] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0 calico-apiserver-64485f694f- calico-system 377cbf76-95de-4a5f-af4f-7e8550f78380 843 0 2026-03-12 23:49:26 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64485f694f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae calico-apiserver-64485f694f-jt869 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali26b7cefa7aa [] [] }} ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-" Mar 12 23:49:40.980836 containerd[1628]: 2026-03-12 23:49:40.857 [INFO][4074] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:40.980836 containerd[1628]: 2026-03-12 23:49:40.899 [INFO][4126] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" HandleID="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.915 [INFO][4126] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" HandleID="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400050cf20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-9e79e0a9ae", 
"pod":"calico-apiserver-64485f694f-jt869", "timestamp":"2026-03-12 23:49:40.899647108 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002046e0)} Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.915 [INFO][4126] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.915 [INFO][4126] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.915 [INFO][4126] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.919 [INFO][4126] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.925 [INFO][4126] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.932 [INFO][4126] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.934 [INFO][4126] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981590 containerd[1628]: 2026-03-12 23:49:40.937 [INFO][4126] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.937 [INFO][4126] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 
handle="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.940 [INFO][4126] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37 Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.945 [INFO][4126] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.950 [INFO][4126] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.124.129/26] block=192.168.124.128/26 handle="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.950 [INFO][4126] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.129/26] handle="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.950 [INFO][4126] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 23:49:40.981865 containerd[1628]: 2026-03-12 23:49:40.950 [INFO][4126] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.129/26] IPv6=[] ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" HandleID="k8s-pod-network.766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:40.982045 containerd[1628]: 2026-03-12 23:49:40.952 [INFO][4074] cni-plugin/k8s.go 418: Populated endpoint ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0", GenerateName:"calico-apiserver-64485f694f-", Namespace:"calico-system", SelfLink:"", UID:"377cbf76-95de-4a5f-af4f-7e8550f78380", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64485f694f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"calico-apiserver-64485f694f-jt869", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.124.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali26b7cefa7aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:40.982098 containerd[1628]: 2026-03-12 23:49:40.952 [INFO][4074] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.129/32] ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:40.982098 containerd[1628]: 2026-03-12 23:49:40.952 [INFO][4074] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali26b7cefa7aa ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:40.982098 containerd[1628]: 2026-03-12 23:49:40.968 [INFO][4074] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:40.982153 containerd[1628]: 2026-03-12 23:49:40.968 [INFO][4074] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0", GenerateName:"calico-apiserver-64485f694f-", Namespace:"calico-system", SelfLink:"", UID:"377cbf76-95de-4a5f-af4f-7e8550f78380", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64485f694f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37", Pod:"calico-apiserver-64485f694f-jt869", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali26b7cefa7aa", MAC:"8e:13:91:c0:2a:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:40.982199 containerd[1628]: 2026-03-12 23:49:40.978 [INFO][4074] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" Namespace="calico-system" Pod="calico-apiserver-64485f694f-jt869" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--jt869-eth0" Mar 12 23:49:41.005114 containerd[1628]: time="2026-03-12T23:49:41.005074047Z" level=info 
msg="connecting to shim 766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37" address="unix:///run/containerd/s/ac1a7e2ae0f8132c180b7286fd54a52ca429f368d279816541560b189e56ae47" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:41.031460 systemd[1]: Started cri-containerd-766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37.scope - libcontainer container 766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37. Mar 12 23:49:41.058674 systemd-networkd[1540]: calia25f9569897: Link UP Mar 12 23:49:41.059119 systemd-networkd[1540]: calia25f9569897: Gained carrier Mar 12 23:49:41.077184 containerd[1628]: time="2026-03-12T23:49:41.076887563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-jt869,Uid:377cbf76-95de-4a5f-af4f-7e8550f78380,Namespace:calico-system,Attempt:0,} returns sandbox id \"766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37\"" Mar 12 23:49:41.078380 containerd[1628]: 2026-03-12 23:49:40.844 [ERROR][4097] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 23:49:41.078380 containerd[1628]: 2026-03-12 23:49:40.862 [INFO][4097] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0 calico-kube-controllers-86bf7fd9f7- calico-system 5bcd708f-2a1c-437f-9c18-28373adbc8f2 844 0 2026-03-12 23:49:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86bf7fd9f7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae calico-kube-controllers-86bf7fd9f7-thg7n eth0 calico-kube-controllers [] [] [kns.calico-system 
ksa.calico-system.calico-kube-controllers] calia25f9569897 [] [] }} ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-" Mar 12 23:49:41.078380 containerd[1628]: 2026-03-12 23:49:40.862 [INFO][4097] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.078380 containerd[1628]: 2026-03-12 23:49:40.897 [INFO][4128] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" HandleID="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:40.916 [INFO][4128] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" HandleID="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000416ee0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"calico-kube-controllers-86bf7fd9f7-thg7n", "timestamp":"2026-03-12 23:49:40.897053833 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002a4dc0)} Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:40.916 [INFO][4128] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:40.950 [INFO][4128] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:40.950 [INFO][4128] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:41.020 [INFO][4128] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:41.026 [INFO][4128] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:41.032 [INFO][4128] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:41.035 [INFO][4128] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078553 containerd[1628]: 2026-03-12 23:49:41.038 [INFO][4128] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078748 containerd[1628]: 2026-03-12 23:49:41.038 [INFO][4128] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078748 containerd[1628]: 2026-03-12 23:49:41.040 [INFO][4128] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795 Mar 12 23:49:41.078748 
containerd[1628]: 2026-03-12 23:49:41.045 [INFO][4128] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078748 containerd[1628]: 2026-03-12 23:49:41.052 [INFO][4128] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.124.130/26] block=192.168.124.128/26 handle="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078748 containerd[1628]: 2026-03-12 23:49:41.052 [INFO][4128] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.130/26] handle="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.078748 containerd[1628]: 2026-03-12 23:49:41.052 [INFO][4128] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:49:41.078748 containerd[1628]: 2026-03-12 23:49:41.052 [INFO][4128] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.130/26] IPv6=[] ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" HandleID="k8s-pod-network.d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.078909 containerd[1628]: 2026-03-12 23:49:41.054 [INFO][4097] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0", 
GenerateName:"calico-kube-controllers-86bf7fd9f7-", Namespace:"calico-system", SelfLink:"", UID:"5bcd708f-2a1c-437f-9c18-28373adbc8f2", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86bf7fd9f7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"calico-kube-controllers-86bf7fd9f7-thg7n", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.124.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia25f9569897", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:41.078956 containerd[1628]: 2026-03-12 23:49:41.054 [INFO][4097] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.130/32] ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.078956 containerd[1628]: 2026-03-12 23:49:41.055 [INFO][4097] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia25f9569897 ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" 
Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.078956 containerd[1628]: 2026-03-12 23:49:41.066 [INFO][4097] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.079016 containerd[1628]: 2026-03-12 23:49:41.066 [INFO][4097] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0", GenerateName:"calico-kube-controllers-86bf7fd9f7-", Namespace:"calico-system", SelfLink:"", UID:"5bcd708f-2a1c-437f-9c18-28373adbc8f2", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86bf7fd9f7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, 
Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795", Pod:"calico-kube-controllers-86bf7fd9f7-thg7n", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.124.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia25f9569897", MAC:"3a:bf:96:9c:a4:d8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:41.079062 containerd[1628]: 2026-03-12 23:49:41.076 [INFO][4097] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" Namespace="calico-system" Pod="calico-kube-controllers-86bf7fd9f7-thg7n" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--kube--controllers--86bf7fd9f7--thg7n-eth0" Mar 12 23:49:41.079549 containerd[1628]: time="2026-03-12T23:49:41.079504038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 23:49:41.106973 containerd[1628]: time="2026-03-12T23:49:41.106898911Z" level=info msg="connecting to shim d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795" address="unix:///run/containerd/s/4dc717a38d2b56cd0c26954f4300d86189c24aebf024af3061933109cba7d165" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:41.138618 systemd[1]: Started cri-containerd-d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795.scope - libcontainer container d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795. 
Mar 12 23:49:41.165028 systemd-networkd[1540]: cali62a7353e6ec: Link UP Mar 12 23:49:41.165712 systemd-networkd[1540]: cali62a7353e6ec: Gained carrier Mar 12 23:49:41.181708 containerd[1628]: 2026-03-12 23:49:40.844 [ERROR][4086] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 23:49:41.181708 containerd[1628]: 2026-03-12 23:49:40.864 [INFO][4086] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0 calico-apiserver-64485f694f- calico-system 5c52aa28-3fda-4026-9bd2-f6628c2ff87e 840 0 2026-03-12 23:49:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64485f694f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae calico-apiserver-64485f694f-pjqf8 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali62a7353e6ec [] [] }} ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-" Mar 12 23:49:41.181708 containerd[1628]: 2026-03-12 23:49:40.864 [INFO][4086] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.181708 containerd[1628]: 2026-03-12 23:49:40.902 [INFO][4143] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" HandleID="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:40.918 [INFO][4143] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" HandleID="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003774d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"calico-apiserver-64485f694f-pjqf8", "timestamp":"2026-03-12 23:49:40.902934262 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001eb600)} Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:40.918 [INFO][4143] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.052 [INFO][4143] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.053 [INFO][4143] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.120 [INFO][4143] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.131 [INFO][4143] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.136 [INFO][4143] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.139 [INFO][4143] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.181905 containerd[1628]: 2026-03-12 23:49:41.142 [INFO][4143] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.142 [INFO][4143] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.144 [INFO][4143] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186 Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.150 [INFO][4143] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.158 [INFO][4143] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.124.131/26] block=192.168.124.128/26 handle="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.158 [INFO][4143] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.131/26] handle="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.158 [INFO][4143] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:49:41.182102 containerd[1628]: 2026-03-12 23:49:41.158 [INFO][4143] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.131/26] IPv6=[] ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" HandleID="k8s-pod-network.15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.182225 containerd[1628]: 2026-03-12 23:49:41.160 [INFO][4086] cni-plugin/k8s.go 418: Populated endpoint ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0", GenerateName:"calico-apiserver-64485f694f-", Namespace:"calico-system", SelfLink:"", UID:"5c52aa28-3fda-4026-9bd2-f6628c2ff87e", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64485f694f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"calico-apiserver-64485f694f-pjqf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali62a7353e6ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:41.182281 containerd[1628]: 2026-03-12 23:49:41.160 [INFO][4086] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.131/32] ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.182281 containerd[1628]: 2026-03-12 23:49:41.160 [INFO][4086] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali62a7353e6ec ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.182281 containerd[1628]: 2026-03-12 23:49:41.165 [INFO][4086] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" 
Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.182385 containerd[1628]: 2026-03-12 23:49:41.167 [INFO][4086] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0", GenerateName:"calico-apiserver-64485f694f-", Namespace:"calico-system", SelfLink:"", UID:"5c52aa28-3fda-4026-9bd2-f6628c2ff87e", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64485f694f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186", Pod:"calico-apiserver-64485f694f-pjqf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.124.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali62a7353e6ec", 
MAC:"da:fd:37:d4:67:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:41.182438 containerd[1628]: 2026-03-12 23:49:41.179 [INFO][4086] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" Namespace="calico-system" Pod="calico-apiserver-64485f694f-pjqf8" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-calico--apiserver--64485f694f--pjqf8-eth0" Mar 12 23:49:41.185967 containerd[1628]: time="2026-03-12T23:49:41.185912255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bf7fd9f7-thg7n,Uid:5bcd708f-2a1c-437f-9c18-28373adbc8f2,Namespace:calico-system,Attempt:0,} returns sandbox id \"d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795\"" Mar 12 23:49:41.210336 containerd[1628]: time="2026-03-12T23:49:41.210079773Z" level=info msg="connecting to shim 15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186" address="unix:///run/containerd/s/e1a2a08001801e33c84e1da81ec4702414617f4d612be17bc3209e160d7fe6ec" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:41.231459 systemd[1]: Started cri-containerd-15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186.scope - libcontainer container 15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186. Mar 12 23:49:41.261917 containerd[1628]: time="2026-03-12T23:49:41.261879724Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64485f694f-pjqf8,Uid:5c52aa28-3fda-4026-9bd2-f6628c2ff87e,Namespace:calico-system,Attempt:0,} returns sandbox id \"15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186\"" Mar 12 23:49:41.798133 systemd[1]: Removed slice kubepods-besteffort-pod4b3f69ac_4f6d_445c_a6b0_274c4dbc3c69.slice - libcontainer container kubepods-besteffort-pod4b3f69ac_4f6d_445c_a6b0_274c4dbc3c69.slice. 
Mar 12 23:49:41.879685 systemd[1]: Created slice kubepods-besteffort-poda16cf0c8_a9bd_4185_8f98_ee35fab4b6f1.slice - libcontainer container kubepods-besteffort-poda16cf0c8_a9bd_4185_8f98_ee35fab4b6f1.slice. Mar 12 23:49:41.980490 kubelet[2863]: I0312 23:49:41.980359 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x755d\" (UniqueName: \"kubernetes.io/projected/a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1-kube-api-access-x755d\") pod \"whisker-d5fb6f4dd-b6lv7\" (UID: \"a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1\") " pod="calico-system/whisker-d5fb6f4dd-b6lv7" Mar 12 23:49:41.980490 kubelet[2863]: I0312 23:49:41.980403 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1-nginx-config\") pod \"whisker-d5fb6f4dd-b6lv7\" (UID: \"a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1\") " pod="calico-system/whisker-d5fb6f4dd-b6lv7" Mar 12 23:49:41.980490 kubelet[2863]: I0312 23:49:41.980444 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1-whisker-ca-bundle\") pod \"whisker-d5fb6f4dd-b6lv7\" (UID: \"a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1\") " pod="calico-system/whisker-d5fb6f4dd-b6lv7" Mar 12 23:49:41.980852 kubelet[2863]: I0312 23:49:41.980553 2863 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1-whisker-backend-key-pair\") pod \"whisker-d5fb6f4dd-b6lv7\" (UID: \"a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1\") " pod="calico-system/whisker-d5fb6f4dd-b6lv7" Mar 12 23:49:42.106238 systemd-networkd[1540]: vxlan.calico: Link UP Mar 12 23:49:42.106250 systemd-networkd[1540]: vxlan.calico: Gained carrier Mar 12 
23:49:42.189311 containerd[1628]: time="2026-03-12T23:49:42.188521848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d5fb6f4dd-b6lv7,Uid:a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1,Namespace:calico-system,Attempt:0,}" Mar 12 23:49:42.317618 systemd-networkd[1540]: caliaa28ee8d46d: Link UP Mar 12 23:49:42.318144 systemd-networkd[1540]: caliaa28ee8d46d: Gained carrier Mar 12 23:49:42.336082 containerd[1628]: 2026-03-12 23:49:42.227 [INFO][4516] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0 whisker-d5fb6f4dd- calico-system a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1 890 0 2026-03-12 23:49:41 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:d5fb6f4dd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae whisker-d5fb6f4dd-b6lv7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliaa28ee8d46d [] [] }} ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-" Mar 12 23:49:42.336082 containerd[1628]: 2026-03-12 23:49:42.227 [INFO][4516] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.336082 containerd[1628]: 2026-03-12 23:49:42.252 [INFO][4532] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" HandleID="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" 
Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.262 [INFO][4532] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" HandleID="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004dbb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"whisker-d5fb6f4dd-b6lv7", "timestamp":"2026-03-12 23:49:42.252159738 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003dedc0)} Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.262 [INFO][4532] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.262 [INFO][4532] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.262 [INFO][4532] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.265 [INFO][4532] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.273 [INFO][4532] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.281 [INFO][4532] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.285 [INFO][4532] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336272 containerd[1628]: 2026-03-12 23:49:42.289 [INFO][4532] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.289 [INFO][4532] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.293 [INFO][4532] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.299 [INFO][4532] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.308 [INFO][4532] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.124.132/26] block=192.168.124.128/26 handle="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.308 [INFO][4532] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.132/26] handle="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.308 [INFO][4532] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:49:42.336485 containerd[1628]: 2026-03-12 23:49:42.308 [INFO][4532] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.132/26] IPv6=[] ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" HandleID="k8s-pod-network.675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.337941 containerd[1628]: 2026-03-12 23:49:42.312 [INFO][4516] cni-plugin/k8s.go 418: Populated endpoint ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0", GenerateName:"whisker-d5fb6f4dd-", Namespace:"calico-system", SelfLink:"", UID:"a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"d5fb6f4dd", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"whisker-d5fb6f4dd-b6lv7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.124.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaa28ee8d46d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:42.337941 containerd[1628]: 2026-03-12 23:49:42.312 [INFO][4516] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.132/32] ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.338088 containerd[1628]: 2026-03-12 23:49:42.312 [INFO][4516] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaa28ee8d46d ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.338088 containerd[1628]: 2026-03-12 23:49:42.318 [INFO][4516] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.338137 containerd[1628]: 2026-03-12 23:49:42.318 [INFO][4516] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0", GenerateName:"whisker-d5fb6f4dd-", Namespace:"calico-system", SelfLink:"", UID:"a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"d5fb6f4dd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c", Pod:"whisker-d5fb6f4dd-b6lv7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.124.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaa28ee8d46d", MAC:"06:cb:bf:0b:f5:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:42.338188 containerd[1628]: 2026-03-12 23:49:42.331 [INFO][4516] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" Namespace="calico-system" Pod="whisker-d5fb6f4dd-b6lv7" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-whisker--d5fb6f4dd--b6lv7-eth0" Mar 12 23:49:42.368478 containerd[1628]: time="2026-03-12T23:49:42.367375060Z" level=info msg="connecting to shim 675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c" address="unix:///run/containerd/s/56156ef753da6fe65ae010bbdd8ad578cfa121fd4f365450200e7647b1fa1aa9" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:42.392486 systemd[1]: Started cri-containerd-675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c.scope - libcontainer container 675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c. Mar 12 23:49:42.439505 systemd-networkd[1540]: cali26b7cefa7aa: Gained IPv6LL Mar 12 23:49:42.440291 containerd[1628]: time="2026-03-12T23:49:42.440256534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d5fb6f4dd-b6lv7,Uid:a16cf0c8-a9bd-4185-8f98-ee35fab4b6f1,Namespace:calico-system,Attempt:0,} returns sandbox id \"675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c\"" Mar 12 23:49:42.674378 kubelet[2863]: I0312 23:49:42.674273 2863 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69" path="/var/lib/kubelet/pods/4b3f69ac-4f6d-445c-a6b0-274c4dbc3c69/volumes" Mar 12 23:49:42.759490 systemd-networkd[1540]: calia25f9569897: Gained IPv6LL Mar 12 23:49:42.980504 containerd[1628]: time="2026-03-12T23:49:42.980256804Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:42.981676 containerd[1628]: time="2026-03-12T23:49:42.981639842Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 12 23:49:42.983233 containerd[1628]: time="2026-03-12T23:49:42.983188879Z" 
level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:42.986520 containerd[1628]: time="2026-03-12T23:49:42.986472554Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:42.987266 containerd[1628]: time="2026-03-12T23:49:42.987225032Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 1.907668034s"
Mar 12 23:49:42.987266 containerd[1628]: time="2026-03-12T23:49:42.987258272Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 12 23:49:42.988819 containerd[1628]: time="2026-03-12T23:49:42.988789750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\""
Mar 12 23:49:42.992716 containerd[1628]: time="2026-03-12T23:49:42.992682943Z" level=info msg="CreateContainer within sandbox \"766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 12 23:49:43.006030 containerd[1628]: time="2026-03-12T23:49:43.005687040Z" level=info msg="Container a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:43.021062 containerd[1628]: time="2026-03-12T23:49:43.021024414Z" level=info msg="CreateContainer within sandbox \"766ec734d5120e3cb9ff5e1a818590d25fd7b11eb7beff19509a902d5350dc37\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a\""
Mar 12 23:49:43.021989 containerd[1628]: time="2026-03-12T23:49:43.021817173Z" level=info msg="StartContainer for \"a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a\""
Mar 12 23:49:43.022867 containerd[1628]: time="2026-03-12T23:49:43.022832251Z" level=info msg="connecting to shim a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a" address="unix:///run/containerd/s/ac1a7e2ae0f8132c180b7286fd54a52ca429f368d279816541560b189e56ae47" protocol=ttrpc version=3
Mar 12 23:49:43.041462 systemd[1]: Started cri-containerd-a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a.scope - libcontainer container a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a.
Mar 12 23:49:43.076676 containerd[1628]: time="2026-03-12T23:49:43.076641438Z" level=info msg="StartContainer for \"a008c410fd8c9fbf2e5140bcec77759185efc6106ef50d24f58ad8b310137a2a\" returns successfully"
Mar 12 23:49:43.207514 systemd-networkd[1540]: cali62a7353e6ec: Gained IPv6LL
Mar 12 23:49:43.399552 systemd-networkd[1540]: vxlan.calico: Gained IPv6LL
Mar 12 23:49:44.167593 systemd-networkd[1540]: caliaa28ee8d46d: Gained IPv6LL
Mar 12 23:49:44.800478 kubelet[2863]: I0312 23:49:44.800446 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:49:44.890419 containerd[1628]: time="2026-03-12T23:49:44.890360154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:44.891791 containerd[1628]: time="2026-03-12T23:49:44.891616592Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955"
Mar 12 23:49:44.893273 containerd[1628]: time="2026-03-12T23:49:44.893238949Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:44.898209 containerd[1628]: time="2026-03-12T23:49:44.898168061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:44.898992 containerd[1628]: time="2026-03-12T23:49:44.898965419Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 1.910146629s"
Mar 12 23:49:44.899032 containerd[1628]: time="2026-03-12T23:49:44.898993939Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\""
Mar 12 23:49:44.905204 containerd[1628]: time="2026-03-12T23:49:44.905174649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 12 23:49:44.915902 containerd[1628]: time="2026-03-12T23:49:44.915861550Z" level=info msg="CreateContainer within sandbox \"d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 12 23:49:44.927756 containerd[1628]: time="2026-03-12T23:49:44.927699610Z" level=info msg="Container d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:44.930404 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3080063773.mount: Deactivated successfully.
Mar 12 23:49:44.939025 containerd[1628]: time="2026-03-12T23:49:44.938966630Z" level=info msg="CreateContainer within sandbox \"d3a2f9ce019525b583d50736f78bde63125be9fbf5407dbcb8c1a9c7cc9cb795\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426\""
Mar 12 23:49:44.939473 containerd[1628]: time="2026-03-12T23:49:44.939449229Z" level=info msg="StartContainer for \"d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426\""
Mar 12 23:49:44.940518 containerd[1628]: time="2026-03-12T23:49:44.940493388Z" level=info msg="connecting to shim d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426" address="unix:///run/containerd/s/4dc717a38d2b56cd0c26954f4300d86189c24aebf024af3061933109cba7d165" protocol=ttrpc version=3
Mar 12 23:49:44.962462 systemd[1]: Started cri-containerd-d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426.scope - libcontainer container d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426.
Mar 12 23:49:44.999123 containerd[1628]: time="2026-03-12T23:49:44.999084847Z" level=info msg="StartContainer for \"d519d9bb7953eba64fd70ef85fd383cb5e52ca8f3153ebe1764b0689aec7a426\" returns successfully"
Mar 12 23:49:45.278151 containerd[1628]: time="2026-03-12T23:49:45.278094846Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:45.279341 containerd[1628]: time="2026-03-12T23:49:45.279312004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 12 23:49:45.281462 containerd[1628]: time="2026-03-12T23:49:45.281349361Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 376.137793ms"
Mar 12 23:49:45.281462 containerd[1628]: time="2026-03-12T23:49:45.281381201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 12 23:49:45.282444 containerd[1628]: time="2026-03-12T23:49:45.282379679Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\""
Mar 12 23:49:45.286152 containerd[1628]: time="2026-03-12T23:49:45.286121312Z" level=info msg="CreateContainer within sandbox \"15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 12 23:49:45.295580 containerd[1628]: time="2026-03-12T23:49:45.295515936Z" level=info msg="Container 860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:45.308758 containerd[1628]: time="2026-03-12T23:49:45.308721793Z" level=info msg="CreateContainer within sandbox \"15d8c301580633e574ee9be4ec0211d844117b6f856a05fb6078cf65f78e3186\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1\""
Mar 12 23:49:45.309414 containerd[1628]: time="2026-03-12T23:49:45.309393672Z" level=info msg="StartContainer for \"860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1\""
Mar 12 23:49:45.310900 containerd[1628]: time="2026-03-12T23:49:45.310800190Z" level=info msg="connecting to shim 860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1" address="unix:///run/containerd/s/e1a2a08001801e33c84e1da81ec4702414617f4d612be17bc3209e160d7fe6ec" protocol=ttrpc version=3
Mar 12 23:49:45.338621 systemd[1]: Started cri-containerd-860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1.scope - libcontainer container 860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1.
Mar 12 23:49:45.374880 containerd[1628]: time="2026-03-12T23:49:45.374734560Z" level=info msg="StartContainer for \"860018feea1cbb3422e22e8c09daf200668713ba5b6088a124a978f888bfd9f1\" returns successfully"
Mar 12 23:49:45.819333 kubelet[2863]: I0312 23:49:45.817083 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-64485f694f-jt869" podStartSLOduration=17.908027727 podStartE2EDuration="19.817070118s" podCreationTimestamp="2026-03-12 23:49:26 +0000 UTC" firstStartedPulling="2026-03-12 23:49:41.079198439 +0000 UTC m=+30.498157434" lastFinishedPulling="2026-03-12 23:49:42.98824083 +0000 UTC m=+32.407199825" observedRunningTime="2026-03-12 23:49:43.810720414 +0000 UTC m=+33.229679409" watchObservedRunningTime="2026-03-12 23:49:45.817070118 +0000 UTC m=+35.236029113"
Mar 12 23:49:45.819333 kubelet[2863]: I0312 23:49:45.817769 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-64485f694f-pjqf8" podStartSLOduration=15.79894088 podStartE2EDuration="19.817760157s" podCreationTimestamp="2026-03-12 23:49:26 +0000 UTC" firstStartedPulling="2026-03-12 23:49:41.263383642 +0000 UTC m=+30.682342597" lastFinishedPulling="2026-03-12 23:49:45.282202879 +0000 UTC m=+34.701161874" observedRunningTime="2026-03-12 23:49:45.816763438 +0000 UTC m=+35.235722393" watchObservedRunningTime="2026-03-12 23:49:45.817760157 +0000 UTC m=+35.236719112"
Mar 12 23:49:45.831663 kubelet[2863]: I0312 23:49:45.831586 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-86bf7fd9f7-thg7n" podStartSLOduration=14.114242896 podStartE2EDuration="17.831569973s" podCreationTimestamp="2026-03-12 23:49:28 +0000 UTC" firstStartedPulling="2026-03-12 23:49:41.187221893 +0000 UTC m=+30.606180888" lastFinishedPulling="2026-03-12 23:49:44.90454897 +0000 UTC m=+34.323507965" observedRunningTime="2026-03-12 23:49:45.829972896 +0000 UTC m=+35.248931891" watchObservedRunningTime="2026-03-12 23:49:45.831569973 +0000 UTC m=+35.250528928"
Mar 12 23:49:46.615133 containerd[1628]: time="2026-03-12T23:49:46.614761424Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:46.616345 containerd[1628]: time="2026-03-12T23:49:46.616315221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804"
Mar 12 23:49:46.618414 containerd[1628]: time="2026-03-12T23:49:46.618351578Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:46.622991 containerd[1628]: time="2026-03-12T23:49:46.622557850Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:46.623445 containerd[1628]: time="2026-03-12T23:49:46.623415449Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.34100157s"
Mar 12 23:49:46.623530 containerd[1628]: time="2026-03-12T23:49:46.623508409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\""
Mar 12 23:49:46.630100 containerd[1628]: time="2026-03-12T23:49:46.630067677Z" level=info msg="CreateContainer within sandbox \"675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Mar 12 23:49:46.643060 containerd[1628]: time="2026-03-12T23:49:46.643026495Z" level=info msg="Container 3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:46.655870 containerd[1628]: time="2026-03-12T23:49:46.655822153Z" level=info msg="CreateContainer within sandbox \"675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132\""
Mar 12 23:49:46.656523 containerd[1628]: time="2026-03-12T23:49:46.656477752Z" level=info msg="StartContainer for \"3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132\""
Mar 12 23:49:46.657795 containerd[1628]: time="2026-03-12T23:49:46.657766350Z" level=info msg="connecting to shim 3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132" address="unix:///run/containerd/s/56156ef753da6fe65ae010bbdd8ad578cfa121fd4f365450200e7647b1fa1aa9" protocol=ttrpc version=3
Mar 12 23:49:46.680472 systemd[1]: Started cri-containerd-3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132.scope - libcontainer container 3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132.
Mar 12 23:49:46.717599 containerd[1628]: time="2026-03-12T23:49:46.717559087Z" level=info msg="StartContainer for \"3a52bc744b124ec0be51e389efb4e03c685353ac73f1304a1e87f45ea75a8132\" returns successfully"
Mar 12 23:49:46.718948 containerd[1628]: time="2026-03-12T23:49:46.718925124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\""
Mar 12 23:49:46.811131 kubelet[2863]: I0312 23:49:46.811090 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:49:46.811277 kubelet[2863]: I0312 23:49:46.811250 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:49:48.146759 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1390033204.mount: Deactivated successfully.
Mar 12 23:49:48.175687 containerd[1628]: time="2026-03-12T23:49:48.175615295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:48.177154 containerd[1628]: time="2026-03-12T23:49:48.177120653Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594"
Mar 12 23:49:48.178909 containerd[1628]: time="2026-03-12T23:49:48.178872370Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:48.181448 containerd[1628]: time="2026-03-12T23:49:48.181410085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:49:48.182141 containerd[1628]: time="2026-03-12T23:49:48.182092004Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 1.46313608s"
Mar 12 23:49:48.182181 containerd[1628]: time="2026-03-12T23:49:48.182125284Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\""
Mar 12 23:49:48.186554 containerd[1628]: time="2026-03-12T23:49:48.186526996Z" level=info msg="CreateContainer within sandbox \"675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Mar 12 23:49:48.197368 containerd[1628]: time="2026-03-12T23:49:48.196398899Z" level=info msg="Container 08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:48.207263 containerd[1628]: time="2026-03-12T23:49:48.207210361Z" level=info msg="CreateContainer within sandbox \"675cc22bd0ca5330a99b8ab721aee5475f35787dd3fb71a20580add9971c310c\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43\""
Mar 12 23:49:48.207880 containerd[1628]: time="2026-03-12T23:49:48.207854800Z" level=info msg="StartContainer for \"08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43\""
Mar 12 23:49:48.209108 containerd[1628]: time="2026-03-12T23:49:48.209042518Z" level=info msg="connecting to shim 08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43" address="unix:///run/containerd/s/56156ef753da6fe65ae010bbdd8ad578cfa121fd4f365450200e7647b1fa1aa9" protocol=ttrpc version=3
Mar 12 23:49:48.230491 systemd[1]: Started cri-containerd-08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43.scope - libcontainer container
08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43.
Mar 12 23:49:48.266647 containerd[1628]: time="2026-03-12T23:49:48.266521739Z" level=info msg="StartContainer for \"08e94e71ada624a692ef2730a9ca6feed68d32317b445097f1e140669881cc43\" returns successfully"
Mar 12 23:49:48.836172 kubelet[2863]: I0312 23:49:48.835609 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-d5fb6f4dd-b6lv7" podStartSLOduration=2.094434007 podStartE2EDuration="7.835583358s" podCreationTimestamp="2026-03-12 23:49:41 +0000 UTC" firstStartedPulling="2026-03-12 23:49:42.441665452 +0000 UTC m=+31.860624447" lastFinishedPulling="2026-03-12 23:49:48.182814803 +0000 UTC m=+37.601773798" observedRunningTime="2026-03-12 23:49:48.833683442 +0000 UTC m=+38.252642437" watchObservedRunningTime="2026-03-12 23:49:48.835583358 +0000 UTC m=+38.254542353"
Mar 12 23:49:50.679288 containerd[1628]: time="2026-03-12T23:49:50.679253423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6lcvm,Uid:8aa4ec60-e1a1-4108-912f-bfa1021f9e1a,Namespace:kube-system,Attempt:0,}"
Mar 12 23:49:50.786280 systemd-networkd[1540]: cali1a2a1c1c41e: Link UP
Mar 12 23:49:50.788777 systemd-networkd[1540]: cali1a2a1c1c41e: Gained carrier
Mar 12 23:49:50.805225 containerd[1628]: 2026-03-12 23:49:50.715 [INFO][4932] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0 coredns-66bc5c9577- kube-system 8aa4ec60-e1a1-4108-912f-bfa1021f9e1a 813 0 2026-03-12 23:49:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae coredns-66bc5c9577-6lcvm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1a2a1c1c41e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-"
Mar 12 23:49:50.805225 containerd[1628]: 2026-03-12 23:49:50.716 [INFO][4932] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.805225 containerd[1628]: 2026-03-12 23:49:50.740 [INFO][4946] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" HandleID="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.749 [INFO][4946] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" HandleID="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400050eec0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"coredns-66bc5c9577-6lcvm", "timestamp":"2026-03-12 23:49:50.740848916 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002f8160)}
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12
23:49:50.750 [INFO][4946] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.750 [INFO][4946] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.750 [INFO][4946] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae'
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.753 [INFO][4946] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.759 [INFO][4946] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.764 [INFO][4946] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.766 [INFO][4946] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805604 containerd[1628]: 2026-03-12 23:49:50.769 [INFO][4946] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.769 [INFO][4946] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.771 [INFO][4946] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.775 [INFO][4946] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.782 [INFO][4946] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.124.133/26] block=192.168.124.128/26 handle="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.782 [INFO][4946] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.133/26] handle="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" host="ci-4459-2-4-n-9e79e0a9ae"
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.782 [INFO][4946] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 12 23:49:50.805795 containerd[1628]: 2026-03-12 23:49:50.782 [INFO][4946] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.133/26] IPv6=[] ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" HandleID="k8s-pod-network.cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.805928 containerd[1628]: 2026-03-12 23:49:50.784 [INFO][4932] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8aa4ec60-e1a1-4108-912f-bfa1021f9e1a", ResourceVersion:"813", Generation:0,
CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"coredns-66bc5c9577-6lcvm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1a2a1c1c41e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 12 23:49:50.805928 containerd[1628]: 2026-03-12 23:49:50.784 [INFO][4932] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.133/32] ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system"
Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.805928 containerd[1628]: 2026-03-12 23:49:50.784 [INFO][4932] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a2a1c1c41e ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.805928 containerd[1628]: 2026-03-12 23:49:50.791 [INFO][4932] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.805928 containerd[1628]: 2026-03-12 23:49:50.791 [INFO][4932] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8aa4ec60-e1a1-4108-912f-bfa1021f9e1a", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113", Pod:"coredns-66bc5c9577-6lcvm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1a2a1c1c41e", MAC:"e6:d4:a0:be:bf:4d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 12 23:49:50.806095 containerd[1628]: 2026-03-12 23:49:50.802 [INFO][4932] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113" Namespace="kube-system" Pod="coredns-66bc5c9577-6lcvm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--6lcvm-eth0"
Mar 12 23:49:50.835084 containerd[1628]: time="2026-03-12T23:49:50.835026634Z" level=info msg="connecting to shim cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113"
address="unix:///run/containerd/s/3dae442fff1bc640dde1cb0eb9988eef4d9270df921f899769ad0f2c8a7da570" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:49:50.861472 systemd[1]: Started cri-containerd-cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113.scope - libcontainer container cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113.
Mar 12 23:49:50.893246 containerd[1628]: time="2026-03-12T23:49:50.893209134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6lcvm,Uid:8aa4ec60-e1a1-4108-912f-bfa1021f9e1a,Namespace:kube-system,Attempt:0,} returns sandbox id \"cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113\""
Mar 12 23:49:50.898590 containerd[1628]: time="2026-03-12T23:49:50.898563725Z" level=info msg="CreateContainer within sandbox \"cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 12 23:49:50.912132 containerd[1628]: time="2026-03-12T23:49:50.912087181Z" level=info msg="Container 37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:49:50.923083 containerd[1628]: time="2026-03-12T23:49:50.923035603Z" level=info msg="CreateContainer within sandbox \"cda8f236c67e3648c07026b06c1aa5e03c2ce7256a4d8654af1ecabec9837113\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd\""
Mar 12 23:49:50.924344 containerd[1628]: time="2026-03-12T23:49:50.924286640Z" level=info msg="StartContainer for \"37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd\""
Mar 12 23:49:50.926178 containerd[1628]: time="2026-03-12T23:49:50.925954998Z" level=info msg="connecting to shim 37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd" address="unix:///run/containerd/s/3dae442fff1bc640dde1cb0eb9988eef4d9270df921f899769ad0f2c8a7da570" protocol=ttrpc version=3
Mar 12 23:49:50.947644 systemd[1]: Started cri-containerd-37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd.scope - libcontainer container 37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd.
Mar 12 23:49:50.973183 containerd[1628]: time="2026-03-12T23:49:50.973131076Z" level=info msg="StartContainer for \"37fa994ccfe49ffecb2e89ebaa4dd5ebc00e33f79fd834ea62761e77097e8fcd\" returns successfully"
Mar 12 23:49:51.676072 containerd[1628]: time="2026-03-12T23:49:51.675980626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9szck,Uid:a240f634-9da5-444e-bd18-80be2ab75c23,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:51.678442 containerd[1628]: time="2026-03-12T23:49:51.678348622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-9qpsq,Uid:19579544-0590-4a13-b454-9d848e5ef1c8,Namespace:calico-system,Attempt:0,}"
Mar 12 23:49:51.806060 systemd-networkd[1540]: calia959fac691e: Link UP
Mar 12 23:49:51.806775 systemd-networkd[1540]: calia959fac691e: Gained carrier
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.726 [INFO][5060] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0 csi-node-driver- calico-system a240f634-9da5-444e-bd18-80be2ab75c23 702 0 2026-03-12 23:49:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae csi-node-driver-9szck eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia959fac691e [] [] }} ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-"
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.726 [INFO][5060] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0"
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.750 [INFO][5089] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" HandleID="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0"
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.764 [INFO][5089] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" HandleID="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f0b50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"csi-node-driver-9szck", "timestamp":"2026-03-12 23:49:51.750457057 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003d2c60)}
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.764 [INFO][5089] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.764 [INFO][5089] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.764 [INFO][5089] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.767 [INFO][5089] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.772 [INFO][5089] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.778 [INFO][5089] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.781 [INFO][5089] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.786 [INFO][5089] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.786 [INFO][5089] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.788 [INFO][5089] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.793 [INFO][5089] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.800 [INFO][5089] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.124.134/26] block=192.168.124.128/26 handle="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.800 [INFO][5089] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.134/26] handle="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.800 [INFO][5089] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:49:51.821133 containerd[1628]: 2026-03-12 23:49:51.800 [INFO][5089] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.134/26] IPv6=[] ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" HandleID="k8s-pod-network.01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" Mar 12 23:49:51.822243 containerd[1628]: 2026-03-12 23:49:51.802 [INFO][5060] cni-plugin/k8s.go 418: Populated endpoint ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a240f634-9da5-444e-bd18-80be2ab75c23", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"csi-node-driver-9szck", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.124.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia959fac691e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:51.822243 containerd[1628]: 2026-03-12 23:49:51.802 [INFO][5060] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.134/32] ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" Mar 12 23:49:51.822243 containerd[1628]: 2026-03-12 23:49:51.802 [INFO][5060] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia959fac691e ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" Mar 12 23:49:51.822243 containerd[1628]: 2026-03-12 23:49:51.807 [INFO][5060] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" Mar 12 23:49:51.822243 
containerd[1628]: 2026-03-12 23:49:51.807 [INFO][5060] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a240f634-9da5-444e-bd18-80be2ab75c23", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de", Pod:"csi-node-driver-9szck", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.124.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia959fac691e", MAC:"12:99:b5:85:41:80", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:51.822243 containerd[1628]: 
2026-03-12 23:49:51.819 [INFO][5060] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" Namespace="calico-system" Pod="csi-node-driver-9szck" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-csi--node--driver--9szck-eth0" Mar 12 23:49:51.860241 containerd[1628]: time="2026-03-12T23:49:51.860151068Z" level=info msg="connecting to shim 01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de" address="unix:///run/containerd/s/ec1c6cc74d4ffde03de201d3fdf608b477a0f783412a71aac92ee4dfe059053c" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:51.866312 kubelet[2863]: I0312 23:49:51.864951 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-6lcvm" podStartSLOduration=35.86493486 podStartE2EDuration="35.86493486s" podCreationTimestamp="2026-03-12 23:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:51.84744409 +0000 UTC m=+41.266403125" watchObservedRunningTime="2026-03-12 23:49:51.86493486 +0000 UTC m=+41.283893855" Mar 12 23:49:51.890592 systemd[1]: Started cri-containerd-01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de.scope - libcontainer container 01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de. 
Mar 12 23:49:51.924728 containerd[1628]: time="2026-03-12T23:49:51.924621117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9szck,Uid:a240f634-9da5-444e-bd18-80be2ab75c23,Namespace:calico-system,Attempt:0,} returns sandbox id \"01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de\"" Mar 12 23:49:51.927881 systemd-networkd[1540]: calic0ed913ca55: Link UP Mar 12 23:49:51.928497 systemd-networkd[1540]: calic0ed913ca55: Gained carrier Mar 12 23:49:51.929871 containerd[1628]: time="2026-03-12T23:49:51.929412109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.730 [INFO][5067] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0 goldmane-cccfbd5cf- calico-system 19579544-0590-4a13-b454-9d848e5ef1c8 816 0 2026-03-12 23:49:26 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae goldmane-cccfbd5cf-9qpsq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic0ed913ca55 [] [] }} ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.730 [INFO][5067] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.752 [INFO][5095] 
ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" HandleID="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.765 [INFO][5095] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" HandleID="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000365dd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"goldmane-cccfbd5cf-9qpsq", "timestamp":"2026-03-12 23:49:51.752670934 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400018d080)} Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.765 [INFO][5095] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.800 [INFO][5095] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.800 [INFO][5095] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.870 [INFO][5095] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.880 [INFO][5095] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.889 [INFO][5095] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.892 [INFO][5095] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.899 [INFO][5095] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.899 [INFO][5095] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.904 [INFO][5095] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.910 [INFO][5095] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 handle="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.918 [INFO][5095] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.124.135/26] block=192.168.124.128/26 handle="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.918 [INFO][5095] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.135/26] handle="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.918 [INFO][5095] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:49:51.942611 containerd[1628]: 2026-03-12 23:49:51.918 [INFO][5095] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.135/26] IPv6=[] ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" HandleID="k8s-pod-network.944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.943250 containerd[1628]: 2026-03-12 23:49:51.921 [INFO][5067] cni-plugin/k8s.go 418: Populated endpoint ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"19579544-0590-4a13-b454-9d848e5ef1c8", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"goldmane-cccfbd5cf-9qpsq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.124.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic0ed913ca55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:51.943250 containerd[1628]: 2026-03-12 23:49:51.921 [INFO][5067] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.135/32] ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.943250 containerd[1628]: 2026-03-12 23:49:51.922 [INFO][5067] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic0ed913ca55 ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.943250 containerd[1628]: 2026-03-12 23:49:51.929 [INFO][5067] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.943250 containerd[1628]: 2026-03-12 23:49:51.929 [INFO][5067] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"19579544-0590-4a13-b454-9d848e5ef1c8", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd", Pod:"goldmane-cccfbd5cf-9qpsq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.124.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic0ed913ca55", MAC:"b6:b2:33:a3:36:54", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:51.943250 containerd[1628]: 2026-03-12 23:49:51.939 [INFO][5067] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" Namespace="calico-system" Pod="goldmane-cccfbd5cf-9qpsq" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-goldmane--cccfbd5cf--9qpsq-eth0" Mar 12 23:49:51.971698 containerd[1628]: time="2026-03-12T23:49:51.971644916Z" level=info msg="connecting to shim 944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd" address="unix:///run/containerd/s/36f0aa891ec9df76d6b34bcf53e2f872cf5039ca8e75542b9e96e78363e22dce" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:51.994699 systemd[1]: Started cri-containerd-944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd.scope - libcontainer container 944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd. Mar 12 23:49:52.029471 containerd[1628]: time="2026-03-12T23:49:52.029410857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-9qpsq,Uid:19579544-0590-4a13-b454-9d848e5ef1c8,Namespace:calico-system,Attempt:0,} returns sandbox id \"944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd\"" Mar 12 23:49:52.487562 systemd-networkd[1540]: cali1a2a1c1c41e: Gained IPv6LL Mar 12 23:49:53.271721 containerd[1628]: time="2026-03-12T23:49:53.271649077Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:53.273205 containerd[1628]: time="2026-03-12T23:49:53.273156954Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 12 23:49:53.274933 containerd[1628]: time="2026-03-12T23:49:53.274865511Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:53.277506 containerd[1628]: time="2026-03-12T23:49:53.277438067Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:53.278821 containerd[1628]: time="2026-03-12T23:49:53.278714065Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.348896637s" Mar 12 23:49:53.278821 containerd[1628]: time="2026-03-12T23:49:53.278741905Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 12 23:49:53.280619 containerd[1628]: time="2026-03-12T23:49:53.280596942Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 12 23:49:53.284618 containerd[1628]: time="2026-03-12T23:49:53.284585975Z" level=info msg="CreateContainer within sandbox \"01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 12 23:49:53.297584 containerd[1628]: time="2026-03-12T23:49:53.296477754Z" level=info msg="Container e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:53.316313 containerd[1628]: time="2026-03-12T23:49:53.316176960Z" level=info msg="CreateContainer within sandbox \"01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918\"" Mar 12 23:49:53.316944 containerd[1628]: time="2026-03-12T23:49:53.316912759Z" level=info msg="StartContainer for \"e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918\"" Mar 12 
23:49:53.318540 containerd[1628]: time="2026-03-12T23:49:53.318510596Z" level=info msg="connecting to shim e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918" address="unix:///run/containerd/s/ec1c6cc74d4ffde03de201d3fdf608b477a0f783412a71aac92ee4dfe059053c" protocol=ttrpc version=3 Mar 12 23:49:53.343487 systemd[1]: Started cri-containerd-e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918.scope - libcontainer container e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918. Mar 12 23:49:53.383493 systemd-networkd[1540]: calic0ed913ca55: Gained IPv6LL Mar 12 23:49:53.421366 containerd[1628]: time="2026-03-12T23:49:53.421326899Z" level=info msg="StartContainer for \"e42534c9c17ef9396aed085ce8aa422cc4603bc2b9df24608f91b4e69d40f918\" returns successfully" Mar 12 23:49:53.447565 systemd-networkd[1540]: calia959fac691e: Gained IPv6LL Mar 12 23:49:53.674715 containerd[1628]: time="2026-03-12T23:49:53.674675543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zvbsm,Uid:b1fd0cfb-c997-44dc-9932-e2477c6dd925,Namespace:kube-system,Attempt:0,}" Mar 12 23:49:53.785386 systemd-networkd[1540]: calide1b8f9dbbc: Link UP Mar 12 23:49:53.785621 systemd-networkd[1540]: calide1b8f9dbbc: Gained carrier Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.716 [INFO][5307] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0 coredns-66bc5c9577- kube-system b1fd0cfb-c997-44dc-9932-e2477c6dd925 814 0 2026-03-12 23:49:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-9e79e0a9ae coredns-66bc5c9577-zvbsm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calide1b8f9dbbc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } 
{liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.716 [INFO][5307] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.739 [INFO][5322] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" HandleID="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.750 [INFO][5322] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" HandleID="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000364150), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-9e79e0a9ae", "pod":"coredns-66bc5c9577-zvbsm", "timestamp":"2026-03-12 23:49:53.73987167 +0000 UTC"}, Hostname:"ci-4459-2-4-n-9e79e0a9ae", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003ee000)} Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.750 
[INFO][5322] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.750 [INFO][5322] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.750 [INFO][5322] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-9e79e0a9ae' Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.753 [INFO][5322] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.758 [INFO][5322] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.764 [INFO][5322] ipam/ipam.go 526: Trying affinity for 192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.766 [INFO][5322] ipam/ipam.go 160: Attempting to load block cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.769 [INFO][5322] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.124.128/26 host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.769 [INFO][5322] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.124.128/26 handle="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.771 [INFO][5322] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95 Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.775 [INFO][5322] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.124.128/26 
handle="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.781 [INFO][5322] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.124.136/26] block=192.168.124.128/26 handle="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.781 [INFO][5322] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.124.136/26] handle="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" host="ci-4459-2-4-n-9e79e0a9ae" Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.781 [INFO][5322] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:49:53.800239 containerd[1628]: 2026-03-12 23:49:53.781 [INFO][5322] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.124.136/26] IPv6=[] ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" HandleID="k8s-pod-network.621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Workload="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.800764 containerd[1628]: 2026-03-12 23:49:53.783 [INFO][5307] cni-plugin/k8s.go 418: Populated endpoint ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b1fd0cfb-c997-44dc-9932-e2477c6dd925", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 
16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"", Pod:"coredns-66bc5c9577-zvbsm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calide1b8f9dbbc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:53.800764 containerd[1628]: 2026-03-12 23:49:53.784 [INFO][5307] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.124.136/32] ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" 
WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.800764 containerd[1628]: 2026-03-12 23:49:53.784 [INFO][5307] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calide1b8f9dbbc ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.800764 containerd[1628]: 2026-03-12 23:49:53.785 [INFO][5307] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.800764 containerd[1628]: 2026-03-12 23:49:53.786 [INFO][5307] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b1fd0cfb-c997-44dc-9932-e2477c6dd925", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 49, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-9e79e0a9ae", ContainerID:"621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95", Pod:"coredns-66bc5c9577-zvbsm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.124.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calide1b8f9dbbc", MAC:"0a:07:9a:1e:66:6e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:49:53.801124 containerd[1628]: 2026-03-12 23:49:53.797 [INFO][5307] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" Namespace="kube-system" Pod="coredns-66bc5c9577-zvbsm" WorkloadEndpoint="ci--4459--2--4--n--9e79e0a9ae-k8s-coredns--66bc5c9577--zvbsm-eth0" Mar 12 23:49:53.834265 containerd[1628]: time="2026-03-12T23:49:53.834160468Z" level=info msg="connecting to shim 621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95" 
address="unix:///run/containerd/s/abdd487980af212cab22d3064243e015c537a97a01b4a5d42fcf74abc515c2fa" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:49:53.859475 systemd[1]: Started cri-containerd-621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95.scope - libcontainer container 621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95. Mar 12 23:49:53.893790 containerd[1628]: time="2026-03-12T23:49:53.893747685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zvbsm,Uid:b1fd0cfb-c997-44dc-9932-e2477c6dd925,Namespace:kube-system,Attempt:0,} returns sandbox id \"621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95\"" Mar 12 23:49:53.901051 containerd[1628]: time="2026-03-12T23:49:53.901004713Z" level=info msg="CreateContainer within sandbox \"621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:49:53.913109 containerd[1628]: time="2026-03-12T23:49:53.912993132Z" level=info msg="Container abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:53.923623 containerd[1628]: time="2026-03-12T23:49:53.923566314Z" level=info msg="CreateContainer within sandbox \"621458d3aaf80727c8e4c5932dba0af4fca739539dbcc4273ec741262c1b7c95\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c\"" Mar 12 23:49:53.924314 containerd[1628]: time="2026-03-12T23:49:53.924234793Z" level=info msg="StartContainer for \"abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c\"" Mar 12 23:49:53.925791 containerd[1628]: time="2026-03-12T23:49:53.925477151Z" level=info msg="connecting to shim abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c" address="unix:///run/containerd/s/abdd487980af212cab22d3064243e015c537a97a01b4a5d42fcf74abc515c2fa" protocol=ttrpc version=3 Mar 12 
23:49:53.945481 systemd[1]: Started cri-containerd-abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c.scope - libcontainer container abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c. Mar 12 23:49:53.971191 containerd[1628]: time="2026-03-12T23:49:53.971137192Z" level=info msg="StartContainer for \"abe98351506b4eeb959037b56f54c5738273e74555a307bd7780045ea779562c\" returns successfully" Mar 12 23:49:54.549314 kubelet[2863]: I0312 23:49:54.549274 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:49:54.866892 kubelet[2863]: I0312 23:49:54.866791 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-zvbsm" podStartSLOduration=38.866772209 podStartE2EDuration="38.866772209s" podCreationTimestamp="2026-03-12 23:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:49:54.865751371 +0000 UTC m=+44.284710366" watchObservedRunningTime="2026-03-12 23:49:54.866772209 +0000 UTC m=+44.285731204" Mar 12 23:49:55.105091 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4004351060.mount: Deactivated successfully. 
Mar 12 23:49:55.355962 containerd[1628]: time="2026-03-12T23:49:55.355916247Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:55.357433 containerd[1628]: time="2026-03-12T23:49:55.357404164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 12 23:49:55.359116 containerd[1628]: time="2026-03-12T23:49:55.359058081Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:55.362240 containerd[1628]: time="2026-03-12T23:49:55.361904956Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:55.362946 containerd[1628]: time="2026-03-12T23:49:55.362916835Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.082291733s" Mar 12 23:49:55.363042 containerd[1628]: time="2026-03-12T23:49:55.363026234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Mar 12 23:49:55.363977 containerd[1628]: time="2026-03-12T23:49:55.363945073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 12 23:49:55.368119 containerd[1628]: time="2026-03-12T23:49:55.368094546Z" level=info msg="CreateContainer within sandbox 
\"944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 12 23:49:55.378527 containerd[1628]: time="2026-03-12T23:49:55.378397368Z" level=info msg="Container 2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:55.390675 containerd[1628]: time="2026-03-12T23:49:55.390628907Z" level=info msg="CreateContainer within sandbox \"944f0aa6caa36fd887211df10bde10d5ab9f2bfce5f192f859a97b7e0a733fbd\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119\"" Mar 12 23:49:55.391190 containerd[1628]: time="2026-03-12T23:49:55.391152946Z" level=info msg="StartContainer for \"2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119\"" Mar 12 23:49:55.393136 containerd[1628]: time="2026-03-12T23:49:55.393106783Z" level=info msg="connecting to shim 2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119" address="unix:///run/containerd/s/36f0aa891ec9df76d6b34bcf53e2f872cf5039ca8e75542b9e96e78363e22dce" protocol=ttrpc version=3 Mar 12 23:49:55.412456 systemd[1]: Started cri-containerd-2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119.scope - libcontainer container 2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119. 
Mar 12 23:49:55.432534 systemd-networkd[1540]: calide1b8f9dbbc: Gained IPv6LL Mar 12 23:49:55.467793 containerd[1628]: time="2026-03-12T23:49:55.467757614Z" level=info msg="StartContainer for \"2dd57f825146721307a665986e8dd0ffddbf0cc7fceb332705aa5de9fb581119\" returns successfully" Mar 12 23:49:55.864600 kubelet[2863]: I0312 23:49:55.864529 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-9qpsq" podStartSLOduration=26.533643429 podStartE2EDuration="29.864515011s" podCreationTimestamp="2026-03-12 23:49:26 +0000 UTC" firstStartedPulling="2026-03-12 23:49:52.032959171 +0000 UTC m=+41.451918126" lastFinishedPulling="2026-03-12 23:49:55.363830753 +0000 UTC m=+44.782789708" observedRunningTime="2026-03-12 23:49:55.864035811 +0000 UTC m=+45.282994806" watchObservedRunningTime="2026-03-12 23:49:55.864515011 +0000 UTC m=+45.283474006" Mar 12 23:49:56.751953 containerd[1628]: time="2026-03-12T23:49:56.751881002Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:56.753336 containerd[1628]: time="2026-03-12T23:49:56.753280160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Mar 12 23:49:56.755033 containerd[1628]: time="2026-03-12T23:49:56.754989677Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:56.757714 containerd[1628]: time="2026-03-12T23:49:56.757661152Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:49:56.758335 containerd[1628]: time="2026-03-12T23:49:56.758284511Z" level=info msg="Pulled 
image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.394303798s" Mar 12 23:49:56.758386 containerd[1628]: time="2026-03-12T23:49:56.758333711Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Mar 12 23:49:56.762930 containerd[1628]: time="2026-03-12T23:49:56.762890143Z" level=info msg="CreateContainer within sandbox \"01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 12 23:49:56.775813 containerd[1628]: time="2026-03-12T23:49:56.774458323Z" level=info msg="Container 4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:49:56.787041 containerd[1628]: time="2026-03-12T23:49:56.787003662Z" level=info msg="CreateContainer within sandbox \"01735f9cd6fa1d9231bdcb258e76414a7756d602cde9f9c99f493ef1015e61de\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd\"" Mar 12 23:49:56.787473 containerd[1628]: time="2026-03-12T23:49:56.787442181Z" level=info msg="StartContainer for \"4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd\"" Mar 12 23:49:56.789048 containerd[1628]: time="2026-03-12T23:49:56.789021658Z" level=info msg="connecting to shim 4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd" address="unix:///run/containerd/s/ec1c6cc74d4ffde03de201d3fdf608b477a0f783412a71aac92ee4dfe059053c" protocol=ttrpc version=3 Mar 12 
23:49:56.815579 systemd[1]: Started cri-containerd-4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd.scope - libcontainer container 4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd. Mar 12 23:49:56.855951 kubelet[2863]: I0312 23:49:56.855896 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:49:56.888536 containerd[1628]: time="2026-03-12T23:49:56.888502287Z" level=info msg="StartContainer for \"4890e4a320d25a038012196019343b5b95116b03f95093170adf08e75f0ed3dd\" returns successfully" Mar 12 23:49:57.734797 kubelet[2863]: I0312 23:49:57.734769 2863 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 12 23:49:57.734797 kubelet[2863]: I0312 23:49:57.734800 2863 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 12 23:49:57.875413 kubelet[2863]: I0312 23:49:57.874857 2863 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9szck" podStartSLOduration=25.04365467 podStartE2EDuration="29.874840948s" podCreationTimestamp="2026-03-12 23:49:28 +0000 UTC" firstStartedPulling="2026-03-12 23:49:51.928092551 +0000 UTC m=+41.347051546" lastFinishedPulling="2026-03-12 23:49:56.759278829 +0000 UTC m=+46.178237824" observedRunningTime="2026-03-12 23:49:57.87373443 +0000 UTC m=+47.292693425" watchObservedRunningTime="2026-03-12 23:49:57.874840948 +0000 UTC m=+47.293799943" Mar 12 23:49:58.194748 kubelet[2863]: I0312 23:49:58.194544 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:50:26.557938 update_engine[1612]: I20260312 23:50:26.557816 1612 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 12 23:50:26.557938 update_engine[1612]: I20260312 23:50:26.557914 1612 
prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558145 1612 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558518 1612 omaha_request_params.cc:62] Current group set to stable Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558605 1612 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558614 1612 update_attempter.cc:643] Scheduling an action processor start. Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558627 1612 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558650 1612 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558693 1612 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558701 1612 omaha_request_action.cc:272] Request: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: Mar 12 23:50:26.558758 update_engine[1612]: I20260312 23:50:26.558705 1612 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 12 23:50:26.559258 locksmithd[1656]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 12 23:50:26.561506 update_engine[1612]: I20260312 23:50:26.561129 1612 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 12 23:50:26.561899 
update_engine[1612]: I20260312 23:50:26.561851 1612 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 12 23:50:26.571127 update_engine[1612]: E20260312 23:50:26.571043 1612 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 12 23:50:26.571127 update_engine[1612]: I20260312 23:50:26.571123 1612 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 12 23:50:33.405431 kubelet[2863]: I0312 23:50:33.405278 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:50:35.592155 kubelet[2863]: I0312 23:50:35.592102 2863 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:50:36.529899 update_engine[1612]: I20260312 23:50:36.529821 1612 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 12 23:50:36.530250 update_engine[1612]: I20260312 23:50:36.529913 1612 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 12 23:50:36.530250 update_engine[1612]: I20260312 23:50:36.530232 1612 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 12 23:50:36.536701 update_engine[1612]: E20260312 23:50:36.536662 1612 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 12 23:50:36.536879 update_engine[1612]: I20260312 23:50:36.536859 1612 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 12 23:50:46.539166 update_engine[1612]: I20260312 23:50:46.538368 1612 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 12 23:50:46.539166 update_engine[1612]: I20260312 23:50:46.538498 1612 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 12 23:50:46.539166 update_engine[1612]: I20260312 23:50:46.539124 1612 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 12 23:50:46.545668 update_engine[1612]: E20260312 23:50:46.545604 1612 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 12 23:50:46.545902 update_engine[1612]: I20260312 23:50:46.545882 1612 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 12 23:50:56.531987 update_engine[1612]: I20260312 23:50:56.531432 1612 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 12 23:50:56.531987 update_engine[1612]: I20260312 23:50:56.531566 1612 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 12 23:50:56.531987 update_engine[1612]: I20260312 23:50:56.531929 1612 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 12 23:50:56.539040 update_engine[1612]: E20260312 23:50:56.538980 1612 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539211 1612 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539227 1612 omaha_request_action.cc:617] Omaha request response: Mar 12 23:50:56.539742 update_engine[1612]: E20260312 23:50:56.539319 1612 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539336 1612 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539340 1612 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539345 1612 update_attempter.cc:306] Processing Done. Mar 12 23:50:56.539742 update_engine[1612]: E20260312 23:50:56.539358 1612 update_attempter.cc:619] Update failed. 
Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539364 1612 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539368 1612 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539372 1612 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539433 1612 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539451 1612 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 12 23:50:56.539742 update_engine[1612]: I20260312 23:50:56.539458 1612 omaha_request_action.cc:272] Request: Mar 12 23:50:56.539742 update_engine[1612]: Mar 12 23:50:56.539742 update_engine[1612]: Mar 12 23:50:56.539742 update_engine[1612]: Mar 12 23:50:56.540071 update_engine[1612]: Mar 12 23:50:56.540071 update_engine[1612]: Mar 12 23:50:56.540071 update_engine[1612]: Mar 12 23:50:56.540071 update_engine[1612]: I20260312 23:50:56.539462 1612 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 12 23:50:56.540071 update_engine[1612]: I20260312 23:50:56.539476 1612 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 12 23:50:56.540071 update_engine[1612]: I20260312 23:50:56.539693 1612 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 12 23:50:56.540464 locksmithd[1656]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 12 23:50:56.544532 update_engine[1612]: E20260312 23:50:56.544474 1612 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 12 23:50:56.544648 update_engine[1612]: I20260312 23:50:56.544629 1612 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 12 23:50:56.544748 update_engine[1612]: I20260312 23:50:56.544731 1612 omaha_request_action.cc:617] Omaha request response: Mar 12 23:50:56.544961 update_engine[1612]: I20260312 23:50:56.544792 1612 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 12 23:50:56.544961 update_engine[1612]: I20260312 23:50:56.544803 1612 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 12 23:50:56.544961 update_engine[1612]: I20260312 23:50:56.544807 1612 update_attempter.cc:306] Processing Done. Mar 12 23:50:56.544961 update_engine[1612]: I20260312 23:50:56.544812 1612 update_attempter.cc:310] Error event sent. Mar 12 23:50:56.544961 update_engine[1612]: I20260312 23:50:56.544819 1612 update_check_scheduler.cc:74] Next update check in 41m10s Mar 12 23:50:56.545221 locksmithd[1656]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 12 23:52:11.817053 systemd[1]: Started sshd@7-10.0.8.7:22-20.161.92.111:39118.service - OpenSSH per-connection server daemon (20.161.92.111:39118). Mar 12 23:52:12.335511 sshd[6152]: Accepted publickey for core from 20.161.92.111 port 39118 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:52:12.336856 sshd-session[6152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:52:12.341077 systemd-logind[1610]: New session 8 of user core. 
Mar 12 23:52:12.352448 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 12 23:52:12.701825 sshd[6155]: Connection closed by 20.161.92.111 port 39118 Mar 12 23:52:12.702069 sshd-session[6152]: pam_unix(sshd:session): session closed for user core Mar 12 23:52:12.706770 systemd[1]: sshd@7-10.0.8.7:22-20.161.92.111:39118.service: Deactivated successfully. Mar 12 23:52:12.711928 systemd[1]: session-8.scope: Deactivated successfully. Mar 12 23:52:12.714581 systemd-logind[1610]: Session 8 logged out. Waiting for processes to exit. Mar 12 23:52:12.715861 systemd-logind[1610]: Removed session 8. Mar 12 23:52:17.808750 systemd[1]: Started sshd@8-10.0.8.7:22-20.161.92.111:39132.service - OpenSSH per-connection server daemon (20.161.92.111:39132). Mar 12 23:52:18.319039 sshd[6196]: Accepted publickey for core from 20.161.92.111 port 39132 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:52:18.320224 sshd-session[6196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:52:18.324355 systemd-logind[1610]: New session 9 of user core. Mar 12 23:52:18.334432 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 12 23:52:18.665358 sshd[6199]: Connection closed by 20.161.92.111 port 39132 Mar 12 23:52:18.665839 sshd-session[6196]: pam_unix(sshd:session): session closed for user core Mar 12 23:52:18.669315 systemd[1]: sshd@8-10.0.8.7:22-20.161.92.111:39132.service: Deactivated successfully. Mar 12 23:52:18.671096 systemd[1]: session-9.scope: Deactivated successfully. Mar 12 23:52:18.673369 systemd-logind[1610]: Session 9 logged out. Waiting for processes to exit. Mar 12 23:52:18.674869 systemd-logind[1610]: Removed session 9. Mar 12 23:52:23.771586 systemd[1]: Started sshd@9-10.0.8.7:22-20.161.92.111:57618.service - OpenSSH per-connection server daemon (20.161.92.111:57618). 
Mar 12 23:52:24.288175 sshd[6251]: Accepted publickey for core from 20.161.92.111 port 57618 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:24.289555 sshd-session[6251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:24.293972 systemd-logind[1610]: New session 10 of user core.
Mar 12 23:52:24.303467 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 12 23:52:24.636587 sshd[6255]: Connection closed by 20.161.92.111 port 57618
Mar 12 23:52:24.636906 sshd-session[6251]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:24.641793 systemd[1]: sshd@9-10.0.8.7:22-20.161.92.111:57618.service: Deactivated successfully.
Mar 12 23:52:24.643966 systemd[1]: session-10.scope: Deactivated successfully.
Mar 12 23:52:24.644839 systemd-logind[1610]: Session 10 logged out. Waiting for processes to exit.
Mar 12 23:52:24.646655 systemd-logind[1610]: Removed session 10.
Mar 12 23:52:29.743117 systemd[1]: Started sshd@10-10.0.8.7:22-20.161.92.111:57630.service - OpenSSH per-connection server daemon (20.161.92.111:57630).
Mar 12 23:52:30.268400 sshd[6317]: Accepted publickey for core from 20.161.92.111 port 57630 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:30.269635 sshd-session[6317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:30.274557 systemd-logind[1610]: New session 11 of user core.
Mar 12 23:52:30.284608 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 12 23:52:30.614532 sshd[6320]: Connection closed by 20.161.92.111 port 57630
Mar 12 23:52:30.615073 sshd-session[6317]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:30.618741 systemd[1]: sshd@10-10.0.8.7:22-20.161.92.111:57630.service: Deactivated successfully.
Mar 12 23:52:30.620503 systemd[1]: session-11.scope: Deactivated successfully.
Mar 12 23:52:30.621216 systemd-logind[1610]: Session 11 logged out. Waiting for processes to exit.
Mar 12 23:52:30.622867 systemd-logind[1610]: Removed session 11.
Mar 12 23:52:35.719880 systemd[1]: Started sshd@11-10.0.8.7:22-20.161.92.111:46786.service - OpenSSH per-connection server daemon (20.161.92.111:46786).
Mar 12 23:52:36.240340 sshd[6360]: Accepted publickey for core from 20.161.92.111 port 46786 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:36.241453 sshd-session[6360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:36.245406 systemd-logind[1610]: New session 12 of user core.
Mar 12 23:52:36.255562 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 12 23:52:36.582680 sshd[6363]: Connection closed by 20.161.92.111 port 46786
Mar 12 23:52:36.583365 sshd-session[6360]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:36.587516 systemd-logind[1610]: Session 12 logged out. Waiting for processes to exit.
Mar 12 23:52:36.587723 systemd[1]: sshd@11-10.0.8.7:22-20.161.92.111:46786.service: Deactivated successfully.
Mar 12 23:52:36.590381 systemd[1]: session-12.scope: Deactivated successfully.
Mar 12 23:52:36.592421 systemd-logind[1610]: Removed session 12.
Mar 12 23:52:36.689183 systemd[1]: Started sshd@12-10.0.8.7:22-20.161.92.111:46790.service - OpenSSH per-connection server daemon (20.161.92.111:46790).
Mar 12 23:52:37.207354 sshd[6377]: Accepted publickey for core from 20.161.92.111 port 46790 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:37.208335 sshd-session[6377]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:37.212108 systemd-logind[1610]: New session 13 of user core.
Mar 12 23:52:37.219616 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 12 23:52:37.582814 sshd[6380]: Connection closed by 20.161.92.111 port 46790
Mar 12 23:52:37.583522 sshd-session[6377]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:37.587107 systemd-logind[1610]: Session 13 logged out. Waiting for processes to exit.
Mar 12 23:52:37.587339 systemd[1]: sshd@12-10.0.8.7:22-20.161.92.111:46790.service: Deactivated successfully.
Mar 12 23:52:37.588918 systemd[1]: session-13.scope: Deactivated successfully.
Mar 12 23:52:37.591224 systemd-logind[1610]: Removed session 13.
Mar 12 23:52:37.688452 systemd[1]: Started sshd@13-10.0.8.7:22-20.161.92.111:46800.service - OpenSSH per-connection server daemon (20.161.92.111:46800).
Mar 12 23:52:38.199523 sshd[6391]: Accepted publickey for core from 20.161.92.111 port 46800 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:38.201119 sshd-session[6391]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:38.205395 systemd-logind[1610]: New session 14 of user core.
Mar 12 23:52:38.216445 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 12 23:52:38.543995 sshd[6416]: Connection closed by 20.161.92.111 port 46800
Mar 12 23:52:38.544579 sshd-session[6391]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:38.547671 systemd[1]: sshd@13-10.0.8.7:22-20.161.92.111:46800.service: Deactivated successfully.
Mar 12 23:52:38.549775 systemd[1]: session-14.scope: Deactivated successfully.
Mar 12 23:52:38.551234 systemd-logind[1610]: Session 14 logged out. Waiting for processes to exit.
Mar 12 23:52:38.552824 systemd-logind[1610]: Removed session 14.
Mar 12 23:52:43.656860 systemd[1]: Started sshd@14-10.0.8.7:22-20.161.92.111:33304.service - OpenSSH per-connection server daemon (20.161.92.111:33304).
Mar 12 23:52:44.168617 sshd[6454]: Accepted publickey for core from 20.161.92.111 port 33304 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:44.169884 sshd-session[6454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:44.174125 systemd-logind[1610]: New session 15 of user core.
Mar 12 23:52:44.184654 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 12 23:52:44.514139 sshd[6457]: Connection closed by 20.161.92.111 port 33304
Mar 12 23:52:44.514568 sshd-session[6454]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:44.518238 systemd[1]: sshd@14-10.0.8.7:22-20.161.92.111:33304.service: Deactivated successfully.
Mar 12 23:52:44.521858 systemd[1]: session-15.scope: Deactivated successfully.
Mar 12 23:52:44.522709 systemd-logind[1610]: Session 15 logged out. Waiting for processes to exit.
Mar 12 23:52:44.524127 systemd-logind[1610]: Removed session 15.
Mar 12 23:52:44.622479 systemd[1]: Started sshd@15-10.0.8.7:22-20.161.92.111:33316.service - OpenSSH per-connection server daemon (20.161.92.111:33316).
Mar 12 23:52:45.150466 sshd[6471]: Accepted publickey for core from 20.161.92.111 port 33316 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:45.151890 sshd-session[6471]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:45.155702 systemd-logind[1610]: New session 16 of user core.
Mar 12 23:52:45.164637 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 12 23:52:45.540648 sshd[6486]: Connection closed by 20.161.92.111 port 33316
Mar 12 23:52:45.541345 sshd-session[6471]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:45.544852 systemd[1]: sshd@15-10.0.8.7:22-20.161.92.111:33316.service: Deactivated successfully.
Mar 12 23:52:45.546570 systemd[1]: session-16.scope: Deactivated successfully.
Mar 12 23:52:45.547640 systemd-logind[1610]: Session 16 logged out. Waiting for processes to exit.
Mar 12 23:52:45.549551 systemd-logind[1610]: Removed session 16.
Mar 12 23:52:45.645233 systemd[1]: Started sshd@16-10.0.8.7:22-20.161.92.111:33330.service - OpenSSH per-connection server daemon (20.161.92.111:33330).
Mar 12 23:52:46.174597 sshd[6498]: Accepted publickey for core from 20.161.92.111 port 33330 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:46.175923 sshd-session[6498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:46.180042 systemd-logind[1610]: New session 17 of user core.
Mar 12 23:52:46.186451 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 12 23:52:47.027791 sshd[6501]: Connection closed by 20.161.92.111 port 33330
Mar 12 23:52:47.028495 sshd-session[6498]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:47.032077 systemd-logind[1610]: Session 17 logged out. Waiting for processes to exit.
Mar 12 23:52:47.032335 systemd[1]: sshd@16-10.0.8.7:22-20.161.92.111:33330.service: Deactivated successfully.
Mar 12 23:52:47.034905 systemd[1]: session-17.scope: Deactivated successfully.
Mar 12 23:52:47.036322 systemd-logind[1610]: Removed session 17.
Mar 12 23:52:47.136443 systemd[1]: Started sshd@17-10.0.8.7:22-20.161.92.111:33332.service - OpenSSH per-connection server daemon (20.161.92.111:33332).
Mar 12 23:52:47.648172 sshd[6528]: Accepted publickey for core from 20.161.92.111 port 33332 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:47.649594 sshd-session[6528]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:47.654250 systemd-logind[1610]: New session 18 of user core.
Mar 12 23:52:47.663606 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 12 23:52:48.096191 sshd[6531]: Connection closed by 20.161.92.111 port 33332
Mar 12 23:52:48.096550 sshd-session[6528]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:48.100100 systemd[1]: sshd@17-10.0.8.7:22-20.161.92.111:33332.service: Deactivated successfully.
Mar 12 23:52:48.101924 systemd[1]: session-18.scope: Deactivated successfully.
Mar 12 23:52:48.102771 systemd-logind[1610]: Session 18 logged out. Waiting for processes to exit.
Mar 12 23:52:48.104146 systemd-logind[1610]: Removed session 18.
Mar 12 23:52:48.205338 systemd[1]: Started sshd@18-10.0.8.7:22-20.161.92.111:33340.service - OpenSSH per-connection server daemon (20.161.92.111:33340).
Mar 12 23:52:48.714848 sshd[6545]: Accepted publickey for core from 20.161.92.111 port 33340 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:48.716651 sshd-session[6545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:48.720514 systemd-logind[1610]: New session 19 of user core.
Mar 12 23:52:48.731626 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 12 23:52:49.057974 sshd[6548]: Connection closed by 20.161.92.111 port 33340
Mar 12 23:52:49.058524 sshd-session[6545]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:49.062690 systemd[1]: sshd@18-10.0.8.7:22-20.161.92.111:33340.service: Deactivated successfully.
Mar 12 23:52:49.065483 systemd[1]: session-19.scope: Deactivated successfully.
Mar 12 23:52:49.066800 systemd-logind[1610]: Session 19 logged out. Waiting for processes to exit.
Mar 12 23:52:49.068342 systemd-logind[1610]: Removed session 19.
Mar 12 23:52:54.163763 systemd[1]: Started sshd@19-10.0.8.7:22-20.161.92.111:33708.service - OpenSSH per-connection server daemon (20.161.92.111:33708).
Mar 12 23:52:54.690145 sshd[6569]: Accepted publickey for core from 20.161.92.111 port 33708 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:52:54.691539 sshd-session[6569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:52:54.695361 systemd-logind[1610]: New session 20 of user core.
Mar 12 23:52:54.705636 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 12 23:52:55.032459 sshd[6594]: Connection closed by 20.161.92.111 port 33708
Mar 12 23:52:55.032727 sshd-session[6569]: pam_unix(sshd:session): session closed for user core
Mar 12 23:52:55.036609 systemd[1]: sshd@19-10.0.8.7:22-20.161.92.111:33708.service: Deactivated successfully.
Mar 12 23:52:55.039222 systemd[1]: session-20.scope: Deactivated successfully.
Mar 12 23:52:55.040000 systemd-logind[1610]: Session 20 logged out. Waiting for processes to exit.
Mar 12 23:52:55.041008 systemd-logind[1610]: Removed session 20.
Mar 12 23:53:00.143022 systemd[1]: Started sshd@20-10.0.8.7:22-20.161.92.111:33714.service - OpenSSH per-connection server daemon (20.161.92.111:33714).
Mar 12 23:53:00.651364 sshd[6647]: Accepted publickey for core from 20.161.92.111 port 33714 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:53:00.652881 sshd-session[6647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:53:00.656599 systemd-logind[1610]: New session 21 of user core.
Mar 12 23:53:00.664614 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 12 23:53:00.993198 sshd[6650]: Connection closed by 20.161.92.111 port 33714
Mar 12 23:53:00.993610 sshd-session[6647]: pam_unix(sshd:session): session closed for user core
Mar 12 23:53:00.997349 systemd[1]: sshd@20-10.0.8.7:22-20.161.92.111:33714.service: Deactivated successfully.
Mar 12 23:53:00.999613 systemd[1]: session-21.scope: Deactivated successfully.
Mar 12 23:53:01.000288 systemd-logind[1610]: Session 21 logged out. Waiting for processes to exit.
Mar 12 23:53:01.001518 systemd-logind[1610]: Removed session 21.
Mar 12 23:53:06.102564 systemd[1]: Started sshd@21-10.0.8.7:22-20.161.92.111:39424.service - OpenSSH per-connection server daemon (20.161.92.111:39424).
Mar 12 23:53:06.620322 sshd[6663]: Accepted publickey for core from 20.161.92.111 port 39424 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:53:06.621570 sshd-session[6663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:53:06.626158 systemd-logind[1610]: New session 22 of user core.
Mar 12 23:53:06.632472 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 12 23:53:06.963389 sshd[6666]: Connection closed by 20.161.92.111 port 39424
Mar 12 23:53:06.964163 sshd-session[6663]: pam_unix(sshd:session): session closed for user core
Mar 12 23:53:06.967669 systemd[1]: sshd@21-10.0.8.7:22-20.161.92.111:39424.service: Deactivated successfully.
Mar 12 23:53:06.969404 systemd[1]: session-22.scope: Deactivated successfully.
Mar 12 23:53:06.970780 systemd-logind[1610]: Session 22 logged out. Waiting for processes to exit.
Mar 12 23:53:06.972332 systemd-logind[1610]: Removed session 22.
Mar 12 23:53:12.075982 systemd[1]: Started sshd@22-10.0.8.7:22-20.161.92.111:50390.service - OpenSSH per-connection server daemon (20.161.92.111:50390).
Mar 12 23:53:12.595216 sshd[6682]: Accepted publickey for core from 20.161.92.111 port 50390 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:53:12.596604 sshd-session[6682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:53:12.601312 systemd-logind[1610]: New session 23 of user core.
Mar 12 23:53:12.610465 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 12 23:53:12.941254 sshd[6685]: Connection closed by 20.161.92.111 port 50390
Mar 12 23:53:12.941783 sshd-session[6682]: pam_unix(sshd:session): session closed for user core
Mar 12 23:53:12.945234 systemd[1]: sshd@22-10.0.8.7:22-20.161.92.111:50390.service: Deactivated successfully.
Mar 12 23:53:12.946991 systemd[1]: session-23.scope: Deactivated successfully.
Mar 12 23:53:12.948390 systemd-logind[1610]: Session 23 logged out. Waiting for processes to exit.
Mar 12 23:53:12.949646 systemd-logind[1610]: Removed session 23.
Mar 12 23:53:18.047052 systemd[1]: Started sshd@23-10.0.8.7:22-20.161.92.111:50398.service - OpenSSH per-connection server daemon (20.161.92.111:50398).
Mar 12 23:53:18.552194 sshd[6726]: Accepted publickey for core from 20.161.92.111 port 50398 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:53:18.553470 sshd-session[6726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:53:18.558815 systemd-logind[1610]: New session 24 of user core.
Mar 12 23:53:18.571460 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 12 23:53:18.893554 sshd[6732]: Connection closed by 20.161.92.111 port 50398
Mar 12 23:53:18.894385 sshd-session[6726]: pam_unix(sshd:session): session closed for user core
Mar 12 23:53:18.898722 systemd[1]: sshd@23-10.0.8.7:22-20.161.92.111:50398.service: Deactivated successfully.
Mar 12 23:53:18.900481 systemd[1]: session-24.scope: Deactivated successfully.
Mar 12 23:53:18.901161 systemd-logind[1610]: Session 24 logged out. Waiting for processes to exit.
Mar 12 23:53:18.902143 systemd-logind[1610]: Removed session 24.
Mar 12 23:53:46.419368 kubelet[2863]: E0312 23:53:46.418907 2863 controller.go:195] "Failed to update lease" err="Put \"https://10.0.8.7:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-9e79e0a9ae?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 23:53:46.856168 kubelet[2863]: E0312 23:53:46.856120 2863 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.7:37410->10.0.8.5:2379: read: connection timed out"
Mar 12 23:53:47.299338 systemd[1]: cri-containerd-076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2.scope: Deactivated successfully.
Mar 12 23:53:47.299649 systemd[1]: cri-containerd-076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2.scope: Consumed 15.438s CPU time, 115.7M memory peak.
Mar 12 23:53:47.301485 containerd[1628]: time="2026-03-12T23:53:47.301450782Z" level=info msg="received container exit event container_id:\"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\" id:\"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\" pid:3195 exit_status:1 exited_at:{seconds:1773359627 nanos:301169703}"
Mar 12 23:53:47.319855 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2-rootfs.mount: Deactivated successfully.
Mar 12 23:53:47.391278 kubelet[2863]: I0312 23:53:47.391250 2863 scope.go:117] "RemoveContainer" containerID="076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2"
Mar 12 23:53:47.392920 containerd[1628]: time="2026-03-12T23:53:47.392881065Z" level=info msg="CreateContainer within sandbox \"7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 12 23:53:47.403893 containerd[1628]: time="2026-03-12T23:53:47.403859206Z" level=info msg="Container fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:53:47.411310 containerd[1628]: time="2026-03-12T23:53:47.411253554Z" level=info msg="CreateContainer within sandbox \"7db50539b75a23cb131c217b59be7ee2e27303d12558f780c4c3a391e852c1ea\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a\""
Mar 12 23:53:47.411900 containerd[1628]: time="2026-03-12T23:53:47.411846513Z" level=info msg="StartContainer for \"fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a\""
Mar 12 23:53:47.412770 containerd[1628]: time="2026-03-12T23:53:47.412724591Z" level=info msg="connecting to shim fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a" address="unix:///run/containerd/s/199c938dc207ff6d8d26c40cc5656f5ba3754744d06b9e3c2283abadcd26ac4f" protocol=ttrpc version=3
Mar 12 23:53:47.434463 systemd[1]: Started cri-containerd-fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a.scope - libcontainer container fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a.
Mar 12 23:53:47.463709 containerd[1628]: time="2026-03-12T23:53:47.463654784Z" level=info msg="StartContainer for \"fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a\" returns successfully"
Mar 12 23:53:48.154444 systemd[1]: cri-containerd-c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7.scope: Deactivated successfully.
Mar 12 23:53:48.154752 systemd[1]: cri-containerd-c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7.scope: Consumed 5.835s CPU time, 61.1M memory peak.
Mar 12 23:53:48.156850 containerd[1628]: time="2026-03-12T23:53:48.156776752Z" level=info msg="received container exit event container_id:\"c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7\" id:\"c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7\" pid:2715 exit_status:1 exited_at:{seconds:1773359628 nanos:156524832}"
Mar 12 23:53:48.176002 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7-rootfs.mount: Deactivated successfully.
Mar 12 23:53:48.397427 kubelet[2863]: I0312 23:53:48.397401 2863 scope.go:117] "RemoveContainer" containerID="c4a9078a391761d225655747a95377345f31a98d379181abb9e1e9db8224afa7"
Mar 12 23:53:48.398798 containerd[1628]: time="2026-03-12T23:53:48.398766416Z" level=info msg="CreateContainer within sandbox \"ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 12 23:53:48.411888 containerd[1628]: time="2026-03-12T23:53:48.410977915Z" level=info msg="Container e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:53:48.421782 containerd[1628]: time="2026-03-12T23:53:48.421750417Z" level=info msg="CreateContainer within sandbox \"ac84a12ba35719ba07076a1468e00efc7f4e2f8f286997e542fac64274a25989\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5\""
Mar 12 23:53:48.422258 containerd[1628]: time="2026-03-12T23:53:48.422236656Z" level=info msg="StartContainer for \"e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5\""
Mar 12 23:53:48.423492 containerd[1628]: time="2026-03-12T23:53:48.423459374Z" level=info msg="connecting to shim e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5" address="unix:///run/containerd/s/427792e458f9adbf24c8a4cecd1ad88828d167e3c416c5dc12d755628b0df4cc" protocol=ttrpc version=3
Mar 12 23:53:48.448425 systemd[1]: Started cri-containerd-e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5.scope - libcontainer container e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5.
Mar 12 23:53:48.486076 containerd[1628]: time="2026-03-12T23:53:48.486040586Z" level=info msg="StartContainer for \"e87d5931ace19876d1bcf1ecce9d6b638d75920083c0542a9db073184c92c9c5\" returns successfully"
Mar 12 23:53:49.096533 kubelet[2863]: E0312 23:53:49.096386 2863 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.7:37016->10.0.8.5:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-9e79e0a9ae.189c3d2d71aa86ae kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-9e79e0a9ae,UID:4f4fae850342de969b026666a272e751,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-9e79e0a9ae,},FirstTimestamp:2026-03-12 23:53:38.666956462 +0000 UTC m=+268.085915457,LastTimestamp:2026-03-12 23:53:38.666956462 +0000 UTC m=+268.085915457,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-9e79e0a9ae,}"
Mar 12 23:53:51.839291 systemd[1]: cri-containerd-ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6.scope: Deactivated successfully.
Mar 12 23:53:51.839707 systemd[1]: cri-containerd-ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6.scope: Consumed 3.246s CPU time, 23.1M memory peak.
Mar 12 23:53:51.841310 containerd[1628]: time="2026-03-12T23:53:51.841254578Z" level=info msg="received container exit event container_id:\"ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6\" id:\"ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6\" pid:2721 exit_status:1 exited_at:{seconds:1773359631 nanos:840800019}"
Mar 12 23:53:51.861428 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6-rootfs.mount: Deactivated successfully.
Mar 12 23:53:52.417117 kubelet[2863]: I0312 23:53:52.417068 2863 scope.go:117] "RemoveContainer" containerID="ffc6f168afdc00c1fe4646b8a470cda109b050336ec7c57a5c235d4ee10270c6"
Mar 12 23:53:52.418994 containerd[1628]: time="2026-03-12T23:53:52.418962265Z" level=info msg="CreateContainer within sandbox \"aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 12 23:53:52.429784 containerd[1628]: time="2026-03-12T23:53:52.429737006Z" level=info msg="Container e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:53:52.439952 containerd[1628]: time="2026-03-12T23:53:52.439909669Z" level=info msg="CreateContainer within sandbox \"aa365d46f3c94d6861feb79ca11be7fbba91b026bd383e588016ca1499c5f699\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67\""
Mar 12 23:53:52.440433 containerd[1628]: time="2026-03-12T23:53:52.440388628Z" level=info msg="StartContainer for \"e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67\""
Mar 12 23:53:52.441551 containerd[1628]: time="2026-03-12T23:53:52.441510986Z" level=info msg="connecting to shim e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67" address="unix:///run/containerd/s/6bab9b77ae75a443330889a4b31f7aad2789f26b15d3547b213b089b2d980b66" protocol=ttrpc version=3
Mar 12 23:53:52.459513 systemd[1]: Started cri-containerd-e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67.scope - libcontainer container e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67.
Mar 12 23:53:52.493409 containerd[1628]: time="2026-03-12T23:53:52.493279337Z" level=info msg="StartContainer for \"e17bff21517f1c32183fd2369b1a9bbc34c195bd158bbbdf4752035f3896fa67\" returns successfully"
Mar 12 23:53:56.858163 kubelet[2863]: E0312 23:53:56.857757 2863 controller.go:195] "Failed to update lease" err="Put \"https://10.0.8.7:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-9e79e0a9ae?timeout=10s\": context deadline exceeded"
Mar 12 23:53:56.961452 systemd[1]: cri-containerd-fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a.scope: Deactivated successfully.
Mar 12 23:53:56.962134 containerd[1628]: time="2026-03-12T23:53:56.962095415Z" level=info msg="received container exit event container_id:\"fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a\" id:\"fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a\" pid:6890 exit_status:1 exited_at:{seconds:1773359636 nanos:961687296}"
Mar 12 23:53:56.981886 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a-rootfs.mount: Deactivated successfully.
Mar 12 23:53:57.433985 kubelet[2863]: I0312 23:53:57.433935 2863 scope.go:117] "RemoveContainer" containerID="076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2"
Mar 12 23:53:57.434253 kubelet[2863]: I0312 23:53:57.434233 2863 scope.go:117] "RemoveContainer" containerID="fd00a71d28e4d36c87025072171f39e1912d6bf87fa0a7e80e2d819e147efa6a"
Mar 12 23:53:57.434402 kubelet[2863]: E0312 23:53:57.434380 2863 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-5588576f44-62l88_tigera-operator(688aa2c8-8aa6-461c-b728-f40bb078323b)\"" pod="tigera-operator/tigera-operator-5588576f44-62l88" podUID="688aa2c8-8aa6-461c-b728-f40bb078323b"
Mar 12 23:53:57.435814 containerd[1628]: time="2026-03-12T23:53:57.435782281Z" level=info msg="RemoveContainer for \"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\""
Mar 12 23:53:57.442006 containerd[1628]: time="2026-03-12T23:53:57.441976390Z" level=info msg="RemoveContainer for \"076091c8e511c408023ec280b08c3016a2982bfb67486e65053c8040427aa0a2\" returns successfully"
Mar 12 23:54:03.523340 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec