Mar 12 23:40:51.788237 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 12 23:40:51.788258 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Mar 12 22:07:21 -00 2026
Mar 12 23:40:51.788318 kernel: KASLR enabled
Mar 12 23:40:51.788325 kernel: efi: EFI v2.7 by EDK II
Mar 12 23:40:51.788330 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Mar 12 23:40:51.788336 kernel: random: crng init done
Mar 12 23:40:51.788343 kernel: secureboot: Secure boot disabled
Mar 12 23:40:51.788348 kernel: ACPI: Early table checksum verification disabled
Mar 12 23:40:51.788354 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Mar 12 23:40:51.788360 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Mar 12 23:40:51.788368 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788374 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788380 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788386 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788393 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788399 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788406 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788413 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788419 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788425 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 23:40:51.788431 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 12 23:40:51.788437 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 12 23:40:51.788443 kernel: ACPI: Use ACPI SPCR as default console: Yes
Mar 12 23:40:51.788449 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Mar 12 23:40:51.788455 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Mar 12 23:40:51.788461 kernel: Zone ranges:
Mar 12 23:40:51.788468 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Mar 12 23:40:51.788474 kernel: DMA32 empty
Mar 12 23:40:51.788480 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Mar 12 23:40:51.788485 kernel: Device empty
Mar 12 23:40:51.788491 kernel: Movable zone start for each node
Mar 12 23:40:51.788497 kernel: Early memory node ranges
Mar 12 23:40:51.788503 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Mar 12 23:40:51.788509 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Mar 12 23:40:51.788515 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Mar 12 23:40:51.788521 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Mar 12 23:40:51.788527 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Mar 12 23:40:51.788533 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Mar 12 23:40:51.788541 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Mar 12 23:40:51.788547 kernel: psci: probing for conduit method from ACPI.
Mar 12 23:40:51.788556 kernel: psci: PSCIv1.3 detected in firmware.
Mar 12 23:40:51.788562 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 12 23:40:51.788569 kernel: psci: Trusted OS migration not required
Mar 12 23:40:51.788576 kernel: psci: SMC Calling Convention v1.1
Mar 12 23:40:51.788583 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 12 23:40:51.788589 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Mar 12 23:40:51.788596 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Mar 12 23:40:51.788602 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Mar 12 23:40:51.788608 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Mar 12 23:40:51.788615 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Mar 12 23:40:51.788621 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Mar 12 23:40:51.788628 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Mar 12 23:40:51.788634 kernel: Detected PIPT I-cache on CPU0
Mar 12 23:40:51.788640 kernel: CPU features: detected: GIC system register CPU interface
Mar 12 23:40:51.788647 kernel: CPU features: detected: Spectre-v4
Mar 12 23:40:51.788654 kernel: CPU features: detected: Spectre-BHB
Mar 12 23:40:51.788661 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 12 23:40:51.788667 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 12 23:40:51.788673 kernel: CPU features: detected: ARM erratum 1418040
Mar 12 23:40:51.788680 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 12 23:40:51.788686 kernel: alternatives: applying boot alternatives
Mar 12 23:40:51.788694 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d
Mar 12 23:40:51.788703 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Mar 12 23:40:51.788710 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Mar 12 23:40:51.788717 kernel: Fallback order for Node 0: 0
Mar 12 23:40:51.788724 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Mar 12 23:40:51.788731 kernel: Policy zone: Normal
Mar 12 23:40:51.788737 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 12 23:40:51.788743 kernel: software IO TLB: area num 4.
Mar 12 23:40:51.788750 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Mar 12 23:40:51.788757 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 12 23:40:51.788763 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 12 23:40:51.788770 kernel: rcu: RCU event tracing is enabled.
Mar 12 23:40:51.788777 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 12 23:40:51.788784 kernel: Trampoline variant of Tasks RCU enabled.
Mar 12 23:40:51.788791 kernel: Tracing variant of Tasks RCU enabled.
Mar 12 23:40:51.788797 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 12 23:40:51.788805 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 12 23:40:51.788812 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 12 23:40:51.788818 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 12 23:40:51.788825 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 12 23:40:51.788831 kernel: GICv3: 256 SPIs implemented
Mar 12 23:40:51.788838 kernel: GICv3: 0 Extended SPIs implemented
Mar 12 23:40:51.788844 kernel: Root IRQ handler: gic_handle_irq
Mar 12 23:40:51.788851 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 12 23:40:51.788857 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Mar 12 23:40:51.788864 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 12 23:40:51.788870 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 12 23:40:51.788877 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Mar 12 23:40:51.788885 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Mar 12 23:40:51.788892 kernel: GICv3: using LPI property table @0x0000000100130000
Mar 12 23:40:51.788899 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Mar 12 23:40:51.788905 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 12 23:40:51.788912 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:40:51.788918 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 12 23:40:51.788925 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 12 23:40:51.788932 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 12 23:40:51.788938 kernel: arm-pv: using stolen time PV
Mar 12 23:40:51.788945 kernel: Console: colour dummy device 80x25
Mar 12 23:40:51.788952 kernel: ACPI: Core revision 20240827
Mar 12 23:40:51.788959 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 12 23:40:51.788966 kernel: pid_max: default: 32768 minimum: 301
Mar 12 23:40:51.788973 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 12 23:40:51.788979 kernel: landlock: Up and running.
Mar 12 23:40:51.788986 kernel: SELinux: Initializing.
Mar 12 23:40:51.788992 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 23:40:51.788999 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 23:40:51.789005 kernel: rcu: Hierarchical SRCU implementation.
Mar 12 23:40:51.789012 kernel: rcu: Max phase no-delay instances is 400.
Mar 12 23:40:51.789020 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 12 23:40:51.789026 kernel: Remapping and enabling EFI services.
Mar 12 23:40:51.789033 kernel: smp: Bringing up secondary CPUs ...
Mar 12 23:40:51.789040 kernel: Detected PIPT I-cache on CPU1
Mar 12 23:40:51.789047 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 12 23:40:51.789053 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Mar 12 23:40:51.789060 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:40:51.789066 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 12 23:40:51.789073 kernel: Detected PIPT I-cache on CPU2
Mar 12 23:40:51.789086 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Mar 12 23:40:51.789093 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Mar 12 23:40:51.789100 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:40:51.789107 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Mar 12 23:40:51.789114 kernel: Detected PIPT I-cache on CPU3
Mar 12 23:40:51.789121 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Mar 12 23:40:51.789128 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Mar 12 23:40:51.789135 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 12 23:40:51.789144 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Mar 12 23:40:51.789150 kernel: smp: Brought up 1 node, 4 CPUs
Mar 12 23:40:51.789157 kernel: SMP: Total of 4 processors activated.
Mar 12 23:40:51.789164 kernel: CPU: All CPU(s) started at EL1
Mar 12 23:40:51.789171 kernel: CPU features: detected: 32-bit EL0 Support
Mar 12 23:40:51.789178 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 12 23:40:51.789185 kernel: CPU features: detected: Common not Private translations
Mar 12 23:40:51.789193 kernel: CPU features: detected: CRC32 instructions
Mar 12 23:40:51.789200 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 12 23:40:51.789208 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 12 23:40:51.789215 kernel: CPU features: detected: LSE atomic instructions
Mar 12 23:40:51.789221 kernel: CPU features: detected: Privileged Access Never
Mar 12 23:40:51.789228 kernel: CPU features: detected: RAS Extension Support
Mar 12 23:40:51.789235 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 12 23:40:51.789242 kernel: alternatives: applying system-wide alternatives
Mar 12 23:40:51.789249 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Mar 12 23:40:51.789257 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Mar 12 23:40:51.789334 kernel: devtmpfs: initialized
Mar 12 23:40:51.789351 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 12 23:40:51.789359 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 12 23:40:51.789366 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 12 23:40:51.789373 kernel: 0 pages in range for non-PLT usage
Mar 12 23:40:51.789380 kernel: 508400 pages in range for PLT usage
Mar 12 23:40:51.789387 kernel: pinctrl core: initialized pinctrl subsystem
Mar 12 23:40:51.789394 kernel: SMBIOS 3.0.0 present.
Mar 12 23:40:51.789401 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Mar 12 23:40:51.789408 kernel: DMI: Memory slots populated: 1/1
Mar 12 23:40:51.789416 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 12 23:40:51.789423 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Mar 12 23:40:51.789431 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 12 23:40:51.789438 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 12 23:40:51.789445 kernel: audit: initializing netlink subsys (disabled)
Mar 12 23:40:51.789452 kernel: audit: type=2000 audit(0.040:1): state=initialized audit_enabled=0 res=1
Mar 12 23:40:51.789459 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 12 23:40:51.789466 kernel: cpuidle: using governor menu
Mar 12 23:40:51.789473 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 12 23:40:51.789481 kernel: ASID allocator initialised with 32768 entries
Mar 12 23:40:51.789488 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 12 23:40:51.789495 kernel: Serial: AMBA PL011 UART driver
Mar 12 23:40:51.789502 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 12 23:40:51.789509 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 12 23:40:51.789516 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 12 23:40:51.789523 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 12 23:40:51.789530 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 12 23:40:51.789537 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 12 23:40:51.789545 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 12 23:40:51.789552 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 12 23:40:51.789559 kernel: ACPI: Added _OSI(Module Device)
Mar 12 23:40:51.789566 kernel: ACPI: Added _OSI(Processor Device)
Mar 12 23:40:51.789573 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 12 23:40:51.789580 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 12 23:40:51.789587 kernel: ACPI: Interpreter enabled
Mar 12 23:40:51.789593 kernel: ACPI: Using GIC for interrupt routing
Mar 12 23:40:51.789600 kernel: ACPI: MCFG table detected, 1 entries
Mar 12 23:40:51.789608 kernel: ACPI: CPU0 has been hot-added
Mar 12 23:40:51.789615 kernel: ACPI: CPU1 has been hot-added
Mar 12 23:40:51.789622 kernel: ACPI: CPU2 has been hot-added
Mar 12 23:40:51.789629 kernel: ACPI: CPU3 has been hot-added
Mar 12 23:40:51.789636 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 12 23:40:51.789644 kernel: printk: legacy console [ttyAMA0] enabled
Mar 12 23:40:51.789651 kernel: ACPI: PCI: Interrupt link L000 configured for IRQ 35
Mar 12 23:40:51.789658 kernel: ACPI: PCI: Interrupt link L001 configured for IRQ 36
Mar 12 23:40:51.789665 kernel: ACPI: PCI: Interrupt link L002 configured for IRQ 37
Mar 12 23:40:51.789673 kernel: ACPI: PCI: Interrupt link L003 configured for IRQ 38
Mar 12 23:40:51.789681 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 12 23:40:51.789813 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 12 23:40:51.789880 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 12 23:40:51.789940 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 12 23:40:51.789998 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 12 23:40:51.790058 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 12 23:40:51.790069 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 12 23:40:51.790076 kernel: PCI host bridge to bus 0000:00
Mar 12 23:40:51.790142 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 12 23:40:51.790197 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 12 23:40:51.790251 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 12 23:40:51.790327 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 12 23:40:51.790416 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Mar 12 23:40:51.790490 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.790552 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Mar 12 23:40:51.790611 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Mar 12 23:40:51.790669 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Mar 12 23:40:51.790727 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Mar 12 23:40:51.790786 kernel: pci 0000:00:01.0: enabling Extended Tags
Mar 12 23:40:51.790876 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.790947 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Mar 12 23:40:51.791006 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Mar 12 23:40:51.791065 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Mar 12 23:40:51.791123 kernel: pci 0000:00:01.1: enabling Extended Tags
Mar 12 23:40:51.791191 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.791252 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Mar 12 23:40:51.791353 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Mar 12 23:40:51.791418 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Mar 12 23:40:51.791479 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Mar 12 23:40:51.791537 kernel: pci 0000:00:01.2: enabling Extended Tags
Mar 12 23:40:51.791604 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.791663 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Mar 12 23:40:51.791728 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Mar 12 23:40:51.791789 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Mar 12 23:40:51.791853 kernel: pci 0000:00:01.3: enabling Extended Tags
Mar 12 23:40:51.791919 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.791979 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Mar 12 23:40:51.792037 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Mar 12 23:40:51.792095 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Mar 12 23:40:51.792153 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Mar 12 23:40:51.792211 kernel: pci 0000:00:01.4: enabling Extended Tags
Mar 12 23:40:51.792355 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.792435 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Mar 12 23:40:51.792494 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Mar 12 23:40:51.792553 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Mar 12 23:40:51.792610 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Mar 12 23:40:51.792668 kernel: pci 0000:00:01.5: enabling Extended Tags
Mar 12 23:40:51.792735 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.792799 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Mar 12 23:40:51.792858 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Mar 12 23:40:51.792915 kernel: pci 0000:00:01.6: enabling Extended Tags
Mar 12 23:40:51.792980 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.793039 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Mar 12 23:40:51.793096 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Mar 12 23:40:51.793157 kernel: pci 0000:00:01.7: enabling Extended Tags
Mar 12 23:40:51.793221 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.793304 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Mar 12 23:40:51.793367 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Mar 12 23:40:51.793425 kernel: pci 0000:00:02.0: enabling Extended Tags
Mar 12 23:40:51.793499 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.793564 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Mar 12 23:40:51.793627 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Mar 12 23:40:51.793687 kernel: pci 0000:00:02.1: enabling Extended Tags
Mar 12 23:40:51.793757 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.793816 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Mar 12 23:40:51.793874 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Mar 12 23:40:51.793932 kernel: pci 0000:00:02.2: enabling Extended Tags
Mar 12 23:40:51.793997 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.794058 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Mar 12 23:40:51.794116 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Mar 12 23:40:51.794174 kernel: pci 0000:00:02.3: enabling Extended Tags
Mar 12 23:40:51.794239 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.794315 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Mar 12 23:40:51.794376 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Mar 12 23:40:51.794434 kernel: pci 0000:00:02.4: enabling Extended Tags
Mar 12 23:40:51.794501 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.794560 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Mar 12 23:40:51.794618 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Mar 12 23:40:51.794676 kernel: pci 0000:00:02.5: enabling Extended Tags
Mar 12 23:40:51.794740 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.794799 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Mar 12 23:40:51.794873 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Mar 12 23:40:51.794936 kernel: pci 0000:00:02.6: enabling Extended Tags
Mar 12 23:40:51.795003 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.795063 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Mar 12 23:40:51.795121 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Mar 12 23:40:51.795179 kernel: pci 0000:00:02.7: enabling Extended Tags
Mar 12 23:40:51.795243 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.795336 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Mar 12 23:40:51.795402 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Mar 12 23:40:51.795463 kernel: pci 0000:00:03.0: enabling Extended Tags
Mar 12 23:40:51.795534 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.795600 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Mar 12 23:40:51.795669 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Mar 12 23:40:51.795737 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Mar 12 23:40:51.795806 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Mar 12 23:40:51.795865 kernel: pci 0000:00:03.1: enabling Extended Tags
Mar 12 23:40:51.795934 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.796041 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Mar 12 23:40:51.796101 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Mar 12 23:40:51.796160 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Mar 12 23:40:51.796218 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Mar 12 23:40:51.796287 kernel: pci 0000:00:03.2: enabling Extended Tags
Mar 12 23:40:51.796361 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.796425 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Mar 12 23:40:51.796483 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Mar 12 23:40:51.796541 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Mar 12 23:40:51.796598 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Mar 12 23:40:51.796657 kernel: pci 0000:00:03.3: enabling Extended Tags
Mar 12 23:40:51.796742 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.796807 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Mar 12 23:40:51.796865 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Mar 12 23:40:51.796926 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Mar 12 23:40:51.798008 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Mar 12 23:40:51.798070 kernel: pci 0000:00:03.4: enabling Extended Tags
Mar 12 23:40:51.798135 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.798194 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Mar 12 23:40:51.798356 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Mar 12 23:40:51.798540 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Mar 12 23:40:51.798598 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Mar 12 23:40:51.798656 kernel: pci 0000:00:03.5: enabling Extended Tags
Mar 12 23:40:51.798721 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.798781 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Mar 12 23:40:51.798840 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Mar 12 23:40:51.798945 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Mar 12 23:40:51.799006 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Mar 12 23:40:51.799064 kernel: pci 0000:00:03.6: enabling Extended Tags
Mar 12 23:40:51.799131 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.799190 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Mar 12 23:40:51.799248 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Mar 12 23:40:51.799322 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Mar 12 23:40:51.799405 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Mar 12 23:40:51.799489 kernel: pci 0000:00:03.7: enabling Extended Tags
Mar 12 23:40:51.799567 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.799630 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Mar 12 23:40:51.799695 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Mar 12 23:40:51.799757 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Mar 12 23:40:51.799835 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Mar 12 23:40:51.799896 kernel: pci 0000:00:04.0: enabling Extended Tags
Mar 12 23:40:51.799965 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.800033 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Mar 12 23:40:51.800091 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Mar 12 23:40:51.800151 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Mar 12 23:40:51.800208 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Mar 12 23:40:51.800276 kernel: pci 0000:00:04.1: enabling Extended Tags
Mar 12 23:40:51.800344 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.800405 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Mar 12 23:40:51.800464 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Mar 12 23:40:51.800522 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Mar 12 23:40:51.800583 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Mar 12 23:40:51.800647 kernel: pci 0000:00:04.2: enabling Extended Tags
Mar 12 23:40:51.800713 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.800780 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Mar 12 23:40:51.800841 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Mar 12 23:40:51.800899 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Mar 12 23:40:51.800958 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Mar 12 23:40:51.801018 kernel: pci 0000:00:04.3: enabling Extended Tags
Mar 12 23:40:51.801092 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.801153 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Mar 12 23:40:51.801212 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Mar 12 23:40:51.801278 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Mar 12 23:40:51.801340 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Mar 12 23:40:51.801410 kernel: pci 0000:00:04.4: enabling Extended Tags
Mar 12 23:40:51.801481 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.801540 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Mar 12 23:40:51.801599 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Mar 12 23:40:51.801658 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Mar 12 23:40:51.801718 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Mar 12 23:40:51.801778 kernel: pci 0000:00:04.5: enabling Extended Tags
Mar 12 23:40:51.801850 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.801911 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Mar 12 23:40:51.801973 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Mar 12 23:40:51.802035 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Mar 12 23:40:51.802095 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Mar 12 23:40:51.802156 kernel: pci 0000:00:04.6: enabling Extended Tags
Mar 12 23:40:51.802242 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.802317 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Mar 12 23:40:51.802378 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Mar 12 23:40:51.802436 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Mar 12 23:40:51.802494 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Mar 12 23:40:51.802553 kernel: pci 0000:00:04.7: enabling Extended Tags
Mar 12 23:40:51.802618 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 12 23:40:51.802680 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Mar 12 23:40:51.802738 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Mar 12 23:40:51.802796 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Mar 12 23:40:51.802866 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Mar 12 23:40:51.802929 kernel: pci 0000:00:05.0: enabling Extended Tags
Mar 12 23:40:51.803000 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 12 23:40:51.803070 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Mar 12 23:40:51.803135 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 12 23:40:51.803196 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 12 23:40:51.803258 kernel: pci 0000:01:00.0: enabling Extended Tags
Mar 12 23:40:51.803339 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 12 23:40:51.803401 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Mar 12 23:40:51.803461 kernel: pci 0000:02:00.0: enabling Extended Tags
Mar 12 23:40:51.803531 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Mar 12 23:40:51.803596 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Mar 12 23:40:51.803657 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 12 23:40:51.803718 kernel: pci 0000:03:00.0: enabling Extended Tags
Mar 12 23:40:51.803792 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 12 23:40:51.803856 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 12 23:40:51.803923 kernel: pci 0000:04:00.0: enabling Extended Tags
Mar 12 23:40:51.803994 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 12 23:40:51.804061 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Mar 12 23:40:51.804124 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 12 23:40:51.804185 kernel: pci 0000:05:00.0: enabling Extended Tags
Mar 12 23:40:51.804253 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Mar 12 23:40:51.804327 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Mar 12 23:40:51.804389 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 12 23:40:51.804450 kernel: pci 0000:06:00.0:
enabling Extended Tags Mar 12 23:40:51.804514 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Mar 12 23:40:51.804585 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Mar 12 23:40:51.804644 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Mar 12 23:40:51.804705 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Mar 12 23:40:51.804765 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Mar 12 23:40:51.804825 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Mar 12 23:40:51.804888 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Mar 12 23:40:51.804949 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Mar 12 23:40:51.805007 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Mar 12 23:40:51.805073 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Mar 12 23:40:51.805134 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Mar 12 23:40:51.805193 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Mar 12 23:40:51.805255 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Mar 12 23:40:51.805324 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Mar 12 23:40:51.805386 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Mar 12 
23:40:51.805448 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Mar 12 23:40:51.805507 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Mar 12 23:40:51.805566 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Mar 12 23:40:51.805629 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 12 23:40:51.805687 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Mar 12 23:40:51.805748 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Mar 12 23:40:51.805812 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 12 23:40:51.805871 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Mar 12 23:40:51.805930 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Mar 12 23:40:51.805992 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 12 23:40:51.806051 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Mar 12 23:40:51.806110 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Mar 12 23:40:51.806172 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Mar 12 23:40:51.806233 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Mar 12 23:40:51.806306 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Mar 12 23:40:51.806374 kernel: pci 0000:00:02.2: 
bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Mar 12 23:40:51.806434 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Mar 12 23:40:51.806493 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Mar 12 23:40:51.806555 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Mar 12 23:40:51.806614 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Mar 12 23:40:51.806678 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Mar 12 23:40:51.806739 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Mar 12 23:40:51.806798 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Mar 12 23:40:51.806869 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Mar 12 23:40:51.806933 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Mar 12 23:40:51.806993 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Mar 12 23:40:51.807054 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Mar 12 23:40:51.807116 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Mar 12 23:40:51.807175 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Mar 12 23:40:51.807234 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Mar 12 23:40:51.807306 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] 
add_size 1000 Mar 12 23:40:51.807366 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Mar 12 23:40:51.807425 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Mar 12 23:40:51.807489 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Mar 12 23:40:51.807548 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Mar 12 23:40:51.807607 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Mar 12 23:40:51.807668 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Mar 12 23:40:51.807728 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Mar 12 23:40:51.807787 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Mar 12 23:40:51.807848 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Mar 12 23:40:51.807909 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Mar 12 23:40:51.807969 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Mar 12 23:40:51.808031 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Mar 12 23:40:51.808090 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Mar 12 23:40:51.808149 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Mar 12 23:40:51.808209 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Mar 12 23:40:51.808276 kernel: 
pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Mar 12 23:40:51.808339 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Mar 12 23:40:51.808404 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Mar 12 23:40:51.808463 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Mar 12 23:40:51.808524 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Mar 12 23:40:51.808585 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Mar 12 23:40:51.808644 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Mar 12 23:40:51.808703 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Mar 12 23:40:51.808763 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Mar 12 23:40:51.808823 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Mar 12 23:40:51.808882 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Mar 12 23:40:51.808944 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Mar 12 23:40:51.809003 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Mar 12 23:40:51.809062 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Mar 12 23:40:51.809122 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Mar 12 23:40:51.809181 kernel: pci 0000:00:04.1: bridge window [mem 
0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Mar 12 23:40:51.809239 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Mar 12 23:40:51.809319 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Mar 12 23:40:51.809379 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Mar 12 23:40:51.809438 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Mar 12 23:40:51.809498 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Mar 12 23:40:51.809557 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Mar 12 23:40:51.809616 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Mar 12 23:40:51.809676 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Mar 12 23:40:51.809740 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Mar 12 23:40:51.809800 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Mar 12 23:40:51.809860 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Mar 12 23:40:51.809920 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Mar 12 23:40:51.809980 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Mar 12 23:40:51.810041 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Mar 12 23:40:51.810099 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 
1f] add_size 200000 add_align 100000 Mar 12 23:40:51.810159 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Mar 12 23:40:51.810220 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Mar 12 23:40:51.810288 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Mar 12 23:40:51.810348 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Mar 12 23:40:51.810409 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Mar 12 23:40:51.810468 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Mar 12 23:40:51.810526 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Mar 12 23:40:51.810589 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Mar 12 23:40:51.810650 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Mar 12 23:40:51.810709 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Mar 12 23:40:51.810770 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Mar 12 23:40:51.810829 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Mar 12 23:40:51.810904 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Mar 12 23:40:51.810966 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Mar 12 23:40:51.811025 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Mar 12 23:40:51.811085 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Mar 12 23:40:51.811147 kernel: pci 0000:00:01.4: bridge window [mem 
0x8000800000-0x80009fffff 64bit pref]: assigned Mar 12 23:40:51.811207 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Mar 12 23:40:51.811274 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Mar 12 23:40:51.811336 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Mar 12 23:40:51.811395 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Mar 12 23:40:51.811454 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Mar 12 23:40:51.811513 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Mar 12 23:40:51.811575 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Mar 12 23:40:51.811634 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Mar 12 23:40:51.811693 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Mar 12 23:40:51.811752 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Mar 12 23:40:51.811812 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Mar 12 23:40:51.811872 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Mar 12 23:40:51.811932 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Mar 12 23:40:51.811991 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Mar 12 23:40:51.812053 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Mar 12 23:40:51.812111 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Mar 12 23:40:51.812171 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Mar 12 23:40:51.812229 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: 
assigned Mar 12 23:40:51.812308 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Mar 12 23:40:51.812371 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Mar 12 23:40:51.812431 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Mar 12 23:40:51.812490 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Mar 12 23:40:51.812554 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Mar 12 23:40:51.812613 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Mar 12 23:40:51.812672 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Mar 12 23:40:51.812730 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Mar 12 23:40:51.812790 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Mar 12 23:40:51.812849 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Mar 12 23:40:51.812908 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Mar 12 23:40:51.812967 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Mar 12 23:40:51.813029 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Mar 12 23:40:51.813087 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Mar 12 23:40:51.813147 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Mar 12 23:40:51.813205 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Mar 12 23:40:51.813282 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Mar 12 23:40:51.813346 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Mar 12 23:40:51.813406 kernel: pci 
0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Mar 12 23:40:51.813471 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Mar 12 23:40:51.813531 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Mar 12 23:40:51.813590 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Mar 12 23:40:51.813649 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Mar 12 23:40:51.813708 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Mar 12 23:40:51.813767 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Mar 12 23:40:51.813826 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Mar 12 23:40:51.813885 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Mar 12 23:40:51.813946 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Mar 12 23:40:51.814008 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Mar 12 23:40:51.814067 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Mar 12 23:40:51.814127 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Mar 12 23:40:51.814185 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Mar 12 23:40:51.814247 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Mar 12 23:40:51.814322 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Mar 12 23:40:51.814383 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Mar 12 23:40:51.814442 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Mar 12 23:40:51.814502 kernel: pci 0000:00:05.0: bridge window [mem 
0x14000000-0x141fffff]: assigned Mar 12 23:40:51.814560 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Mar 12 23:40:51.814619 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Mar 12 23:40:51.814678 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Mar 12 23:40:51.814741 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Mar 12 23:40:51.814800 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Mar 12 23:40:51.814871 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Mar 12 23:40:51.814931 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Mar 12 23:40:51.814991 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Mar 12 23:40:51.815049 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Mar 12 23:40:51.815108 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Mar 12 23:40:51.815167 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Mar 12 23:40:51.815229 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Mar 12 23:40:51.815301 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Mar 12 23:40:51.815361 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Mar 12 23:40:51.815420 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Mar 12 23:40:51.815478 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Mar 12 23:40:51.815537 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Mar 12 23:40:51.815596 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Mar 12 23:40:51.815654 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Mar 12 23:40:51.815716 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Mar 12 23:40:51.815774 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned 
Mar 12 23:40:51.815834 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Mar 12 23:40:51.815893 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Mar 12 23:40:51.815952 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Mar 12 23:40:51.816013 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Mar 12 23:40:51.816073 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Mar 12 23:40:51.816131 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Mar 12 23:40:51.816193 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Mar 12 23:40:51.816252 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Mar 12 23:40:51.816328 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Mar 12 23:40:51.816388 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Mar 12 23:40:51.816448 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Mar 12 23:40:51.816510 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.816569 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.816630 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Mar 12 23:40:51.816695 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.816759 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.816818 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Mar 12 23:40:51.816877 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.816937 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.816998 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Mar 12 23:40:51.817056 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; 
no space Mar 12 23:40:51.817118 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.817178 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Mar 12 23:40:51.817237 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.817307 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.817369 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Mar 12 23:40:51.817428 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.817495 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.817556 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Mar 12 23:40:51.817615 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.817674 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.817735 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Mar 12 23:40:51.817795 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.817857 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.817922 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Mar 12 23:40:51.817984 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.818046 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.818108 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Mar 12 23:40:51.818184 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.818246 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.818315 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Mar 12 23:40:51.818376 
kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.818434 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.818494 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Mar 12 23:40:51.818553 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.818617 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.818677 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Mar 12 23:40:51.818735 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.818795 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.818865 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Mar 12 23:40:51.818926 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.818989 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.819060 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Mar 12 23:40:51.819124 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.819192 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.819254 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Mar 12 23:40:51.819322 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.819384 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.819444 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Mar 12 23:40:51.819503 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.819561 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.819636 kernel: pci 0000:00:05.0: 
BAR 0 [mem 0x14220000-0x14220fff]: assigned Mar 12 23:40:51.819694 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.819755 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.819815 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Mar 12 23:40:51.819873 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Mar 12 23:40:51.819932 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Mar 12 23:40:51.819991 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Mar 12 23:40:51.820051 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Mar 12 23:40:51.820111 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Mar 12 23:40:51.820175 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Mar 12 23:40:51.820236 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Mar 12 23:40:51.820310 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Mar 12 23:40:51.820372 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Mar 12 23:40:51.820433 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Mar 12 23:40:51.820494 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Mar 12 23:40:51.820555 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Mar 12 23:40:51.820616 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Mar 12 23:40:51.820679 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Mar 12 23:40:51.820740 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.820802 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.820864 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.820926 kernel: pci 0000:00:03.0: bridge window [io 
size 0x1000]: failed to assign Mar 12 23:40:51.820988 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821062 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821125 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821189 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821251 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821323 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821385 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821444 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821504 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821563 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821625 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821691 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821753 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821821 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.821883 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.821946 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822009 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822069 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822137 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822197 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822257 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822333 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822407 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822469 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822531 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822598 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822659 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822718 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822777 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822838 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.822913 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Mar 12 23:40:51.822973 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Mar 12 23:40:51.823040 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Mar 12 23:40:51.823102 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Mar 12 23:40:51.823163 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Mar 12 23:40:51.823223 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Mar 12 23:40:51.823300 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Mar 12 23:40:51.823363 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Mar 12 23:40:51.823434 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Mar 12 23:40:51.823494 kernel: pci 0000:00:01.1: PCI bridge 
to [bus 02] Mar 12 23:40:51.823554 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Mar 12 23:40:51.823614 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Mar 12 23:40:51.823681 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Mar 12 23:40:51.823745 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Mar 12 23:40:51.823808 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Mar 12 23:40:51.823867 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Mar 12 23:40:51.823928 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Mar 12 23:40:51.823996 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Mar 12 23:40:51.824056 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Mar 12 23:40:51.824117 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Mar 12 23:40:51.824176 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Mar 12 23:40:51.824241 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Mar 12 23:40:51.824340 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Mar 12 23:40:51.824403 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Mar 12 23:40:51.824463 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Mar 12 23:40:51.824530 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Mar 12 23:40:51.824598 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Mar 12 23:40:51.824662 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Mar 12 23:40:51.824721 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Mar 12 23:40:51.824781 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Mar 12 23:40:51.824844 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 
12 23:40:51.824903 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Mar 12 23:40:51.824967 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Mar 12 23:40:51.825028 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 12 23:40:51.825088 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Mar 12 23:40:51.825148 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Mar 12 23:40:51.825209 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 12 23:40:51.825296 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Mar 12 23:40:51.825363 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Mar 12 23:40:51.825424 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Mar 12 23:40:51.825483 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Mar 12 23:40:51.825543 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Mar 12 23:40:51.825603 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Mar 12 23:40:51.825667 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Mar 12 23:40:51.825727 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Mar 12 23:40:51.825787 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Mar 12 23:40:51.825855 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Mar 12 23:40:51.825916 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Mar 12 23:40:51.825977 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Mar 12 23:40:51.826039 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Mar 12 23:40:51.826098 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Mar 12 23:40:51.826157 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Mar 12 23:40:51.826218 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Mar 12 23:40:51.826294 kernel: pci 0000:00:02.5: 
bridge window [mem 0x11a00000-0x11bfffff] Mar 12 23:40:51.826358 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Mar 12 23:40:51.826423 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Mar 12 23:40:51.826491 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Mar 12 23:40:51.826553 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Mar 12 23:40:51.826615 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Mar 12 23:40:51.826675 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Mar 12 23:40:51.826734 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Mar 12 23:40:51.826794 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Mar 12 23:40:51.826922 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Mar 12 23:40:51.827001 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Mar 12 23:40:51.827063 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Mar 12 23:40:51.827124 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff] Mar 12 23:40:51.827221 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Mar 12 23:40:51.827302 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Mar 12 23:40:51.827364 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Mar 12 23:40:51.827423 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Mar 12 23:40:51.827508 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Mar 12 23:40:51.827575 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Mar 12 23:40:51.827634 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Mar 12 23:40:51.827696 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Mar 12 23:40:51.827758 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Mar 12 23:40:51.827819 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] 
Mar 12 23:40:51.827879 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Mar 12 23:40:51.827938 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Mar 12 23:40:51.827996 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Mar 12 23:40:51.828059 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Mar 12 23:40:51.828120 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Mar 12 23:40:51.828182 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Mar 12 23:40:51.828242 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Mar 12 23:40:51.828313 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Mar 12 23:40:51.828374 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Mar 12 23:40:51.828433 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Mar 12 23:40:51.828492 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Mar 12 23:40:51.828552 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Mar 12 23:40:51.828614 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Mar 12 23:40:51.828674 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Mar 12 23:40:51.828733 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Mar 12 23:40:51.828794 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Mar 12 23:40:51.828855 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Mar 12 23:40:51.828916 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Mar 12 23:40:51.828978 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Mar 12 23:40:51.829041 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Mar 12 23:40:51.829105 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Mar 12 23:40:51.829182 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Mar 12 23:40:51.829243 kernel: pci 0000:00:04.1: bridge window [mem 
0x8003200000-0x80033fffff 64bit pref] Mar 12 23:40:51.829314 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Mar 12 23:40:51.829377 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Mar 12 23:40:51.829438 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Mar 12 23:40:51.829498 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Mar 12 23:40:51.829560 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Mar 12 23:40:51.829640 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Mar 12 23:40:51.829700 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Mar 12 23:40:51.829760 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Mar 12 23:40:51.829820 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Mar 12 23:40:51.829878 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Mar 12 23:40:51.829942 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Mar 12 23:40:51.830009 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Mar 12 23:40:51.830071 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Mar 12 23:40:51.830131 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Mar 12 23:40:51.830193 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Mar 12 23:40:51.830252 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Mar 12 23:40:51.830326 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Mar 12 23:40:51.830393 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Mar 12 23:40:51.830454 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Mar 12 23:40:51.830513 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Mar 12 23:40:51.830575 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Mar 12 23:40:51.830636 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Mar 12 23:40:51.830702 kernel: pci 0000:00:04.7: 
bridge window [mem 0x13e00000-0x13ffffff] Mar 12 23:40:51.830763 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Mar 12 23:40:51.830825 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Mar 12 23:40:51.830895 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Mar 12 23:40:51.830961 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Mar 12 23:40:51.831021 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Mar 12 23:40:51.831083 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 12 23:40:51.831136 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 12 23:40:51.831193 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 12 23:40:51.831258 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Mar 12 23:40:51.831330 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Mar 12 23:40:51.831398 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Mar 12 23:40:51.831457 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Mar 12 23:40:51.831520 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Mar 12 23:40:51.831581 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Mar 12 23:40:51.831645 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Mar 12 23:40:51.831702 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Mar 12 23:40:51.831764 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Mar 12 23:40:51.831820 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Mar 12 23:40:51.831882 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Mar 12 23:40:51.831940 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Mar 12 23:40:51.832008 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] 
Mar 12 23:40:51.832065 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Mar 12 23:40:51.832126 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Mar 12 23:40:51.832186 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Mar 12 23:40:51.832248 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Mar 12 23:40:51.832325 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Mar 12 23:40:51.832390 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Mar 12 23:40:51.832446 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Mar 12 23:40:51.832509 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Mar 12 23:40:51.832564 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Mar 12 23:40:51.832628 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Mar 12 23:40:51.832688 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Mar 12 23:40:51.832753 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Mar 12 23:40:51.832810 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Mar 12 23:40:51.832873 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Mar 12 23:40:51.832943 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Mar 12 23:40:51.833005 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Mar 12 23:40:51.833063 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Mar 12 23:40:51.833134 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Mar 12 23:40:51.833192 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Mar 12 23:40:51.833256 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Mar 12 23:40:51.833328 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Mar 12 
23:40:51.833390 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Mar 12 23:40:51.833448 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Mar 12 23:40:51.833508 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Mar 12 23:40:51.833564 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Mar 12 23:40:51.833617 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Mar 12 23:40:51.833678 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Mar 12 23:40:51.833733 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Mar 12 23:40:51.833795 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Mar 12 23:40:51.833858 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Mar 12 23:40:51.833916 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Mar 12 23:40:51.833971 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Mar 12 23:40:51.834032 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Mar 12 23:40:51.834087 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Mar 12 23:40:51.834141 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Mar 12 23:40:51.834203 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Mar 12 23:40:51.834258 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Mar 12 23:40:51.834347 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Mar 12 23:40:51.834414 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Mar 12 23:40:51.834470 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Mar 12 23:40:51.834525 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Mar 12 23:40:51.834586 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Mar 12 23:40:51.834644 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Mar 12 23:40:51.834698 kernel: pci_bus 0000:19: 
resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Mar 12 23:40:51.834759 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Mar 12 23:40:51.834815 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Mar 12 23:40:51.834883 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Mar 12 23:40:51.834946 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Mar 12 23:40:51.835001 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Mar 12 23:40:51.835058 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Mar 12 23:40:51.835121 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Mar 12 23:40:51.835178 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Mar 12 23:40:51.835232 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Mar 12 23:40:51.835309 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Mar 12 23:40:51.835368 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Mar 12 23:40:51.835422 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Mar 12 23:40:51.835485 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Mar 12 23:40:51.835541 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Mar 12 23:40:51.835595 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Mar 12 23:40:51.835656 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Mar 12 23:40:51.835711 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Mar 12 23:40:51.835766 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Mar 12 23:40:51.835829 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Mar 12 23:40:51.835884 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Mar 12 23:40:51.835941 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Mar 12 23:40:51.836009 kernel: pci_bus 0000:21: resource 0 [io 
0x1000-0x1fff] Mar 12 23:40:51.836065 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Mar 12 23:40:51.836120 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Mar 12 23:40:51.836129 kernel: iommu: Default domain type: Translated Mar 12 23:40:51.836139 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 12 23:40:51.836147 kernel: efivars: Registered efivars operations Mar 12 23:40:51.836154 kernel: vgaarb: loaded Mar 12 23:40:51.836162 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 12 23:40:51.836169 kernel: VFS: Disk quotas dquot_6.6.0 Mar 12 23:40:51.836177 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 12 23:40:51.836184 kernel: pnp: PnP ACPI init Mar 12 23:40:51.836251 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 12 23:40:51.836262 kernel: pnp: PnP ACPI: found 1 devices Mar 12 23:40:51.836282 kernel: NET: Registered PF_INET protocol family Mar 12 23:40:51.836290 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 12 23:40:51.836297 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Mar 12 23:40:51.836305 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 12 23:40:51.836313 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 12 23:40:51.836320 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Mar 12 23:40:51.836328 kernel: TCP: Hash tables configured (established 131072 bind 65536) Mar 12 23:40:51.836335 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Mar 12 23:40:51.836343 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Mar 12 23:40:51.836351 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 12 23:40:51.836420 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Mar 12 23:40:51.836430 
kernel: PCI: CLS 0 bytes, default 64 Mar 12 23:40:51.836438 kernel: kvm [1]: HYP mode not available Mar 12 23:40:51.836445 kernel: Initialise system trusted keyrings Mar 12 23:40:51.836452 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Mar 12 23:40:51.836460 kernel: Key type asymmetric registered Mar 12 23:40:51.836467 kernel: Asymmetric key parser 'x509' registered Mar 12 23:40:51.836474 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 12 23:40:51.836484 kernel: io scheduler mq-deadline registered Mar 12 23:40:51.836491 kernel: io scheduler kyber registered Mar 12 23:40:51.836498 kernel: io scheduler bfq registered Mar 12 23:40:51.836506 kernel: ACPI: \_SB_.L001: Enabled at IRQ 36 Mar 12 23:40:51.836567 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Mar 12 23:40:51.836628 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Mar 12 23:40:51.836687 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.836748 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Mar 12 23:40:51.836811 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Mar 12 23:40:51.836871 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.836932 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Mar 12 23:40:51.836992 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Mar 12 23:40:51.837053 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.837114 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Mar 12 23:40:51.837173 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Mar 12 23:40:51.837231 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.837309 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Mar 12 23:40:51.837371 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Mar 12 23:40:51.837430 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.837491 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Mar 12 23:40:51.837552 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Mar 12 23:40:51.837613 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.837674 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Mar 12 23:40:51.837733 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Mar 12 23:40:51.837795 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.837857 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Mar 12 23:40:51.837917 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Mar 12 23:40:51.837977 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.837987 kernel: ACPI: \_SB_.L002: Enabled at IRQ 37 Mar 12 23:40:51.838046 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Mar 12 23:40:51.838105 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Mar 12 23:40:51.838164 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.838227 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Mar 12 23:40:51.838299 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Mar 12 23:40:51.838360 kernel: pcieport 0000:00:02.1: 
pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.838421 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Mar 12 23:40:51.838482 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Mar 12 23:40:51.838541 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.838602 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Mar 12 23:40:51.838667 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Mar 12 23:40:51.838726 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.838787 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Mar 12 23:40:51.838859 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Mar 12 23:40:51.838923 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.838985 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Mar 12 23:40:51.839044 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Mar 12 23:40:51.839104 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.839170 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Mar 12 23:40:51.839231 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Mar 12 23:40:51.839300 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.839365 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Mar 12 23:40:51.839426 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Mar 12 23:40:51.839486 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 
AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.839497 kernel: ACPI: \_SB_.L003: Enabled at IRQ 38 Mar 12 23:40:51.839555 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Mar 12 23:40:51.839619 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Mar 12 23:40:51.839679 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.839740 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Mar 12 23:40:51.839799 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Mar 12 23:40:51.839858 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.839919 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Mar 12 23:40:51.839979 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Mar 12 23:40:51.840038 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.840100 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Mar 12 23:40:51.840159 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Mar 12 23:40:51.840218 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.840292 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Mar 12 23:40:51.840353 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Mar 12 23:40:51.840412 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.840471 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Mar 12 23:40:51.840533 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Mar 12 23:40:51.840592 
kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.840655 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Mar 12 23:40:51.840716 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Mar 12 23:40:51.840776 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.840837 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Mar 12 23:40:51.840896 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Mar 12 23:40:51.840957 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.840969 kernel: ACPI: \_SB_.L000: Enabled at IRQ 35 Mar 12 23:40:51.841029 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Mar 12 23:40:51.841089 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Mar 12 23:40:51.841148 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.841208 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Mar 12 23:40:51.841278 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Mar 12 23:40:51.841341 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.841404 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Mar 12 23:40:51.841467 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Mar 12 23:40:51.841526 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.841586 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Mar 12 23:40:51.841645 kernel: pcieport 0000:00:04.3: 
AER: enabled with IRQ 77 Mar 12 23:40:51.841704 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.841765 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Mar 12 23:40:51.841824 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Mar 12 23:40:51.841884 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.841949 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Mar 12 23:40:51.842008 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Mar 12 23:40:51.842066 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.842130 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Mar 12 23:40:51.842189 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Mar 12 23:40:51.842249 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.842322 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Mar 12 23:40:51.842387 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Mar 12 23:40:51.842446 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.842508 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Mar 12 23:40:51.842570 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Mar 12 23:40:51.842631 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 23:40:51.842641 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 12 23:40:51.842649 kernel: ACPI: 
button: Power Button [PWRB] Mar 12 23:40:51.842713 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Mar 12 23:40:51.842782 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Mar 12 23:40:51.842792 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 12 23:40:51.842799 kernel: thunder_xcv, ver 1.0 Mar 12 23:40:51.842807 kernel: thunder_bgx, ver 1.0 Mar 12 23:40:51.842814 kernel: nicpf, ver 1.0 Mar 12 23:40:51.842822 kernel: nicvf, ver 1.0 Mar 12 23:40:51.842912 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 12 23:40:51.842975 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-03-12T23:40:51 UTC (1773358851) Mar 12 23:40:51.842998 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 12 23:40:51.843007 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Mar 12 23:40:51.843015 kernel: watchdog: NMI not fully supported Mar 12 23:40:51.843023 kernel: watchdog: Hard watchdog permanently disabled Mar 12 23:40:51.843030 kernel: NET: Registered PF_INET6 protocol family Mar 12 23:40:51.843039 kernel: Segment Routing with IPv6 Mar 12 23:40:51.843046 kernel: In-situ OAM (IOAM) with IPv6 Mar 12 23:40:51.843054 kernel: NET: Registered PF_PACKET protocol family Mar 12 23:40:51.843062 kernel: Key type dns_resolver registered Mar 12 23:40:51.843070 kernel: registered taskstats version 1 Mar 12 23:40:51.843079 kernel: Loading compiled-in X.509 certificates Mar 12 23:40:51.843086 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 653709f5ad64856a37b70c07139630123477ee1c' Mar 12 23:40:51.843094 kernel: Demotion targets for Node 0: null Mar 12 23:40:51.843102 kernel: Key type .fscrypt registered Mar 12 23:40:51.843111 kernel: Key type fscrypt-provisioning registered Mar 12 23:40:51.843119 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 12 23:40:51.843126 kernel: ima: Allocated hash algorithm: sha1 Mar 12 23:40:51.843134 kernel: ima: No architecture policies found Mar 12 23:40:51.843143 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 12 23:40:51.843151 kernel: clk: Disabling unused clocks Mar 12 23:40:51.843159 kernel: PM: genpd: Disabling unused power domains Mar 12 23:40:51.843166 kernel: Warning: unable to open an initial console. Mar 12 23:40:51.843174 kernel: Freeing unused kernel memory: 39552K Mar 12 23:40:51.843182 kernel: Run /init as init process Mar 12 23:40:51.843190 kernel: with arguments: Mar 12 23:40:51.843198 kernel: /init Mar 12 23:40:51.843205 kernel: with environment: Mar 12 23:40:51.843214 kernel: HOME=/ Mar 12 23:40:51.843222 kernel: TERM=linux Mar 12 23:40:51.843231 systemd[1]: Successfully made /usr/ read-only. Mar 12 23:40:51.843242 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 12 23:40:51.843252 systemd[1]: Detected virtualization kvm. Mar 12 23:40:51.843260 systemd[1]: Detected architecture arm64. Mar 12 23:40:51.843280 systemd[1]: Running in initrd. Mar 12 23:40:51.843290 systemd[1]: No hostname configured, using default hostname. Mar 12 23:40:51.843298 systemd[1]: Hostname set to . Mar 12 23:40:51.843306 systemd[1]: Initializing machine ID from VM UUID. Mar 12 23:40:51.843314 systemd[1]: Queued start job for default target initrd.target. Mar 12 23:40:51.843323 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:40:51.843331 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 12 23:40:51.843339 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 12 23:40:51.843348 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 23:40:51.843357 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 12 23:40:51.843366 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 12 23:40:51.843375 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 12 23:40:51.843384 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 12 23:40:51.843392 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 23:40:51.843400 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:40:51.843408 systemd[1]: Reached target paths.target - Path Units. Mar 12 23:40:51.843418 systemd[1]: Reached target slices.target - Slice Units. Mar 12 23:40:51.843426 systemd[1]: Reached target swap.target - Swaps. Mar 12 23:40:51.843434 systemd[1]: Reached target timers.target - Timer Units. Mar 12 23:40:51.843442 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 23:40:51.843450 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 23:40:51.843458 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 12 23:40:51.843466 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 12 23:40:51.843475 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 12 23:40:51.843483 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 12 23:40:51.843492 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 12 23:40:51.843500 systemd[1]: Reached target sockets.target - Socket Units. Mar 12 23:40:51.843508 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 12 23:40:51.843517 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 12 23:40:51.843525 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 12 23:40:51.843534 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 12 23:40:51.843542 systemd[1]: Starting systemd-fsck-usr.service... Mar 12 23:40:51.843551 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 12 23:40:51.843561 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 12 23:40:51.843569 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:40:51.843577 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 12 23:40:51.843586 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:40:51.843594 systemd[1]: Finished systemd-fsck-usr.service. Mar 12 23:40:51.843604 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 12 23:40:51.843636 systemd-journald[312]: Collecting audit messages is disabled. Mar 12 23:40:51.843656 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:40:51.843666 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 12 23:40:51.843675 kernel: Bridge firewalling registered Mar 12 23:40:51.843683 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 12 23:40:51.843692 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Mar 12 23:40:51.843700 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 12 23:40:51.843708 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 12 23:40:51.843716 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 12 23:40:51.843725 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 23:40:51.843734 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 12 23:40:51.843743 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 12 23:40:51.843752 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 23:40:51.843762 systemd-journald[312]: Journal started Mar 12 23:40:51.843780 systemd-journald[312]: Runtime Journal (/run/log/journal/1d04046693844922b913f8388591d77d) is 8M, max 319.5M, 311.5M free. Mar 12 23:40:51.784292 systemd-modules-load[313]: Inserted module 'overlay' Mar 12 23:40:51.799944 systemd-modules-load[313]: Inserted module 'br_netfilter' Mar 12 23:40:51.846380 systemd[1]: Started systemd-journald.service - Journal Service. Mar 12 23:40:51.848364 dracut-cmdline[344]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=9bf054737b516803a47d5bd373cc1c618bc257c93cef3d2e2bc09897e693383d Mar 12 23:40:51.851231 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 12 23:40:51.864773 systemd-tmpfiles[371]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Mar 12 23:40:51.868510 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 23:40:51.871830 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 12 23:40:51.903052 systemd-resolved[400]: Positive Trust Anchors: Mar 12 23:40:51.903071 systemd-resolved[400]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 12 23:40:51.903103 systemd-resolved[400]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 12 23:40:51.908597 systemd-resolved[400]: Defaulting to hostname 'linux'. Mar 12 23:40:51.909560 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 12 23:40:51.911532 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:40:51.924321 kernel: SCSI subsystem initialized Mar 12 23:40:51.928294 kernel: Loading iSCSI transport class v2.0-870. Mar 12 23:40:51.936293 kernel: iscsi: registered transport (tcp) Mar 12 23:40:51.949306 kernel: iscsi: registered transport (qla4xxx) Mar 12 23:40:51.949363 kernel: QLogic iSCSI HBA Driver Mar 12 23:40:51.966033 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 12 23:40:51.992843 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 12 23:40:51.995363 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Mar 12 23:40:52.038337 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 12 23:40:52.040425 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 12 23:40:52.100332 kernel: raid6: neonx8 gen() 15728 MB/s Mar 12 23:40:52.117284 kernel: raid6: neonx4 gen() 15763 MB/s Mar 12 23:40:52.134307 kernel: raid6: neonx2 gen() 13173 MB/s Mar 12 23:40:52.151305 kernel: raid6: neonx1 gen() 10441 MB/s Mar 12 23:40:52.168307 kernel: raid6: int64x8 gen() 6884 MB/s Mar 12 23:40:52.185282 kernel: raid6: int64x4 gen() 7322 MB/s Mar 12 23:40:52.202306 kernel: raid6: int64x2 gen() 6090 MB/s Mar 12 23:40:52.219309 kernel: raid6: int64x1 gen() 5043 MB/s Mar 12 23:40:52.219362 kernel: raid6: using algorithm neonx4 gen() 15763 MB/s Mar 12 23:40:52.236308 kernel: raid6: .... xor() 12313 MB/s, rmw enabled Mar 12 23:40:52.236342 kernel: raid6: using neon recovery algorithm Mar 12 23:40:52.241523 kernel: xor: measuring software checksum speed Mar 12 23:40:52.241567 kernel: 8regs : 21618 MB/sec Mar 12 23:40:52.242644 kernel: 32regs : 21293 MB/sec Mar 12 23:40:52.242659 kernel: arm64_neon : 28138 MB/sec Mar 12 23:40:52.242669 kernel: xor: using function: arm64_neon (28138 MB/sec) Mar 12 23:40:52.296328 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 12 23:40:52.302231 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 12 23:40:52.304645 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 23:40:52.335609 systemd-udevd[568]: Using default interface naming scheme 'v255'. Mar 12 23:40:52.339664 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 23:40:52.341423 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 12 23:40:52.367312 dracut-pre-trigger[576]: rd.md=0: removing MD RAID activation Mar 12 23:40:52.389415 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Mar 12 23:40:52.391207 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 12 23:40:52.468764 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 23:40:52.472426 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 12 23:40:52.518302 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Mar 12 23:40:52.524847 kernel: ACPI: bus type USB registered Mar 12 23:40:52.534586 kernel: usbcore: registered new interface driver usbfs Mar 12 23:40:52.534634 kernel: usbcore: registered new interface driver hub Mar 12 23:40:52.540322 kernel: usbcore: registered new device driver usb Mar 12 23:40:52.552297 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Mar 12 23:40:52.555899 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 12 23:40:52.556082 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 12 23:40:52.557486 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 12 23:40:52.558607 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 12 23:40:52.558744 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 12 23:40:52.559415 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 12 23:40:52.559441 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 12 23:40:52.560732 kernel: GPT:17805311 != 104857599 Mar 12 23:40:52.561399 kernel: hub 1-0:1.0: USB hub found Mar 12 23:40:52.561527 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 12 23:40:52.561539 kernel: hub 1-0:1.0: 4 ports detected Mar 12 23:40:52.562335 kernel: GPT:17805311 != 104857599 Mar 12 23:40:52.563509 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 12 23:40:52.563571 kernel: GPT: Use GNU Parted to correct GPT errors. 
Mar 12 23:40:52.564373 kernel: hub 2-0:1.0: USB hub found Mar 12 23:40:52.564512 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 23:40:52.564523 kernel: hub 2-0:1.0: 4 ports detected Mar 12 23:40:52.568561 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 12 23:40:52.569607 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:40:52.571737 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:40:52.573737 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 23:40:52.615528 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 12 23:40:52.618293 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:40:52.629434 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 12 23:40:52.637248 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 12 23:40:52.639949 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 12 23:40:52.648308 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 12 23:40:52.655970 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 12 23:40:52.657024 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 23:40:52.658965 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:40:52.661038 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 23:40:52.663738 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 12 23:40:52.665317 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... 
Mar 12 23:40:52.691996 disk-uuid[667]: Primary Header is updated. Mar 12 23:40:52.691996 disk-uuid[667]: Secondary Entries is updated. Mar 12 23:40:52.691996 disk-uuid[667]: Secondary Header is updated. Mar 12 23:40:52.695890 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 12 23:40:52.698290 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 23:40:52.803307 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 12 23:40:52.933771 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Mar 12 23:40:52.933815 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 12 23:40:52.933966 kernel: usbcore: registered new interface driver usbhid Mar 12 23:40:52.934292 kernel: usbhid: USB HID core driver Mar 12 23:40:53.040316 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Mar 12 23:40:53.165308 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Mar 12 23:40:53.217339 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Mar 12 23:40:53.710759 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 23:40:53.710807 disk-uuid[672]: The operation has completed successfully. Mar 12 23:40:53.750144 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 12 23:40:53.750244 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 12 23:40:53.776096 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 12 23:40:53.793826 sh[689]: Success Mar 12 23:40:53.807220 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 12 23:40:53.807260 kernel: device-mapper: uevent: version 1.0.3 Mar 12 23:40:53.807290 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 12 23:40:53.814291 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Mar 12 23:40:53.859234 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 12 23:40:53.862035 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 12 23:40:53.878527 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 12 23:40:53.889303 kernel: BTRFS: device fsid fcbb17b2-5053-44fc-82f0-b24e4919d6d8 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (701) Mar 12 23:40:53.891284 kernel: BTRFS info (device dm-0): first mount of filesystem fcbb17b2-5053-44fc-82f0-b24e4919d6d8 Mar 12 23:40:53.891356 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:40:53.902336 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 12 23:40:53.902392 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 12 23:40:53.904033 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 12 23:40:53.905245 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 12 23:40:53.906413 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 12 23:40:53.907293 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 12 23:40:53.908640 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 12 23:40:53.937296 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (734) Mar 12 23:40:53.939407 kernel: BTRFS info (device vda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:40:53.939441 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:40:53.943631 kernel: BTRFS info (device vda6): turning on async discard Mar 12 23:40:53.943663 kernel: BTRFS info (device vda6): enabling free space tree Mar 12 23:40:53.947376 kernel: BTRFS info (device vda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:40:53.948936 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 12 23:40:53.950712 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 12 23:40:54.002010 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 23:40:54.005513 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 12 23:40:54.047128 systemd-networkd[872]: lo: Link UP Mar 12 23:40:54.047142 systemd-networkd[872]: lo: Gained carrier Mar 12 23:40:54.048095 systemd-networkd[872]: Enumeration completed Mar 12 23:40:54.048355 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 12 23:40:54.048521 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:40:54.048524 systemd-networkd[872]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 12 23:40:54.049229 systemd-networkd[872]: eth0: Link UP Mar 12 23:40:54.049378 systemd-networkd[872]: eth0: Gained carrier Mar 12 23:40:54.049387 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:40:54.049415 systemd[1]: Reached target network.target - Network. 
Mar 12 23:40:54.071323 systemd-networkd[872]: eth0: DHCPv4 address 10.0.4.241/25, gateway 10.0.4.129 acquired from 10.0.4.129 Mar 12 23:40:54.087213 ignition[791]: Ignition 2.22.0 Mar 12 23:40:54.087229 ignition[791]: Stage: fetch-offline Mar 12 23:40:54.087279 ignition[791]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:40:54.087288 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:40:54.089991 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 23:40:54.087377 ignition[791]: parsed url from cmdline: "" Mar 12 23:40:54.091974 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 12 23:40:54.087380 ignition[791]: no config URL provided Mar 12 23:40:54.087384 ignition[791]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 23:40:54.087391 ignition[791]: no config at "/usr/lib/ignition/user.ign" Mar 12 23:40:54.087395 ignition[791]: failed to fetch config: resource requires networking Mar 12 23:40:54.087613 ignition[791]: Ignition finished successfully Mar 12 23:40:54.118506 ignition[886]: Ignition 2.22.0 Mar 12 23:40:54.118526 ignition[886]: Stage: fetch Mar 12 23:40:54.118666 ignition[886]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:40:54.118675 ignition[886]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:40:54.118748 ignition[886]: parsed url from cmdline: "" Mar 12 23:40:54.118751 ignition[886]: no config URL provided Mar 12 23:40:54.118755 ignition[886]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 23:40:54.118761 ignition[886]: no config at "/usr/lib/ignition/user.ign" Mar 12 23:40:54.119116 ignition[886]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 12 23:40:54.119314 ignition[886]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 12 23:40:54.119561 ignition[886]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Mar 12 23:40:55.119624 ignition[886]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 12 23:40:55.119673 ignition[886]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Mar 12 23:40:55.409494 systemd-networkd[872]: eth0: Gained IPv6LL Mar 12 23:40:55.418404 ignition[886]: GET result: OK Mar 12 23:40:55.418759 ignition[886]: parsing config with SHA512: 0fa6d01a759daa627b39d490a3cbe6efa7d261d9c232380e4e34d4fd9b4d05be66e6595ff77e9a3c5774a13550e6512a59222b22280e3f27a01c7a98eb959e12 Mar 12 23:40:55.424516 unknown[886]: fetched base config from "system" Mar 12 23:40:55.424527 unknown[886]: fetched base config from "system" Mar 12 23:40:55.424851 ignition[886]: fetch: fetch complete Mar 12 23:40:55.424532 unknown[886]: fetched user config from "openstack" Mar 12 23:40:55.424856 ignition[886]: fetch: fetch passed Mar 12 23:40:55.427105 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 12 23:40:55.424890 ignition[886]: Ignition finished successfully Mar 12 23:40:55.429168 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 12 23:40:55.456291 ignition[894]: Ignition 2.22.0 Mar 12 23:40:55.456306 ignition[894]: Stage: kargs Mar 12 23:40:55.456439 ignition[894]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:40:55.456448 ignition[894]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:40:55.457178 ignition[894]: kargs: kargs passed Mar 12 23:40:55.457219 ignition[894]: Ignition finished successfully Mar 12 23:40:55.459634 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 12 23:40:55.461414 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Mar 12 23:40:55.490067 ignition[902]: Ignition 2.22.0 Mar 12 23:40:55.490088 ignition[902]: Stage: disks Mar 12 23:40:55.490218 ignition[902]: no configs at "/usr/lib/ignition/base.d" Mar 12 23:40:55.490227 ignition[902]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:40:55.490958 ignition[902]: disks: disks passed Mar 12 23:40:55.493259 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 12 23:40:55.491005 ignition[902]: Ignition finished successfully Mar 12 23:40:55.495169 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 12 23:40:55.496553 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 12 23:40:55.498239 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 12 23:40:55.500040 systemd[1]: Reached target sysinit.target - System Initialization. Mar 12 23:40:55.501589 systemd[1]: Reached target basic.target - Basic System. Mar 12 23:40:55.503884 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 12 23:40:55.540134 systemd-fsck[913]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Mar 12 23:40:55.542353 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 12 23:40:55.545047 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 12 23:40:55.645318 kernel: EXT4-fs (vda9): mounted filesystem 4b09db19-3beb-48c2-8dcb-3eec5602206c r/w with ordered data mode. Quota mode: none. Mar 12 23:40:55.646238 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 12 23:40:55.648083 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 12 23:40:55.652441 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 23:40:55.654794 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Mar 12 23:40:55.655646 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 12 23:40:55.656213 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 12 23:40:55.658342 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 12 23:40:55.658370 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 23:40:55.669868 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 12 23:40:55.672072 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 12 23:40:55.680296 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (921) Mar 12 23:40:55.683133 kernel: BTRFS info (device vda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:40:55.683153 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:40:55.689382 kernel: BTRFS info (device vda6): turning on async discard Mar 12 23:40:55.689409 kernel: BTRFS info (device vda6): enabling free space tree Mar 12 23:40:55.690853 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 12 23:40:55.720313 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:40:55.724325 initrd-setup-root[949]: cut: /sysroot/etc/passwd: No such file or directory Mar 12 23:40:55.729012 initrd-setup-root[956]: cut: /sysroot/etc/group: No such file or directory Mar 12 23:40:55.733024 initrd-setup-root[963]: cut: /sysroot/etc/shadow: No such file or directory Mar 12 23:40:55.737738 initrd-setup-root[970]: cut: /sysroot/etc/gshadow: No such file or directory Mar 12 23:40:55.815772 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Mar 12 23:40:55.817936 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 12 23:40:55.819405 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 12 23:40:55.836709 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 12 23:40:55.838624 kernel: BTRFS info (device vda6): last unmount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:40:55.855349 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 12 23:40:55.860063 ignition[1039]: INFO : Ignition 2.22.0 Mar 12 23:40:55.860063 ignition[1039]: INFO : Stage: mount Mar 12 23:40:55.862422 ignition[1039]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:40:55.862422 ignition[1039]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:40:55.862422 ignition[1039]: INFO : mount: mount passed Mar 12 23:40:55.862422 ignition[1039]: INFO : Ignition finished successfully Mar 12 23:40:55.864004 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 12 23:40:56.748305 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:40:58.758320 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:02.763308 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:02.770914 coreos-metadata[923]: Mar 12 23:41:02.770 WARN failed to locate config-drive, using the metadata service API instead Mar 12 23:41:02.788914 coreos-metadata[923]: Mar 12 23:41:02.788 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 12 23:41:03.508433 coreos-metadata[923]: Mar 12 23:41:03.508 INFO Fetch successful Mar 12 23:41:03.509375 coreos-metadata[923]: Mar 12 23:41:03.508 INFO wrote hostname ci-4459-2-4-n-27aefdfc79 to /sysroot/etc/hostname Mar 12 23:41:03.512250 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 12 23:41:03.513250 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. 
Mar 12 23:41:03.517618 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 12 23:41:03.538076 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 23:41:03.568290 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1057) Mar 12 23:41:03.570273 kernel: BTRFS info (device vda6): first mount of filesystem 3c8fd7d8-36f6-4dc1-84ec-9e522970376b Mar 12 23:41:03.570305 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 12 23:41:03.573819 kernel: BTRFS info (device vda6): turning on async discard Mar 12 23:41:03.573851 kernel: BTRFS info (device vda6): enabling free space tree Mar 12 23:41:03.575525 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 12 23:41:03.608172 ignition[1075]: INFO : Ignition 2.22.0 Mar 12 23:41:03.608172 ignition[1075]: INFO : Stage: files Mar 12 23:41:03.609621 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:41:03.609621 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:41:03.609621 ignition[1075]: DEBUG : files: compiled without relabeling support, skipping Mar 12 23:41:03.612420 ignition[1075]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 12 23:41:03.612420 ignition[1075]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 12 23:41:03.612420 ignition[1075]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 12 23:41:03.615630 ignition[1075]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 12 23:41:03.615630 ignition[1075]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 12 23:41:03.615630 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 12 23:41:03.615630 ignition[1075]: INFO : files: createFilesystemsFiles: 
createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Mar 12 23:41:03.612890 unknown[1075]: wrote ssh authorized keys file for user: core Mar 12 23:41:08.664180 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 12 23:41:08.767138 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 23:41:08.769007 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 23:41:08.780293 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 23:41:08.780293 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 
23:41:08.780293 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 12 23:41:08.780293 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 12 23:41:08.780293 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 12 23:41:08.780293 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-arm64.raw: attempt #1 Mar 12 23:41:09.330202 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 12 23:41:11.038312 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-arm64.raw" Mar 12 23:41:11.038312 ignition[1075]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 12 23:41:11.041692 ignition[1075]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: createResultFile: createFiles: op(e): 
[started] writing file "/sysroot/etc/.ignition-result.json" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 12 23:41:11.044411 ignition[1075]: INFO : files: files passed Mar 12 23:41:11.044411 ignition[1075]: INFO : Ignition finished successfully Mar 12 23:41:11.044769 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 12 23:41:11.048110 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 12 23:41:11.050861 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 12 23:41:11.061158 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 12 23:41:11.061287 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 12 23:41:11.066222 initrd-setup-root-after-ignition[1106]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:41:11.066222 initrd-setup-root-after-ignition[1106]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:41:11.069452 initrd-setup-root-after-ignition[1110]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 23:41:11.070029 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 23:41:11.072034 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 12 23:41:11.074147 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 12 23:41:11.105667 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 12 23:41:11.105776 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 12 23:41:11.107562 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 12 23:41:11.109019 systemd[1]: Reached target initrd.target - Initrd Default Target. 
Mar 12 23:41:11.110680 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 12 23:41:11.111502 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 12 23:41:11.138123 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 23:41:11.140466 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 12 23:41:11.159483 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:41:11.160551 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:41:11.162272 systemd[1]: Stopped target timers.target - Timer Units. Mar 12 23:41:11.163795 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 12 23:41:11.163921 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 23:41:11.166033 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 12 23:41:11.167673 systemd[1]: Stopped target basic.target - Basic System. Mar 12 23:41:11.169012 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 12 23:41:11.170366 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 23:41:11.172321 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 12 23:41:11.173968 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 12 23:41:11.175516 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 12 23:41:11.177363 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 23:41:11.179384 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 12 23:41:11.180958 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 12 23:41:11.182332 systemd[1]: Stopped target swap.target - Swaps. 
Mar 12 23:41:11.183646 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 12 23:41:11.183778 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 12 23:41:11.185700 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:41:11.187212 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 23:41:11.188814 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 12 23:41:11.192408 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:41:11.193779 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 12 23:41:11.193897 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 12 23:41:11.196322 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 12 23:41:11.196443 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 23:41:11.198124 systemd[1]: ignition-files.service: Deactivated successfully. Mar 12 23:41:11.198227 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 12 23:41:11.200625 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 12 23:41:11.201918 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 12 23:41:11.202037 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:41:11.204300 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 12 23:41:11.205600 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 12 23:41:11.205718 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 23:41:11.207386 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 12 23:41:11.207487 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 23:41:11.212062 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Mar 12 23:41:11.212396 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 12 23:41:11.223596 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 12 23:41:11.226657 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 12 23:41:11.226773 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 12 23:41:11.230043 ignition[1130]: INFO : Ignition 2.22.0 Mar 12 23:41:11.230043 ignition[1130]: INFO : Stage: umount Mar 12 23:41:11.232568 ignition[1130]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 23:41:11.232568 ignition[1130]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 23:41:11.232568 ignition[1130]: INFO : umount: umount passed Mar 12 23:41:11.232568 ignition[1130]: INFO : Ignition finished successfully Mar 12 23:41:11.233554 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 12 23:41:11.233691 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 12 23:41:11.235199 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 12 23:41:11.235245 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 12 23:41:11.236358 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 12 23:41:11.236397 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 12 23:41:11.238191 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 12 23:41:11.238229 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 12 23:41:11.239626 systemd[1]: Stopped target network.target - Network. Mar 12 23:41:11.240896 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 12 23:41:11.240945 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 23:41:11.242370 systemd[1]: Stopped target paths.target - Path Units. Mar 12 23:41:11.243677 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
Mar 12 23:41:11.244334 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 23:41:11.245286 systemd[1]: Stopped target slices.target - Slice Units. Mar 12 23:41:11.246694 systemd[1]: Stopped target sockets.target - Socket Units. Mar 12 23:41:11.248022 systemd[1]: iscsid.socket: Deactivated successfully. Mar 12 23:41:11.248062 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 23:41:11.249466 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 12 23:41:11.249500 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 23:41:11.250764 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 12 23:41:11.250820 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 12 23:41:11.252155 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 12 23:41:11.252192 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 12 23:41:11.253734 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 12 23:41:11.253776 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 12 23:41:11.255443 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 12 23:41:11.257179 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 12 23:41:11.268053 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 12 23:41:11.268183 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 12 23:41:11.271455 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 12 23:41:11.271654 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 12 23:41:11.271755 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 12 23:41:11.274995 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
Mar 12 23:41:11.275511 systemd[1]: Stopped target network-pre.target - Preparation for Network. Mar 12 23:41:11.277159 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 12 23:41:11.277205 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 12 23:41:11.279592 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 12 23:41:11.280953 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 12 23:41:11.281007 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 23:41:11.282599 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 12 23:41:11.282640 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 12 23:41:11.285108 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 12 23:41:11.285148 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 12 23:41:11.286699 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 12 23:41:11.286738 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 23:41:11.289024 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 23:41:11.291581 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 12 23:41:11.291641 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 12 23:41:11.309559 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 12 23:41:11.309668 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 12 23:41:11.311635 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 12 23:41:11.311746 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 23:41:11.313477 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Mar 12 23:41:11.313537 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 12 23:41:11.314495 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 12 23:41:11.314526 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 23:41:11.316147 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 12 23:41:11.316191 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 12 23:41:11.318943 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 12 23:41:11.318988 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 12 23:41:11.321120 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 12 23:41:11.321184 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 23:41:11.324350 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 12 23:41:11.325843 systemd[1]: systemd-network-generator.service: Deactivated successfully. Mar 12 23:41:11.325898 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Mar 12 23:41:11.328410 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 12 23:41:11.328451 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 23:41:11.331215 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 12 23:41:11.331256 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:41:11.335146 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Mar 12 23:41:11.335195 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 12 23:41:11.335225 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 12 23:41:11.358456 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 12 23:41:11.359357 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 12 23:41:11.360474 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 12 23:41:11.362719 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 12 23:41:11.395980 systemd[1]: Switching root. Mar 12 23:41:11.434197 systemd-journald[312]: Journal stopped Mar 12 23:41:12.958016 systemd-journald[312]: Received SIGTERM from PID 1 (systemd). Mar 12 23:41:12.958311 kernel: SELinux: policy capability network_peer_controls=1 Mar 12 23:41:12.958353 kernel: SELinux: policy capability open_perms=1 Mar 12 23:41:12.958372 kernel: SELinux: policy capability extended_socket_class=1 Mar 12 23:41:12.958385 kernel: SELinux: policy capability always_check_network=0 Mar 12 23:41:12.958395 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 12 23:41:12.958405 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 12 23:41:12.958414 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 12 23:41:12.958427 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 12 23:41:12.958443 kernel: SELinux: policy capability userspace_initial_context=0 Mar 12 23:41:12.958453 kernel: audit: type=1403 audit(1773358872.233:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 12 23:41:12.958471 systemd[1]: Successfully loaded SELinux policy in 62.975ms. Mar 12 23:41:12.958491 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.627ms. 
Mar 12 23:41:12.958502 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 12 23:41:12.958514 systemd[1]: Detected virtualization kvm. Mar 12 23:41:12.958524 systemd[1]: Detected architecture arm64. Mar 12 23:41:12.958533 systemd[1]: Detected first boot. Mar 12 23:41:12.958544 systemd[1]: Hostname set to . Mar 12 23:41:12.958555 systemd[1]: Initializing machine ID from VM UUID. Mar 12 23:41:12.958568 zram_generator::config[1176]: No configuration found. Mar 12 23:41:12.958582 kernel: NET: Registered PF_VSOCK protocol family Mar 12 23:41:12.958591 systemd[1]: Populated /etc with preset unit settings. Mar 12 23:41:12.958606 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 12 23:41:12.958616 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 12 23:41:12.958626 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 12 23:41:12.958637 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 12 23:41:12.958648 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 12 23:41:12.958661 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 12 23:41:12.958671 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 12 23:41:12.958681 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 12 23:41:12.958691 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 12 23:41:12.958701 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Mar 12 23:41:12.958711 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 12 23:41:12.958721 systemd[1]: Created slice user.slice - User and Session Slice. Mar 12 23:41:12.958732 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 23:41:12.958762 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 23:41:12.958776 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 12 23:41:12.958786 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 12 23:41:12.958796 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 12 23:41:12.958807 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 23:41:12.958817 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 12 23:41:12.958830 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 23:41:12.958840 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 12 23:41:12.958851 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 12 23:41:12.958861 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 12 23:41:12.958873 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 12 23:41:12.958883 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 12 23:41:12.958893 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 23:41:12.958908 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 23:41:12.958920 systemd[1]: Reached target slices.target - Slice Units. Mar 12 23:41:12.958930 systemd[1]: Reached target swap.target - Swaps. 
Mar 12 23:41:12.958941 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 12 23:41:12.958951 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 12 23:41:12.958964 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 12 23:41:12.958974 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 12 23:41:12.958984 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 12 23:41:12.958994 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 23:41:12.959004 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 12 23:41:12.959014 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 12 23:41:12.959025 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 12 23:41:12.959035 systemd[1]: Mounting media.mount - External Media Directory... Mar 12 23:41:12.959044 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 12 23:41:12.959054 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 12 23:41:12.959064 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 12 23:41:12.959074 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 12 23:41:12.959084 systemd[1]: Reached target machines.target - Containers. Mar 12 23:41:12.959094 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 12 23:41:12.959105 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 12 23:41:12.959115 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Mar 12 23:41:12.959126 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 12 23:41:12.959136 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 12 23:41:12.959146 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 12 23:41:12.959156 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 12 23:41:12.959165 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 12 23:41:12.959175 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 12 23:41:12.959187 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 12 23:41:12.959197 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 12 23:41:12.959207 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 12 23:41:12.959219 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 12 23:41:12.959230 systemd[1]: Stopped systemd-fsck-usr.service. Mar 12 23:41:12.959242 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 12 23:41:12.959252 kernel: fuse: init (API version 7.41) Mar 12 23:41:12.959262 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 12 23:41:12.959320 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 12 23:41:12.959331 kernel: loop: module loaded Mar 12 23:41:12.959346 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 12 23:41:12.959358 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Mar 12 23:41:12.959369 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 12 23:41:12.959379 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 12 23:41:12.959393 systemd[1]: verity-setup.service: Deactivated successfully. Mar 12 23:41:12.959403 systemd[1]: Stopped verity-setup.service. Mar 12 23:41:12.959413 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 12 23:41:12.959423 kernel: ACPI: bus type drm_connector registered Mar 12 23:41:12.959433 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 12 23:41:12.959445 systemd[1]: Mounted media.mount - External Media Directory. Mar 12 23:41:12.959455 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 12 23:41:12.959498 systemd-journald[1240]: Collecting audit messages is disabled. Mar 12 23:41:12.959531 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 12 23:41:12.959545 systemd-journald[1240]: Journal started Mar 12 23:41:12.959566 systemd-journald[1240]: Runtime Journal (/run/log/journal/1d04046693844922b913f8388591d77d) is 8M, max 319.5M, 311.5M free. Mar 12 23:41:12.740317 systemd[1]: Queued start job for default target multi-user.target. Mar 12 23:41:12.763706 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 12 23:41:12.764093 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 12 23:41:12.964359 systemd[1]: Started systemd-journald.service - Journal Service. Mar 12 23:41:12.964178 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 12 23:41:12.965345 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 12 23:41:12.968350 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 23:41:12.969618 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Mar 12 23:41:12.969793 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 12 23:41:12.971001 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 12 23:41:12.971158 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 12 23:41:12.972431 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 12 23:41:12.972596 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 12 23:41:12.973688 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 12 23:41:12.973872 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 12 23:41:12.975164 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 12 23:41:12.975369 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 12 23:41:12.976449 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 12 23:41:12.976607 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 12 23:41:12.977963 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 12 23:41:12.979204 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 12 23:41:12.980603 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 12 23:41:12.981961 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 12 23:41:12.993832 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 12 23:41:12.996014 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 12 23:41:12.997902 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 12 23:41:12.998945 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). 
Mar 12 23:41:12.998993 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 12 23:41:13.000677 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 12 23:41:13.014432 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 12 23:41:13.015378 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 12 23:41:13.018705 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 12 23:41:13.020622 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 12 23:41:13.021549 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 12 23:41:13.022453 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 12 23:41:13.023350 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 12 23:41:13.025486 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 12 23:41:13.033331 systemd-journald[1240]: Time spent on flushing to /var/log/journal/1d04046693844922b913f8388591d77d is 23.585ms for 1723 entries. Mar 12 23:41:13.033331 systemd-journald[1240]: System Journal (/var/log/journal/1d04046693844922b913f8388591d77d) is 8M, max 584.8M, 576.8M free. Mar 12 23:41:13.073154 systemd-journald[1240]: Received client request to flush runtime journal. Mar 12 23:41:13.073212 kernel: loop0: detected capacity change from 0 to 209336 Mar 12 23:41:13.032174 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 12 23:41:13.037068 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 12 23:41:13.042244 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Mar 12 23:41:13.044209 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 12 23:41:13.045473 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 12 23:41:13.051474 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 12 23:41:13.053495 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 12 23:41:13.057393 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 12 23:41:13.061443 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 12 23:41:13.075321 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 12 23:41:13.081405 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 12 23:41:13.093894 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 12 23:41:13.096646 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 12 23:41:13.097920 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 12 23:41:13.104289 kernel: loop1: detected capacity change from 0 to 119840 Mar 12 23:41:13.121941 systemd-tmpfiles[1313]: ACLs are not supported, ignoring. Mar 12 23:41:13.121962 systemd-tmpfiles[1313]: ACLs are not supported, ignoring. Mar 12 23:41:13.125341 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 23:41:13.142290 kernel: loop2: detected capacity change from 0 to 100632 Mar 12 23:41:13.185298 kernel: loop3: detected capacity change from 0 to 1632 Mar 12 23:41:13.229325 kernel: loop4: detected capacity change from 0 to 209336 Mar 12 23:41:13.317290 kernel: loop5: detected capacity change from 0 to 119840 Mar 12 23:41:13.402299 kernel: loop6: detected capacity change from 0 to 100632 Mar 12 23:41:13.456366 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
Mar 12 23:41:13.459152 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 23:41:13.489305 kernel: loop7: detected capacity change from 0 to 1632 Mar 12 23:41:13.494708 systemd-udevd[1325]: Using default interface naming scheme 'v255'. Mar 12 23:41:13.506501 (sd-merge)[1323]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Mar 12 23:41:13.506924 (sd-merge)[1323]: Merged extensions into '/usr'. Mar 12 23:41:13.510611 systemd[1]: Reload requested from client PID 1295 ('systemd-sysext') (unit systemd-sysext.service)... Mar 12 23:41:13.510623 systemd[1]: Reloading... Mar 12 23:41:13.564339 zram_generator::config[1350]: No configuration found. Mar 12 23:41:13.726760 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 12 23:41:13.727128 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 12 23:41:13.727527 systemd[1]: Reloading finished in 216 ms. Mar 12 23:41:13.738809 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 23:41:13.747690 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 12 23:41:13.759364 kernel: mousedev: PS/2 mouse device common for all mice Mar 12 23:41:13.768433 systemd[1]: Starting ensure-sysext.service... Mar 12 23:41:13.771410 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 12 23:41:13.773514 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 12 23:41:13.788455 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 12 23:41:13.797765 systemd[1]: Reload requested from client PID 1427 ('systemctl') (unit ensure-sysext.service)... Mar 12 23:41:13.797784 systemd[1]: Reloading... Mar 12 23:41:13.811694 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. 
Mar 12 23:41:13.831248 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Mar 12 23:41:13.831560 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 12 23:41:13.831772 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 12 23:41:13.832418 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 12 23:41:13.832640 systemd-tmpfiles[1429]: ACLs are not supported, ignoring. Mar 12 23:41:13.832690 systemd-tmpfiles[1429]: ACLs are not supported, ignoring. Mar 12 23:41:13.857378 systemd-tmpfiles[1429]: Detected autofs mount point /boot during canonicalization of boot. Mar 12 23:41:13.857391 systemd-tmpfiles[1429]: Skipping /boot Mar 12 23:41:13.865125 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Mar 12 23:41:13.865199 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 12 23:41:13.865214 kernel: [drm] features: -context_init Mar 12 23:41:13.865080 systemd-tmpfiles[1429]: Detected autofs mount point /boot during canonicalization of boot. Mar 12 23:41:13.865086 systemd-tmpfiles[1429]: Skipping /boot Mar 12 23:41:13.867332 kernel: [drm] number of scanouts: 1 Mar 12 23:41:13.867398 kernel: [drm] number of cap sets: 0 Mar 12 23:41:13.871294 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Mar 12 23:41:13.885833 kernel: Console: switching to colour frame buffer device 160x50 Mar 12 23:41:13.887662 zram_generator::config[1480]: No configuration found. 
Mar 12 23:41:13.895302 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 12 23:41:14.004341 systemd-networkd[1428]: lo: Link UP Mar 12 23:41:14.004355 systemd-networkd[1428]: lo: Gained carrier Mar 12 23:41:14.005366 systemd-networkd[1428]: Enumeration completed Mar 12 23:41:14.005827 systemd-networkd[1428]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:41:14.005838 systemd-networkd[1428]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 12 23:41:14.006345 systemd-networkd[1428]: eth0: Link UP Mar 12 23:41:14.006453 systemd-networkd[1428]: eth0: Gained carrier Mar 12 23:41:14.006471 systemd-networkd[1428]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 23:41:14.034414 systemd-networkd[1428]: eth0: DHCPv4 address 10.0.4.241/25, gateway 10.0.4.129 acquired from 10.0.4.129 Mar 12 23:41:14.072441 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 12 23:41:14.073870 systemd[1]: Reloading finished in 275 ms. Mar 12 23:41:14.089371 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 12 23:41:14.090481 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 12 23:41:14.104667 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 23:41:14.126299 systemd[1]: Finished ensure-sysext.service. Mar 12 23:41:14.140096 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 12 23:41:14.164501 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 12 23:41:14.165744 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Mar 12 23:41:14.166753 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 12 23:41:14.168491 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 12 23:41:14.173843 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 12 23:41:14.176385 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 12 23:41:14.179528 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Mar 12 23:41:14.180702 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 12 23:41:14.181620 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 12 23:41:14.182594 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 12 23:41:14.183613 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 12 23:41:14.185446 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 12 23:41:14.189426 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 12 23:41:14.189644 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 12 23:41:14.190653 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 12 23:41:14.194371 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 12 23:41:14.195255 systemd[1]: Reached target time-set.target - System Time Set. Mar 12 23:41:14.196956 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 12 23:41:14.199610 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 12 23:41:14.201427 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 12 23:41:14.201926 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 12 23:41:14.204289 kernel: PTP clock support registered Mar 12 23:41:14.204678 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 12 23:41:14.204846 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 12 23:41:14.205959 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 12 23:41:14.209486 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 12 23:41:14.210929 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 12 23:41:14.211070 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 12 23:41:14.212725 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Mar 12 23:41:14.212896 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Mar 12 23:41:14.215493 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 12 23:41:14.223459 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 12 23:41:14.223614 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 12 23:41:14.224946 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 12 23:41:14.233864 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 12 23:41:14.242020 augenrules[1565]: No rules Mar 12 23:41:14.243407 systemd[1]: audit-rules.service: Deactivated successfully. Mar 12 23:41:14.243626 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Mar 12 23:41:14.268937 systemd-resolved[1540]: Positive Trust Anchors: Mar 12 23:41:14.268957 systemd-resolved[1540]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 12 23:41:14.268987 systemd-resolved[1540]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 12 23:41:14.272881 systemd-resolved[1540]: Using system hostname 'ci-4459-2-4-n-27aefdfc79'. Mar 12 23:41:14.274709 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 12 23:41:14.275906 systemd[1]: Reached target network.target - Network. Mar 12 23:41:14.276699 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 12 23:41:14.282333 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 12 23:41:14.563283 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 23:41:14.584709 ldconfig[1290]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 12 23:41:14.598807 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 12 23:41:14.601196 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 12 23:41:14.618669 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Mar 12 23:41:14.620405 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 12 23:41:14.625411 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 12 23:41:14.626512 systemd[1]: Reached target sysinit.target - System Initialization. Mar 12 23:41:14.627475 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 12 23:41:14.628423 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 12 23:41:14.629677 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 12 23:41:14.630612 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 12 23:41:14.631622 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 12 23:41:14.632580 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 12 23:41:14.632609 systemd[1]: Reached target paths.target - Path Units. Mar 12 23:41:14.633284 systemd[1]: Reached target timers.target - Timer Units. Mar 12 23:41:14.635481 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 12 23:41:14.637537 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 12 23:41:14.640041 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 12 23:41:14.641284 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 12 23:41:14.642252 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 12 23:41:14.648089 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Mar 12 23:41:14.649255 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 12 23:41:14.650695 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 12 23:41:14.651646 systemd[1]: Reached target sockets.target - Socket Units. Mar 12 23:41:14.652400 systemd[1]: Reached target basic.target - Basic System. Mar 12 23:41:14.653125 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 12 23:41:14.653159 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 12 23:41:14.655387 systemd[1]: Starting chronyd.service - NTP client/server... Mar 12 23:41:14.656907 systemd[1]: Starting containerd.service - containerd container runtime... Mar 12 23:41:14.658861 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 12 23:41:14.661422 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 12 23:41:14.663061 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 12 23:41:14.666291 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:14.666709 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 12 23:41:14.668492 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 12 23:41:14.670178 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 12 23:41:14.673489 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 12 23:41:14.675788 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 12 23:41:14.676091 jq[1589]: false Mar 12 23:41:14.679645 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Mar 12 23:41:14.682504 extend-filesystems[1590]: Found /dev/vda6 Mar 12 23:41:14.684464 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 12 23:41:14.687372 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 12 23:41:14.689878 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 12 23:41:14.690259 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 12 23:41:14.691185 extend-filesystems[1590]: Found /dev/vda9 Mar 12 23:41:14.695477 extend-filesystems[1590]: Checking size of /dev/vda9 Mar 12 23:41:14.692330 systemd[1]: Starting update-engine.service - Update Engine... Mar 12 23:41:14.694142 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 12 23:41:14.698647 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 12 23:41:14.704814 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 12 23:41:14.704991 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 12 23:41:14.707419 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 12 23:41:14.707596 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 12 23:41:14.714377 extend-filesystems[1590]: Resized partition /dev/vda9 Mar 12 23:41:14.717196 jq[1607]: true Mar 12 23:41:14.719987 extend-filesystems[1623]: resize2fs 1.47.3 (8-Jul-2025) Mar 12 23:41:14.722035 systemd[1]: motdgen.service: Deactivated successfully. Mar 12 23:41:14.722234 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Mar 12 23:41:14.731366 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Mar 12 23:41:14.738678 (ntainerd)[1626]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 12 23:41:14.748513 jq[1627]: true Mar 12 23:41:14.751857 update_engine[1603]: I20260312 23:41:14.751471 1603 main.cc:92] Flatcar Update Engine starting Mar 12 23:41:14.755281 chronyd[1582]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Mar 12 23:41:14.756391 systemd[1]: Started chronyd.service - NTP client/server. Mar 12 23:41:14.756279 chronyd[1582]: Loaded seccomp filter (level 2) Mar 12 23:41:14.759638 tar[1615]: linux-arm64/LICENSE Mar 12 23:41:14.759834 tar[1615]: linux-arm64/helm Mar 12 23:41:14.785358 dbus-daemon[1585]: [system] SELinux support is enabled Mar 12 23:41:14.785663 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 12 23:41:14.788422 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 12 23:41:14.788455 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 12 23:41:14.789690 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 12 23:41:14.789713 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 12 23:41:14.796200 update_engine[1603]: I20260312 23:41:14.796098 1603 update_check_scheduler.cc:74] Next update check in 3m12s Mar 12 23:41:14.796419 systemd[1]: Started update-engine.service - Update Engine. Mar 12 23:41:14.798524 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Mar 12 23:41:14.801179 systemd-logind[1600]: New seat seat0. Mar 12 23:41:14.874334 locksmithd[1650]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 12 23:41:14.882982 systemd-logind[1600]: Watching system buttons on /dev/input/event0 (Power Button) Mar 12 23:41:14.882999 systemd-logind[1600]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Mar 12 23:41:14.883193 systemd[1]: Started systemd-logind.service - User Login Management. Mar 12 23:41:14.964468 containerd[1626]: time="2026-03-12T23:41:14Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 12 23:41:15.180907 bash[1649]: Updated "/home/core/.ssh/authorized_keys" Mar 12 23:41:15.181255 containerd[1626]: time="2026-03-12T23:41:15.180911600Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 12 23:41:15.184857 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 12 23:41:15.190499 systemd[1]: Starting sshkeys.service... 
Mar 12 23:41:15.192330 containerd[1626]: time="2026-03-12T23:41:15.192026720Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.72µs" Mar 12 23:41:15.192330 containerd[1626]: time="2026-03-12T23:41:15.192058440Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 12 23:41:15.192330 containerd[1626]: time="2026-03-12T23:41:15.192075640Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 12 23:41:15.192330 containerd[1626]: time="2026-03-12T23:41:15.192214480Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 12 23:41:15.192330 containerd[1626]: time="2026-03-12T23:41:15.192229640Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 12 23:41:15.192330 containerd[1626]: time="2026-03-12T23:41:15.192252760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.192582720Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.194323400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.194834840Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.194863920Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.194875160Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.194882560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195398 containerd[1626]: time="2026-03-12T23:41:15.195210880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195749 containerd[1626]: time="2026-03-12T23:41:15.195703680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195783 containerd[1626]: time="2026-03-12T23:41:15.195759240Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 12 23:41:15.195783 containerd[1626]: time="2026-03-12T23:41:15.195770720Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 12 23:41:15.195822 containerd[1626]: time="2026-03-12T23:41:15.195810520Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 12 23:41:15.196083 containerd[1626]: time="2026-03-12T23:41:15.196045600Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 12 23:41:15.196325 containerd[1626]: time="2026-03-12T23:41:15.196288560Z" level=info msg="metadata content store policy set" policy=shared Mar 12 23:41:15.213638 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Mar 12 23:41:15.217008 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 12 23:41:15.231294 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:15.290883 containerd[1626]: time="2026-03-12T23:41:15.290801000Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 12 23:41:15.291052 containerd[1626]: time="2026-03-12T23:41:15.290940600Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 12 23:41:15.291052 containerd[1626]: time="2026-03-12T23:41:15.290972760Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 12 23:41:15.291052 containerd[1626]: time="2026-03-12T23:41:15.290986040Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 12 23:41:15.291128 containerd[1626]: time="2026-03-12T23:41:15.291066640Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 12 23:41:15.291128 containerd[1626]: time="2026-03-12T23:41:15.291089120Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 12 23:41:15.291128 containerd[1626]: time="2026-03-12T23:41:15.291106040Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 12 23:41:15.291128 containerd[1626]: time="2026-03-12T23:41:15.291117880Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 12 23:41:15.291192 containerd[1626]: time="2026-03-12T23:41:15.291148320Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 12 23:41:15.291192 containerd[1626]: time="2026-03-12T23:41:15.291160800Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service 
type=io.containerd.service.v1 Mar 12 23:41:15.291192 containerd[1626]: time="2026-03-12T23:41:15.291170440Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 12 23:41:15.291192 containerd[1626]: time="2026-03-12T23:41:15.291182760Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 12 23:41:15.291438 containerd[1626]: time="2026-03-12T23:41:15.291367880Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 12 23:41:15.291438 containerd[1626]: time="2026-03-12T23:41:15.291396840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 12 23:41:15.291438 containerd[1626]: time="2026-03-12T23:41:15.291411640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 12 23:41:15.291438 containerd[1626]: time="2026-03-12T23:41:15.291438240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 12 23:41:15.291646 containerd[1626]: time="2026-03-12T23:41:15.291451720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 12 23:41:15.291646 containerd[1626]: time="2026-03-12T23:41:15.291462400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 12 23:41:15.291646 containerd[1626]: time="2026-03-12T23:41:15.291473080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 12 23:41:15.291646 containerd[1626]: time="2026-03-12T23:41:15.291484280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 12 23:41:15.291646 containerd[1626]: time="2026-03-12T23:41:15.291495080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 12 23:41:15.291646 containerd[1626]: 
time="2026-03-12T23:41:15.291512920Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 12 23:41:15.291646 containerd[1626]: time="2026-03-12T23:41:15.291524200Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 12 23:41:15.291772 containerd[1626]: time="2026-03-12T23:41:15.291750280Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 12 23:41:15.291772 containerd[1626]: time="2026-03-12T23:41:15.291765280Z" level=info msg="Start snapshots syncer" Mar 12 23:41:15.291809 containerd[1626]: time="2026-03-12T23:41:15.291793400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 12 23:41:15.293924 containerd[1626]: time="2026-03-12T23:41:15.293837840Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\
",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 12 23:41:15.293924 containerd[1626]: time="2026-03-12T23:41:15.293906920Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 12 23:41:15.294123 containerd[1626]: time="2026-03-12T23:41:15.293971920Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 12 23:41:15.294123 containerd[1626]: time="2026-03-12T23:41:15.294115040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 12 23:41:15.294157 containerd[1626]: time="2026-03-12T23:41:15.294142320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 12 23:41:15.294174 containerd[1626]: time="2026-03-12T23:41:15.294156880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 12 23:41:15.294191 containerd[1626]: time="2026-03-12T23:41:15.294170000Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 12 23:41:15.294208 containerd[1626]: time="2026-03-12T23:41:15.294185680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 12 23:41:15.294231 containerd[1626]: time="2026-03-12T23:41:15.294207840Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 12 23:41:15.294231 containerd[1626]: time="2026-03-12T23:41:15.294221080Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 12 23:41:15.294263 containerd[1626]: time="2026-03-12T23:41:15.294252920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 12 23:41:15.294299 containerd[1626]: time="2026-03-12T23:41:15.294290400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 12 23:41:15.294317 containerd[1626]: time="2026-03-12T23:41:15.294304680Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 12 23:41:15.294388 containerd[1626]: time="2026-03-12T23:41:15.294346760Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 12 23:41:15.294388 containerd[1626]: time="2026-03-12T23:41:15.294374320Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 12 23:41:15.294505 containerd[1626]: time="2026-03-12T23:41:15.294387960Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 12 23:41:15.294505 containerd[1626]: time="2026-03-12T23:41:15.294398560Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 12 23:41:15.294505 containerd[1626]: time="2026-03-12T23:41:15.294411240Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 12 23:41:15.294505 containerd[1626]: time="2026-03-12T23:41:15.294423560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck 
type=io.containerd.grpc.v1 Mar 12 23:41:15.294505 containerd[1626]: time="2026-03-12T23:41:15.294436960Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 12 23:41:15.294600 containerd[1626]: time="2026-03-12T23:41:15.294535960Z" level=info msg="runtime interface created" Mar 12 23:41:15.294600 containerd[1626]: time="2026-03-12T23:41:15.294543040Z" level=info msg="created NRI interface" Mar 12 23:41:15.294600 containerd[1626]: time="2026-03-12T23:41:15.294555360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 12 23:41:15.294600 containerd[1626]: time="2026-03-12T23:41:15.294569040Z" level=info msg="Connect containerd service" Mar 12 23:41:15.294600 containerd[1626]: time="2026-03-12T23:41:15.294597680Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 12 23:41:15.295782 containerd[1626]: time="2026-03-12T23:41:15.295743000Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 12 23:41:15.376304 containerd[1626]: time="2026-03-12T23:41:15.376141200Z" level=info msg="Start subscribing containerd event" Mar 12 23:41:15.376304 containerd[1626]: time="2026-03-12T23:41:15.376201320Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 12 23:41:15.376304 containerd[1626]: time="2026-03-12T23:41:15.376220240Z" level=info msg="Start recovering state" Mar 12 23:41:15.376304 containerd[1626]: time="2026-03-12T23:41:15.376243960Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 12 23:41:15.376530 containerd[1626]: time="2026-03-12T23:41:15.376514440Z" level=info msg="Start event monitor" Mar 12 23:41:15.376582 containerd[1626]: time="2026-03-12T23:41:15.376571520Z" level=info msg="Start cni network conf syncer for default" Mar 12 23:41:15.376624 containerd[1626]: time="2026-03-12T23:41:15.376614880Z" level=info msg="Start streaming server" Mar 12 23:41:15.376670 containerd[1626]: time="2026-03-12T23:41:15.376659920Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 12 23:41:15.376711 containerd[1626]: time="2026-03-12T23:41:15.376700440Z" level=info msg="runtime interface starting up..." Mar 12 23:41:15.376758 containerd[1626]: time="2026-03-12T23:41:15.376747840Z" level=info msg="starting plugins..." Mar 12 23:41:15.376807 containerd[1626]: time="2026-03-12T23:41:15.376797160Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 12 23:41:15.376979 containerd[1626]: time="2026-03-12T23:41:15.376963320Z" level=info msg="containerd successfully booted in 0.412965s" Mar 12 23:41:15.377063 systemd[1]: Started containerd.service - containerd container runtime. Mar 12 23:41:15.441449 systemd-networkd[1428]: eth0: Gained IPv6LL Mar 12 23:41:15.444260 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 12 23:41:15.445819 systemd[1]: Reached target network-online.target - Network is Online. Mar 12 23:41:15.448344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:41:15.451519 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 12 23:41:15.488750 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 12 23:41:15.512794 tar[1615]: linux-arm64/README.md Mar 12 23:41:15.532101 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Mar 12 23:41:15.681358 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:15.854158 sshd_keygen[1613]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 12 23:41:15.874389 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 12 23:41:15.879153 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 12 23:41:15.896258 systemd[1]: issuegen.service: Deactivated successfully. Mar 12 23:41:15.896470 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 12 23:41:15.899408 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 12 23:41:15.918352 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Mar 12 23:41:15.928324 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 12 23:41:15.930852 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 12 23:41:15.933066 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 12 23:41:15.934451 systemd[1]: Reached target getty.target - Login Prompts. Mar 12 23:41:16.096929 extend-filesystems[1623]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 12 23:41:16.096929 extend-filesystems[1623]: old_desc_blocks = 1, new_desc_blocks = 6 Mar 12 23:41:16.096929 extend-filesystems[1623]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Mar 12 23:41:16.102401 extend-filesystems[1590]: Resized filesystem in /dev/vda9 Mar 12 23:41:16.098507 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 12 23:41:16.099641 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 12 23:41:16.242302 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:16.686882 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 12 23:41:16.690852 (kubelet)[1722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:41:17.206735 kubelet[1722]: E0312 23:41:17.206673 1722 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:41:17.209181 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:41:17.209342 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:41:17.210413 systemd[1]: kubelet.service: Consumed 740ms CPU time, 258.9M memory peak. Mar 12 23:41:17.692293 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:18.253320 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:21.699509 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:21.706526 coreos-metadata[1584]: Mar 12 23:41:21.706 WARN failed to locate config-drive, using the metadata service API instead Mar 12 23:41:21.722970 coreos-metadata[1584]: Mar 12 23:41:21.722 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 12 23:41:22.269310 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 12 23:41:22.276421 coreos-metadata[1664]: Mar 12 23:41:22.276 WARN failed to locate config-drive, using the metadata service API instead Mar 12 23:41:22.289158 coreos-metadata[1664]: Mar 12 23:41:22.289 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 12 23:41:23.375653 coreos-metadata[1664]: Mar 12 23:41:23.375 INFO Fetch successful Mar 12 23:41:23.375653 coreos-metadata[1664]: Mar 12 23:41:23.375 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 12 
23:41:24.060480 coreos-metadata[1584]: Mar 12 23:41:24.060 INFO Fetch successful Mar 12 23:41:24.060858 coreos-metadata[1584]: Mar 12 23:41:24.060 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 12 23:41:24.801150 coreos-metadata[1664]: Mar 12 23:41:24.801 INFO Fetch successful Mar 12 23:41:24.803002 unknown[1664]: wrote ssh authorized keys file for user: core Mar 12 23:41:24.811430 coreos-metadata[1584]: Mar 12 23:41:24.811 INFO Fetch successful Mar 12 23:41:24.811430 coreos-metadata[1584]: Mar 12 23:41:24.811 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 12 23:41:24.827594 update-ssh-keys[1741]: Updated "/home/core/.ssh/authorized_keys" Mar 12 23:41:24.829382 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 12 23:41:24.831090 systemd[1]: Finished sshkeys.service. Mar 12 23:41:26.227004 coreos-metadata[1584]: Mar 12 23:41:26.226 INFO Fetch successful Mar 12 23:41:26.227004 coreos-metadata[1584]: Mar 12 23:41:26.226 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 12 23:41:26.970050 coreos-metadata[1584]: Mar 12 23:41:26.969 INFO Fetch successful Mar 12 23:41:26.970050 coreos-metadata[1584]: Mar 12 23:41:26.970 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 12 23:41:27.460148 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 12 23:41:27.461624 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:41:27.609548 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 12 23:41:27.613041 (kubelet)[1752]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:41:27.641565 kubelet[1752]: E0312 23:41:27.641511 1752 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:41:27.644507 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:41:27.644637 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:41:27.646351 systemd[1]: kubelet.service: Consumed 133ms CPU time, 105.4M memory peak. Mar 12 23:41:27.711713 coreos-metadata[1584]: Mar 12 23:41:27.711 INFO Fetch successful Mar 12 23:41:27.711713 coreos-metadata[1584]: Mar 12 23:41:27.711 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 12 23:41:28.453170 coreos-metadata[1584]: Mar 12 23:41:28.453 INFO Fetch successful Mar 12 23:41:28.485348 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 12 23:41:28.486429 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 12 23:41:28.487381 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 12 23:41:28.491789 systemd[1]: Startup finished in 3.276s (kernel) + 20.618s (initrd) + 16.321s (userspace) = 40.217s. Mar 12 23:41:37.811041 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 12 23:41:37.812833 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:41:37.936481 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 12 23:41:37.940610 (kubelet)[1773]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:41:38.292924 kubelet[1773]: E0312 23:41:38.292825 1773 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:41:38.295319 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:41:38.295448 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 23:41:38.297352 systemd[1]: kubelet.service: Consumed 427ms CPU time, 107.5M memory peak. Mar 12 23:41:38.570871 chronyd[1582]: Selected source PHC0 Mar 12 23:41:48.310972 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 12 23:41:48.312405 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:41:48.468062 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:41:48.471542 (kubelet)[1790]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:41:48.836431 kubelet[1790]: E0312 23:41:48.836370 1790 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:41:48.838693 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:41:48.838827 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 12 23:41:48.839196 systemd[1]: kubelet.service: Consumed 402ms CPU time, 107.3M memory peak. Mar 12 23:41:52.379384 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 12 23:41:52.380699 systemd[1]: Started sshd@0-10.0.4.241:22-20.161.92.111:49250.service - OpenSSH per-connection server daemon (20.161.92.111:49250). Mar 12 23:41:52.910718 sshd[1800]: Accepted publickey for core from 20.161.92.111 port 49250 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:52.912569 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:52.926225 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 12 23:41:52.926925 systemd-logind[1600]: New session 1 of user core. Mar 12 23:41:52.928115 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 12 23:41:52.952335 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 12 23:41:52.954509 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 12 23:41:52.972979 (systemd)[1805]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 12 23:41:52.975410 systemd-logind[1600]: New session c1 of user core. Mar 12 23:41:53.097303 systemd[1805]: Queued start job for default target default.target. Mar 12 23:41:53.121587 systemd[1805]: Created slice app.slice - User Application Slice. Mar 12 23:41:53.121619 systemd[1805]: Reached target paths.target - Paths. Mar 12 23:41:53.121658 systemd[1805]: Reached target timers.target - Timers. Mar 12 23:41:53.122869 systemd[1805]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 12 23:41:53.132174 systemd[1805]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 12 23:41:53.132239 systemd[1805]: Reached target sockets.target - Sockets. Mar 12 23:41:53.132294 systemd[1805]: Reached target basic.target - Basic System. 
Mar 12 23:41:53.132323 systemd[1805]: Reached target default.target - Main User Target. Mar 12 23:41:53.132348 systemd[1805]: Startup finished in 151ms. Mar 12 23:41:53.132840 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 12 23:41:53.134530 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 12 23:41:53.429342 systemd[1]: Started sshd@1-10.0.4.241:22-20.161.92.111:49262.service - OpenSSH per-connection server daemon (20.161.92.111:49262). Mar 12 23:41:53.943551 sshd[1816]: Accepted publickey for core from 20.161.92.111 port 49262 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:53.944804 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:53.948464 systemd-logind[1600]: New session 2 of user core. Mar 12 23:41:53.964613 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 12 23:41:54.231238 sshd[1819]: Connection closed by 20.161.92.111 port 49262 Mar 12 23:41:54.231633 sshd-session[1816]: pam_unix(sshd:session): session closed for user core Mar 12 23:41:54.235531 systemd[1]: sshd@1-10.0.4.241:22-20.161.92.111:49262.service: Deactivated successfully. Mar 12 23:41:54.237402 systemd[1]: session-2.scope: Deactivated successfully. Mar 12 23:41:54.240388 systemd-logind[1600]: Session 2 logged out. Waiting for processes to exit. Mar 12 23:41:54.241324 systemd-logind[1600]: Removed session 2. Mar 12 23:41:54.340107 systemd[1]: Started sshd@2-10.0.4.241:22-20.161.92.111:49270.service - OpenSSH per-connection server daemon (20.161.92.111:49270). Mar 12 23:41:54.851160 sshd[1825]: Accepted publickey for core from 20.161.92.111 port 49270 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:54.852438 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:54.856717 systemd-logind[1600]: New session 3 of user core. 
Mar 12 23:41:54.864619 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 12 23:41:55.132624 sshd[1828]: Connection closed by 20.161.92.111 port 49270 Mar 12 23:41:55.132734 sshd-session[1825]: pam_unix(sshd:session): session closed for user core Mar 12 23:41:55.135690 systemd[1]: sshd@2-10.0.4.241:22-20.161.92.111:49270.service: Deactivated successfully. Mar 12 23:41:55.137785 systemd[1]: session-3.scope: Deactivated successfully. Mar 12 23:41:55.140397 systemd-logind[1600]: Session 3 logged out. Waiting for processes to exit. Mar 12 23:41:55.141685 systemd-logind[1600]: Removed session 3. Mar 12 23:41:55.235738 systemd[1]: Started sshd@3-10.0.4.241:22-20.161.92.111:49284.service - OpenSSH per-connection server daemon (20.161.92.111:49284). Mar 12 23:41:55.762414 sshd[1834]: Accepted publickey for core from 20.161.92.111 port 49284 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:55.764207 sshd-session[1834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:55.767959 systemd-logind[1600]: New session 4 of user core. Mar 12 23:41:55.780595 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 12 23:41:56.047750 sshd[1837]: Connection closed by 20.161.92.111 port 49284 Mar 12 23:41:56.048512 sshd-session[1834]: pam_unix(sshd:session): session closed for user core Mar 12 23:41:56.051841 systemd[1]: sshd@3-10.0.4.241:22-20.161.92.111:49284.service: Deactivated successfully. Mar 12 23:41:56.054581 systemd[1]: session-4.scope: Deactivated successfully. Mar 12 23:41:56.055203 systemd-logind[1600]: Session 4 logged out. Waiting for processes to exit. Mar 12 23:41:56.056169 systemd-logind[1600]: Removed session 4. Mar 12 23:41:56.151586 systemd[1]: Started sshd@4-10.0.4.241:22-20.161.92.111:49296.service - OpenSSH per-connection server daemon (20.161.92.111:49296). 
Mar 12 23:41:56.670784 sshd[1843]: Accepted publickey for core from 20.161.92.111 port 49296 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:56.672538 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:56.676070 systemd-logind[1600]: New session 5 of user core. Mar 12 23:41:56.685498 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 12 23:41:56.874302 sudo[1847]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 12 23:41:56.874554 sudo[1847]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:41:56.886575 sudo[1847]: pam_unix(sudo:session): session closed for user root Mar 12 23:41:56.980568 sshd[1846]: Connection closed by 20.161.92.111 port 49296 Mar 12 23:41:56.981070 sshd-session[1843]: pam_unix(sshd:session): session closed for user core Mar 12 23:41:56.985230 systemd[1]: sshd@4-10.0.4.241:22-20.161.92.111:49296.service: Deactivated successfully. Mar 12 23:41:56.988805 systemd[1]: session-5.scope: Deactivated successfully. Mar 12 23:41:56.989602 systemd-logind[1600]: Session 5 logged out. Waiting for processes to exit. Mar 12 23:41:56.990905 systemd-logind[1600]: Removed session 5. Mar 12 23:41:57.095713 systemd[1]: Started sshd@5-10.0.4.241:22-20.161.92.111:49298.service - OpenSSH per-connection server daemon (20.161.92.111:49298). Mar 12 23:41:57.626070 sshd[1853]: Accepted publickey for core from 20.161.92.111 port 49298 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:57.629419 sshd-session[1853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:57.633333 systemd-logind[1600]: New session 6 of user core. Mar 12 23:41:57.645555 systemd[1]: Started session-6.scope - Session 6 of User core. 
Mar 12 23:41:57.818223 sudo[1858]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 12 23:41:57.818558 sudo[1858]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:41:57.822490 sudo[1858]: pam_unix(sudo:session): session closed for user root Mar 12 23:41:57.827141 sudo[1857]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 12 23:41:57.827415 sudo[1857]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:41:57.835403 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 12 23:41:57.867833 augenrules[1880]: No rules Mar 12 23:41:57.868923 systemd[1]: audit-rules.service: Deactivated successfully. Mar 12 23:41:57.869182 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 12 23:41:57.870668 sudo[1857]: pam_unix(sudo:session): session closed for user root Mar 12 23:41:57.964542 sshd[1856]: Connection closed by 20.161.92.111 port 49298 Mar 12 23:41:57.965316 sshd-session[1853]: pam_unix(sshd:session): session closed for user core Mar 12 23:41:57.969083 systemd[1]: sshd@5-10.0.4.241:22-20.161.92.111:49298.service: Deactivated successfully. Mar 12 23:41:57.970599 systemd[1]: session-6.scope: Deactivated successfully. Mar 12 23:41:57.971811 systemd-logind[1600]: Session 6 logged out. Waiting for processes to exit. Mar 12 23:41:57.973022 systemd-logind[1600]: Removed session 6. Mar 12 23:41:58.068559 systemd[1]: Started sshd@6-10.0.4.241:22-20.161.92.111:49308.service - OpenSSH per-connection server daemon (20.161.92.111:49308). 
Mar 12 23:41:58.580307 sshd[1889]: Accepted publickey for core from 20.161.92.111 port 49308 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU Mar 12 23:41:58.581510 sshd-session[1889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 23:41:58.585214 systemd-logind[1600]: New session 7 of user core. Mar 12 23:41:58.595463 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 12 23:41:58.771929 sudo[1893]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 12 23:41:58.772197 sudo[1893]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 23:41:59.060520 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 12 23:41:59.062128 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 23:41:59.082505 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 12 23:41:59.091960 (dockerd)[1916]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 12 23:41:59.536060 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 23:41:59.540951 (kubelet)[1927]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 23:41:59.657735 kubelet[1927]: E0312 23:41:59.657665 1927 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 23:41:59.660151 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 23:41:59.660312 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 12 23:41:59.660686 systemd[1]: kubelet.service: Consumed 143ms CPU time, 106.6M memory peak. Mar 12 23:41:59.762031 dockerd[1916]: time="2026-03-12T23:41:59.761953789Z" level=info msg="Starting up" Mar 12 23:41:59.763032 dockerd[1916]: time="2026-03-12T23:41:59.762842553Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 12 23:41:59.773497 dockerd[1916]: time="2026-03-12T23:41:59.773456804Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 12 23:41:59.817075 dockerd[1916]: time="2026-03-12T23:41:59.816739455Z" level=info msg="Loading containers: start." Mar 12 23:41:59.828301 kernel: Initializing XFRM netlink socket Mar 12 23:42:00.039367 systemd-networkd[1428]: docker0: Link UP Mar 12 23:42:00.043442 dockerd[1916]: time="2026-03-12T23:42:00.043390275Z" level=info msg="Loading containers: done." Mar 12 23:42:00.058563 dockerd[1916]: time="2026-03-12T23:42:00.058208907Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 12 23:42:00.058563 dockerd[1916]: time="2026-03-12T23:42:00.058315868Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 12 23:42:00.058563 dockerd[1916]: time="2026-03-12T23:42:00.058401548Z" level=info msg="Initializing buildkit" Mar 12 23:42:00.081716 dockerd[1916]: time="2026-03-12T23:42:00.081618261Z" level=info msg="Completed buildkit initialization" Mar 12 23:42:00.086826 dockerd[1916]: time="2026-03-12T23:42:00.086786006Z" level=info msg="Daemon has completed initialization" Mar 12 23:42:00.086969 dockerd[1916]: time="2026-03-12T23:42:00.086838246Z" level=info msg="API listen on /run/docker.sock" Mar 12 23:42:00.087156 systemd[1]: Started docker.service - Docker Application Container Engine. 
Mar 12 23:42:00.274986 update_engine[1603]: I20260312 23:42:00.274793 1603 update_attempter.cc:509] Updating boot flags...
Mar 12 23:42:00.688239 containerd[1626]: time="2026-03-12T23:42:00.688199966Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\""
Mar 12 23:42:01.426312 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3569045934.mount: Deactivated successfully.
Mar 12 23:42:02.345117 containerd[1626]: time="2026-03-12T23:42:02.344390288Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:02.346300 containerd[1626]: time="2026-03-12T23:42:02.346231697Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=27390272"
Mar 12 23:42:02.347524 containerd[1626]: time="2026-03-12T23:42:02.347493783Z" level=info msg="ImageCreate event name:\"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:02.350360 containerd[1626]: time="2026-03-12T23:42:02.350321477Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:02.351342 containerd[1626]: time="2026-03-12T23:42:02.351313001Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"27386773\" in 1.663071795s"
Mar 12 23:42:02.351387 containerd[1626]: time="2026-03-12T23:42:02.351351362Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:6dbc3c6e88c8bca1294fa5fafe73dbe01fb58d40e562dbfc8b8b4195940270c8\""
Mar 12 23:42:02.351860 containerd[1626]: time="2026-03-12T23:42:02.351832004Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\""
Mar 12 23:42:03.735380 containerd[1626]: time="2026-03-12T23:42:03.735295481Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:03.736804 containerd[1626]: time="2026-03-12T23:42:03.736748328Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=23552126"
Mar 12 23:42:03.737811 containerd[1626]: time="2026-03-12T23:42:03.737764773Z" level=info msg="ImageCreate event name:\"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:03.740793 containerd[1626]: time="2026-03-12T23:42:03.740344986Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:03.741376 containerd[1626]: time="2026-03-12T23:42:03.741342591Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"25136510\" in 1.389476507s"
Mar 12 23:42:03.741376 containerd[1626]: time="2026-03-12T23:42:03.741375311Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:c58be92c40cc41b6c83c361b92110b587104386f93c5b7a9fc66dffdd1523d17\""
Mar 12 23:42:03.741855 containerd[1626]: time="2026-03-12T23:42:03.741824433Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 12 23:42:04.871445 containerd[1626]: time="2026-03-12T23:42:04.871397878Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:04.873595 containerd[1626]: time="2026-03-12T23:42:04.873526128Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=18301325"
Mar 12 23:42:04.874314 containerd[1626]: time="2026-03-12T23:42:04.874284852Z" level=info msg="ImageCreate event name:\"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:04.877748 containerd[1626]: time="2026-03-12T23:42:04.877718188Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:04.878680 containerd[1626]: time="2026-03-12T23:42:04.878653793Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"19885727\" in 1.13679232s"
Mar 12 23:42:04.878733 containerd[1626]: time="2026-03-12T23:42:04.878685633Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:5dcd4a0c93d95bd92241ba240a130ffbde67814e2b417a13c25738a7b0204e95\""
Mar 12 23:42:04.879086 containerd[1626]: time="2026-03-12T23:42:04.879053035Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 12 23:42:05.921661 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3089470958.mount: Deactivated successfully.
Mar 12 23:42:06.164834 containerd[1626]: time="2026-03-12T23:42:06.164762798Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:06.166047 containerd[1626]: time="2026-03-12T23:42:06.165998604Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=28148896"
Mar 12 23:42:06.167109 containerd[1626]: time="2026-03-12T23:42:06.167057329Z" level=info msg="ImageCreate event name:\"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:06.169681 containerd[1626]: time="2026-03-12T23:42:06.169638981Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:06.170233 containerd[1626]: time="2026-03-12T23:42:06.170194784Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"28147889\" in 1.291113989s"
Mar 12 23:42:06.170233 containerd[1626]: time="2026-03-12T23:42:06.170225424Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:fb4f3cb8cccaec5975890c2ee802236a557e3f108da9c3c66ebec335ac73dcc9\""
Mar 12 23:42:06.171017 containerd[1626]: time="2026-03-12T23:42:06.170919388Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 12 23:42:06.815549 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2146100616.mount: Deactivated successfully.
Mar 12 23:42:07.684796 containerd[1626]: time="2026-03-12T23:42:07.684745738Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:07.685575 containerd[1626]: time="2026-03-12T23:42:07.685541742Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209"
Mar 12 23:42:07.687301 containerd[1626]: time="2026-03-12T23:42:07.686934709Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:07.690459 containerd[1626]: time="2026-03-12T23:42:07.690424125Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:07.691526 containerd[1626]: time="2026-03-12T23:42:07.691499571Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.520532023s"
Mar 12 23:42:07.691526 containerd[1626]: time="2026-03-12T23:42:07.691530651Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Mar 12 23:42:07.692052 containerd[1626]: time="2026-03-12T23:42:07.691985573Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 12 23:42:08.244456 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount233480213.mount: Deactivated successfully.
Mar 12 23:42:08.249507 containerd[1626]: time="2026-03-12T23:42:08.249434440Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:08.250550 containerd[1626]: time="2026-03-12T23:42:08.250510445Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Mar 12 23:42:08.251271 containerd[1626]: time="2026-03-12T23:42:08.251226248Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:08.254236 containerd[1626]: time="2026-03-12T23:42:08.254188983Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 23:42:08.254689 containerd[1626]: time="2026-03-12T23:42:08.254630145Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 562.542891ms"
Mar 12 23:42:08.254689 containerd[1626]: time="2026-03-12T23:42:08.254660785Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Mar 12 23:42:08.255300 containerd[1626]: time="2026-03-12T23:42:08.255259308Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 12 23:42:08.818090 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3397938414.mount: Deactivated successfully.
Mar 12 23:42:09.443295 containerd[1626]: time="2026-03-12T23:42:09.443203956Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:09.445004 containerd[1626]: time="2026-03-12T23:42:09.444970805Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=21885878"
Mar 12 23:42:09.445745 containerd[1626]: time="2026-03-12T23:42:09.445721208Z" level=info msg="ImageCreate event name:\"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:09.449572 containerd[1626]: time="2026-03-12T23:42:09.449522067Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:09.450581 containerd[1626]: time="2026-03-12T23:42:09.450549272Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"21882972\" in 1.195243844s"
Mar 12 23:42:09.450629 containerd[1626]: time="2026-03-12T23:42:09.450581352Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:1211402d28f5813ed906916bfcdd0a7404c2f9048ef5bb54387a6745bc410eca\""
Mar 12 23:42:09.810886 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 12 23:42:09.812210 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:42:09.953158 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:42:09.965925 (kubelet)[2382]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 23:42:10.002053 kubelet[2382]: E0312 23:42:10.001997 2382 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 23:42:10.004133 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 23:42:10.004256 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 23:42:10.006360 systemd[1]: kubelet.service: Consumed 141ms CPU time, 105.5M memory peak.
Mar 12 23:42:13.444810 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:42:13.444964 systemd[1]: kubelet.service: Consumed 141ms CPU time, 105.5M memory peak.
Mar 12 23:42:13.446940 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:42:13.468549 systemd[1]: Reload requested from client PID 2398 ('systemctl') (unit session-7.scope)...
Mar 12 23:42:13.468677 systemd[1]: Reloading...
Mar 12 23:42:13.555338 zram_generator::config[2441]: No configuration found.
Mar 12 23:42:13.713506 systemd[1]: Reloading finished in 244 ms.
Mar 12 23:42:13.775155 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 12 23:42:13.775242 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 12 23:42:13.775535 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:42:13.775585 systemd[1]: kubelet.service: Consumed 93ms CPU time, 95M memory peak.
Mar 12 23:42:13.777083 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:42:14.733117 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:42:14.750877 (kubelet)[2488]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 23:42:14.781020 kubelet[2488]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:42:14.781020 kubelet[2488]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 23:42:14.781020 kubelet[2488]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:42:14.781363 kubelet[2488]: I0312 23:42:14.781044 2488 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 23:42:15.007366 kubelet[2488]: I0312 23:42:15.006670 2488 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 12 23:42:15.007366 kubelet[2488]: I0312 23:42:15.006701 2488 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 23:42:15.007366 kubelet[2488]: I0312 23:42:15.006903 2488 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 23:42:15.038777 kubelet[2488]: I0312 23:42:15.038726 2488 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 23:42:15.039036 kubelet[2488]: E0312 23:42:15.039006 2488 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.4.241:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.4.241:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 23:42:15.048588 kubelet[2488]: I0312 23:42:15.048558 2488 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 23:42:15.052101 kubelet[2488]: I0312 23:42:15.052076 2488 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 12 23:42:15.053422 kubelet[2488]: I0312 23:42:15.053360 2488 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 23:42:15.053648 kubelet[2488]: I0312 23:42:15.053409 2488 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-27aefdfc79","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 23:42:15.053648 kubelet[2488]: I0312 23:42:15.053635 2488 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 23:42:15.053648 kubelet[2488]: I0312 23:42:15.053644 2488 container_manager_linux.go:303] "Creating device plugin manager"
Mar 12 23:42:15.054662 kubelet[2488]: I0312 23:42:15.054620 2488 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:42:15.059552 kubelet[2488]: I0312 23:42:15.059515 2488 kubelet.go:480] "Attempting to sync node with API server"
Mar 12 23:42:15.059552 kubelet[2488]: I0312 23:42:15.059550 2488 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 23:42:15.059626 kubelet[2488]: I0312 23:42:15.059575 2488 kubelet.go:386] "Adding apiserver pod source"
Mar 12 23:42:15.061757 kubelet[2488]: I0312 23:42:15.061714 2488 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 23:42:15.064078 kubelet[2488]: E0312 23:42:15.064036 2488 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.4.241:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-27aefdfc79&limit=500&resourceVersion=0\": dial tcp 10.0.4.241:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 23:42:15.065692 kubelet[2488]: E0312 23:42:15.065658 2488 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.4.241:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.4.241:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 23:42:15.066225 kubelet[2488]: I0312 23:42:15.066188 2488 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 12 23:42:15.066888 kubelet[2488]: I0312 23:42:15.066858 2488 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 23:42:15.067027 kubelet[2488]: W0312 23:42:15.067007 2488 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 12 23:42:15.069397 kubelet[2488]: I0312 23:42:15.069322 2488 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 12 23:42:15.069397 kubelet[2488]: I0312 23:42:15.069373 2488 server.go:1289] "Started kubelet"
Mar 12 23:42:15.070810 kubelet[2488]: I0312 23:42:15.070761 2488 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 23:42:15.073294 kubelet[2488]: I0312 23:42:15.072542 2488 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 23:42:15.073294 kubelet[2488]: I0312 23:42:15.072880 2488 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 23:42:15.073294 kubelet[2488]: I0312 23:42:15.072943 2488 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 23:42:15.073465 kubelet[2488]: I0312 23:42:15.073431 2488 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 23:42:15.073839 kubelet[2488]: I0312 23:42:15.073814 2488 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 23:42:15.082424 kubelet[2488]: E0312 23:42:15.074446 2488 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.4.241:6443/api/v1/namespaces/default/events\": dial tcp 10.0.4.241:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-27aefdfc79.189c3c8e4812b722 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-27aefdfc79,UID:ci-4459-2-4-n-27aefdfc79,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-27aefdfc79,},FirstTimestamp:2026-03-12 23:42:15.069341474 +0000 UTC m=+0.315147572,LastTimestamp:2026-03-12 23:42:15.069341474 +0000 UTC m=+0.315147572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-27aefdfc79,}"
Mar 12 23:42:15.082826 kubelet[2488]: E0312 23:42:15.082789 2488 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 23:42:15.083290 kubelet[2488]: E0312 23:42:15.083249 2488 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-27aefdfc79\" not found"
Mar 12 23:42:15.083328 kubelet[2488]: I0312 23:42:15.083321 2488 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 23:42:15.083621 kubelet[2488]: I0312 23:42:15.083580 2488 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 23:42:15.083687 kubelet[2488]: I0312 23:42:15.083660 2488 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 23:42:15.084549 kubelet[2488]: E0312 23:42:15.084512 2488 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.4.241:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-27aefdfc79?timeout=10s\": dial tcp 10.0.4.241:6443: connect: connection refused" interval="200ms"
Mar 12 23:42:15.084760 kubelet[2488]: I0312 23:42:15.084735 2488 factory.go:223] Registration of the systemd container factory successfully
Mar 12 23:42:15.084833 kubelet[2488]: I0312 23:42:15.084812 2488 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 23:42:15.085293 kubelet[2488]: E0312 23:42:15.085243 2488 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.4.241:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.4.241:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 23:42:15.086564 kubelet[2488]: I0312 23:42:15.086368 2488 factory.go:223] Registration of the containerd container factory successfully
Mar 12 23:42:15.098456 kubelet[2488]: I0312 23:42:15.098434 2488 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 23:42:15.098576 kubelet[2488]: I0312 23:42:15.098564 2488 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 23:42:15.098628 kubelet[2488]: I0312 23:42:15.098620 2488 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:42:15.098931 kubelet[2488]: I0312 23:42:15.098902 2488 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 12 23:42:15.100292 kubelet[2488]: I0312 23:42:15.100116 2488 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 12 23:42:15.100292 kubelet[2488]: I0312 23:42:15.100148 2488 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 12 23:42:15.100292 kubelet[2488]: I0312 23:42:15.100166 2488 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 23:42:15.100292 kubelet[2488]: I0312 23:42:15.100174 2488 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 12 23:42:15.100292 kubelet[2488]: E0312 23:42:15.100215 2488 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 23:42:15.102454 kubelet[2488]: I0312 23:42:15.102433 2488 policy_none.go:49] "None policy: Start"
Mar 12 23:42:15.102454 kubelet[2488]: I0312 23:42:15.102455 2488 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 12 23:42:15.102550 kubelet[2488]: I0312 23:42:15.102468 2488 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 23:42:15.103681 kubelet[2488]: E0312 23:42:15.103644 2488 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.4.241:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.4.241:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 23:42:15.107237 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 12 23:42:15.121079 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 12 23:42:15.124288 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 12 23:42:15.135294 kubelet[2488]: E0312 23:42:15.135225 2488 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 23:42:15.135486 kubelet[2488]: I0312 23:42:15.135446 2488 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 23:42:15.135486 kubelet[2488]: I0312 23:42:15.135466 2488 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 23:42:15.135708 kubelet[2488]: I0312 23:42:15.135675 2488 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 23:42:15.136541 kubelet[2488]: E0312 23:42:15.136514 2488 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 12 23:42:15.136600 kubelet[2488]: E0312 23:42:15.136550 2488 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-27aefdfc79\" not found"
Mar 12 23:42:15.210735 systemd[1]: Created slice kubepods-burstable-pod3dadde19eb176ad149b114910369edcb.slice - libcontainer container kubepods-burstable-pod3dadde19eb176ad149b114910369edcb.slice.
Mar 12 23:42:15.222117 kubelet[2488]: E0312 23:42:15.222061 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.224647 systemd[1]: Created slice kubepods-burstable-pod615c97b4dc9204c05f0d693b702b0e0f.slice - libcontainer container kubepods-burstable-pod615c97b4dc9204c05f0d693b702b0e0f.slice.
Mar 12 23:42:15.236746 kubelet[2488]: E0312 23:42:15.236519 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.237669 kubelet[2488]: I0312 23:42:15.237627 2488 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.238686 systemd[1]: Created slice kubepods-burstable-pod259ce4b96e2ad7916ac0952ad444a9b0.slice - libcontainer container kubepods-burstable-pod259ce4b96e2ad7916ac0952ad444a9b0.slice.
Mar 12 23:42:15.238971 kubelet[2488]: E0312 23:42:15.238941 2488 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.4.241:6443/api/v1/nodes\": dial tcp 10.0.4.241:6443: connect: connection refused" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.240178 kubelet[2488]: E0312 23:42:15.240143 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.284511 kubelet[2488]: I0312 23:42:15.284397 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/259ce4b96e2ad7916ac0952ad444a9b0-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-27aefdfc79\" (UID: \"259ce4b96e2ad7916ac0952ad444a9b0\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.285649 kubelet[2488]: E0312 23:42:15.285597 2488 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.4.241:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-27aefdfc79?timeout=10s\": dial tcp 10.0.4.241:6443: connect: connection refused" interval="400ms"
Mar 12 23:42:15.385547 kubelet[2488]: I0312 23:42:15.385467 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3dadde19eb176ad149b114910369edcb-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" (UID: \"3dadde19eb176ad149b114910369edcb\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385547 kubelet[2488]: I0312 23:42:15.385520 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3dadde19eb176ad149b114910369edcb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" (UID: \"3dadde19eb176ad149b114910369edcb\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385699 kubelet[2488]: I0312 23:42:15.385568 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385699 kubelet[2488]: I0312 23:42:15.385637 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385699 kubelet[2488]: I0312 23:42:15.385653 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3dadde19eb176ad149b114910369edcb-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" (UID: \"3dadde19eb176ad149b114910369edcb\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385699 kubelet[2488]: I0312 23:42:15.385692 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385777 kubelet[2488]: I0312 23:42:15.385707 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.385777 kubelet[2488]: I0312 23:42:15.385725 2488 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.441293 kubelet[2488]: I0312 23:42:15.441128 2488 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.441568 kubelet[2488]: E0312 23:42:15.441544 2488 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.4.241:6443/api/v1/nodes\": dial tcp 10.0.4.241:6443: connect: connection refused" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:15.523927 containerd[1626]: time="2026-03-12T23:42:15.523878201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-27aefdfc79,Uid:3dadde19eb176ad149b114910369edcb,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:15.538388 containerd[1626]: time="2026-03-12T23:42:15.537996109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-27aefdfc79,Uid:615c97b4dc9204c05f0d693b702b0e0f,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:15.541855 containerd[1626]: time="2026-03-12T23:42:15.541816168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-27aefdfc79,Uid:259ce4b96e2ad7916ac0952ad444a9b0,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:15.542842 containerd[1626]: time="2026-03-12T23:42:15.542809653Z" level=info msg="connecting to shim e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410" address="unix:///run/containerd/s/76a1168e961f60a709a71af71a886240693eaee12a9545b7c7b2e768965fa957" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:15.567476 systemd[1]: Started cri-containerd-e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410.scope - libcontainer container e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410.
Mar 12 23:42:15.569220 containerd[1626]: time="2026-03-12T23:42:15.568069415Z" level=info msg="connecting to shim 6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d" address="unix:///run/containerd/s/c2fbe77d15511d197e932da34b2a13ba989d460dc39f0ab7adaf9ba343f5ccc1" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:15.591956 containerd[1626]: time="2026-03-12T23:42:15.591913731Z" level=info msg="connecting to shim 1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974" address="unix:///run/containerd/s/0d49336a1967a7ea33df0f9953ab76eb5eba8f826e37be83a80468c0921e10c3" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:15.594446 systemd[1]: Started cri-containerd-6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d.scope - libcontainer container 6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d.
Mar 12 23:42:15.611583 containerd[1626]: time="2026-03-12T23:42:15.611459186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-27aefdfc79,Uid:3dadde19eb176ad149b114910369edcb,Namespace:kube-system,Attempt:0,} returns sandbox id \"e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410\""
Mar 12 23:42:15.616792 containerd[1626]: time="2026-03-12T23:42:15.616752492Z" level=info msg="CreateContainer within sandbox \"e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 12 23:42:15.618487 systemd[1]: Started cri-containerd-1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974.scope - libcontainer container 1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974.
Mar 12 23:42:15.629562 containerd[1626]: time="2026-03-12T23:42:15.629493953Z" level=info msg="Container b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:15.638583 containerd[1626]: time="2026-03-12T23:42:15.638399477Z" level=info msg="CreateContainer within sandbox \"e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7\""
Mar 12 23:42:15.639498 containerd[1626]: time="2026-03-12T23:42:15.639461962Z" level=info msg="StartContainer for \"b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7\""
Mar 12 23:42:15.640636 containerd[1626]: time="2026-03-12T23:42:15.640548367Z" level=info msg="connecting to shim b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7" address="unix:///run/containerd/s/76a1168e961f60a709a71af71a886240693eaee12a9545b7c7b2e768965fa957" protocol=ttrpc version=3
Mar 12 23:42:15.645295 containerd[1626]: time="2026-03-12T23:42:15.643626102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-27aefdfc79,Uid:615c97b4dc9204c05f0d693b702b0e0f,Namespace:kube-system,Attempt:0,} returns sandbox id \"6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d\""
Mar 12 23:42:15.649801 containerd[1626]: time="2026-03-12T23:42:15.649483651Z" level=info msg="CreateContainer within sandbox \"6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 12 23:42:15.661413 containerd[1626]: time="2026-03-12T23:42:15.661368508Z" level=info msg="Container 71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:15.666521 systemd[1]: Started cri-containerd-b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7.scope - libcontainer container b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7.
Mar 12 23:42:15.669821 containerd[1626]: time="2026-03-12T23:42:15.669776389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-27aefdfc79,Uid:259ce4b96e2ad7916ac0952ad444a9b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974\""
Mar 12 23:42:15.671091 containerd[1626]: time="2026-03-12T23:42:15.671055195Z" level=info msg="CreateContainer within sandbox \"6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95\""
Mar 12 23:42:15.671753 containerd[1626]: time="2026-03-12T23:42:15.671719439Z" level=info msg="StartContainer for \"71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95\""
Mar 12 23:42:15.673784 containerd[1626]: time="2026-03-12T23:42:15.673486687Z" level=info msg="connecting to shim 71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95" address="unix:///run/containerd/s/c2fbe77d15511d197e932da34b2a13ba989d460dc39f0ab7adaf9ba343f5ccc1" protocol=ttrpc version=3
Mar 12 23:42:15.674041 containerd[1626]: time="2026-03-12T23:42:15.674010050Z" level=info msg="CreateContainer within sandbox \"1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 12 23:42:15.684237 containerd[1626]: time="2026-03-12T23:42:15.684075138Z" level=info msg="Container a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:15.687384 kubelet[2488]: E0312 23:42:15.686786 2488 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.4.241:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-27aefdfc79?timeout=10s\": dial tcp 10.0.4.241:6443: connect: connection refused" interval="800ms"
Mar 12 23:42:15.691402 containerd[1626]: time="2026-03-12T23:42:15.691359694Z" level=info msg="CreateContainer within sandbox \"1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5\""
Mar 12 23:42:15.691895 containerd[1626]: time="2026-03-12T23:42:15.691874736Z" level=info msg="StartContainer for \"a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5\""
Mar 12 23:42:15.693460 containerd[1626]: time="2026-03-12T23:42:15.693393624Z" level=info msg="connecting to shim a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5" address="unix:///run/containerd/s/0d49336a1967a7ea33df0f9953ab76eb5eba8f826e37be83a80468c0921e10c3" protocol=ttrpc version=3
Mar 12 23:42:15.697501 systemd[1]: Started cri-containerd-71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95.scope - libcontainer container 71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95.
Mar 12 23:42:15.714232 containerd[1626]: time="2026-03-12T23:42:15.714192045Z" level=info msg="StartContainer for \"b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7\" returns successfully"
Mar 12 23:42:15.714431 systemd[1]: Started cri-containerd-a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5.scope - libcontainer container a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5.
Mar 12 23:42:15.754516 containerd[1626]: time="2026-03-12T23:42:15.754470920Z" level=info msg="StartContainer for \"71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95\" returns successfully"
Mar 12 23:42:15.759422 containerd[1626]: time="2026-03-12T23:42:15.759223103Z" level=info msg="StartContainer for \"a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5\" returns successfully"
Mar 12 23:42:15.844715 kubelet[2488]: I0312 23:42:15.844671 2488 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:16.107815 kubelet[2488]: E0312 23:42:16.107716 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:16.111391 kubelet[2488]: E0312 23:42:16.111359 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:16.112971 kubelet[2488]: E0312 23:42:16.112949 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.040989 kubelet[2488]: E0312 23:42:17.040949 2488 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.064851 kubelet[2488]: I0312 23:42:17.064821 2488 apiserver.go:52] "Watching apiserver"
Mar 12 23:42:17.084322 kubelet[2488]: I0312 23:42:17.084261 2488 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 12 23:42:17.118152 kubelet[2488]: E0312 23:42:17.118070 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.118280 kubelet[2488]: E0312 23:42:17.118224 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.118524 kubelet[2488]: E0312 23:42:17.118508 2488 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-27aefdfc79\" not found" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.122732 kubelet[2488]: I0312 23:42:17.122697 2488 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.185347 kubelet[2488]: I0312 23:42:17.185076 2488 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.198356 kubelet[2488]: E0312 23:42:17.198294 2488 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.198356 kubelet[2488]: I0312 23:42:17.198325 2488 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.203697 kubelet[2488]: E0312 23:42:17.203655 2488 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.203697 kubelet[2488]: I0312 23:42:17.203681 2488 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:17.205785 kubelet[2488]: E0312 23:42:17.205743 2488 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-27aefdfc79\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:18.119320 kubelet[2488]: I0312 23:42:18.118815 2488 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:18.119320 kubelet[2488]: I0312 23:42:18.119125 2488 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:19.141208 kubelet[2488]: I0312 23:42:19.141158 2488 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:19.247758 systemd[1]: Reload requested from client PID 2774 ('systemctl') (unit session-7.scope)...
Mar 12 23:42:19.247775 systemd[1]: Reloading...
Mar 12 23:42:19.315374 zram_generator::config[2817]: No configuration found.
Mar 12 23:42:19.491527 systemd[1]: Reloading finished in 243 ms.
Mar 12 23:42:19.523094 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:42:19.534612 systemd[1]: kubelet.service: Deactivated successfully.
Mar 12 23:42:19.535118 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:42:19.535258 systemd[1]: kubelet.service: Consumed 713ms CPU time, 128.4M memory peak.
Mar 12 23:42:19.537029 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 23:42:19.776383 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 23:42:19.781961 (kubelet)[2862]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 23:42:19.817123 kubelet[2862]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:42:19.817123 kubelet[2862]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 23:42:19.817123 kubelet[2862]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 23:42:19.817482 kubelet[2862]: I0312 23:42:19.817167 2862 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 23:42:19.824246 kubelet[2862]: I0312 23:42:19.824201 2862 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 12 23:42:19.824246 kubelet[2862]: I0312 23:42:19.824232 2862 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 23:42:19.824477 kubelet[2862]: I0312 23:42:19.824459 2862 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 23:42:19.825802 kubelet[2862]: I0312 23:42:19.825786 2862 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 12 23:42:19.827996 kubelet[2862]: I0312 23:42:19.827977 2862 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 23:42:19.831324 kubelet[2862]: I0312 23:42:19.831302 2862 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 23:42:19.836131 kubelet[2862]: I0312 23:42:19.836108 2862 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 12 23:42:19.836321 kubelet[2862]: I0312 23:42:19.836293 2862 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 23:42:19.836459 kubelet[2862]: I0312 23:42:19.836321 2862 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-27aefdfc79","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 23:42:19.836531 kubelet[2862]: I0312 23:42:19.836462 2862 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 23:42:19.836531 kubelet[2862]: I0312 23:42:19.836471 2862 container_manager_linux.go:303] "Creating device plugin manager"
Mar 12 23:42:19.836531 kubelet[2862]: I0312 23:42:19.836514 2862 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:42:19.836660 kubelet[2862]: I0312 23:42:19.836645 2862 kubelet.go:480] "Attempting to sync node with API server"
Mar 12 23:42:19.836685 kubelet[2862]: I0312 23:42:19.836661 2862 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 23:42:19.836685 kubelet[2862]: I0312 23:42:19.836683 2862 kubelet.go:386] "Adding apiserver pod source"
Mar 12 23:42:19.836723 kubelet[2862]: I0312 23:42:19.836698 2862 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 23:42:19.840707 kubelet[2862]: I0312 23:42:19.840673 2862 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 12 23:42:19.841397 kubelet[2862]: I0312 23:42:19.841373 2862 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 23:42:19.844351 kubelet[2862]: I0312 23:42:19.844100 2862 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 12 23:42:19.844351 kubelet[2862]: I0312 23:42:19.844145 2862 server.go:1289] "Started kubelet"
Mar 12 23:42:19.845350 kubelet[2862]: I0312 23:42:19.845324 2862 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 23:42:19.845525 kubelet[2862]: I0312 23:42:19.845485 2862 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 23:42:19.846377 kubelet[2862]: I0312 23:42:19.846219 2862 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 23:42:19.849678 kubelet[2862]: I0312 23:42:19.849610 2862 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 23:42:19.849845 kubelet[2862]: I0312 23:42:19.849819 2862 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 23:42:19.850103 kubelet[2862]: I0312 23:42:19.850086 2862 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 23:42:19.850410 kubelet[2862]: I0312 23:42:19.850343 2862 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 23:42:19.854348 kubelet[2862]: I0312 23:42:19.854326 2862 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 23:42:19.854553 kubelet[2862]: I0312 23:42:19.854541 2862 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 23:42:19.854608 kubelet[2862]: E0312 23:42:19.850461 2862 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-27aefdfc79\" not found"
Mar 12 23:42:19.989849 kubelet[2862]: E0312 23:42:19.989502 2862 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 23:42:19.989849 kubelet[2862]: I0312 23:42:19.989696 2862 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 12 23:42:19.992934 kubelet[2862]: I0312 23:42:19.991613 2862 factory.go:223] Registration of the containerd container factory successfully
Mar 12 23:42:19.992934 kubelet[2862]: I0312 23:42:19.991634 2862 factory.go:223] Registration of the systemd container factory successfully
Mar 12 23:42:19.992934 kubelet[2862]: I0312 23:42:19.991732 2862 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 23:42:20.001073 kubelet[2862]: I0312 23:42:20.001025 2862 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 12 23:42:20.001073 kubelet[2862]: I0312 23:42:20.001058 2862 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 12 23:42:20.001216 kubelet[2862]: I0312 23:42:20.001084 2862 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 23:42:20.001216 kubelet[2862]: I0312 23:42:20.001092 2862 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 12 23:42:20.001216 kubelet[2862]: E0312 23:42:20.001143 2862 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 23:42:20.030807 kubelet[2862]: I0312 23:42:20.030667 2862 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 23:42:20.030807 kubelet[2862]: I0312 23:42:20.030690 2862 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 23:42:20.030807 kubelet[2862]: I0312 23:42:20.030711 2862 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 23:42:20.031711 kubelet[2862]: I0312 23:42:20.031669 2862 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 12 23:42:20.031711 kubelet[2862]: I0312 23:42:20.031690 2862 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 12 23:42:20.031711 kubelet[2862]: I0312 23:42:20.031709 2862 policy_none.go:49] "None policy: Start"
Mar 12 23:42:20.031793 kubelet[2862]: I0312 23:42:20.031717 2862 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 12 23:42:20.031793 kubelet[2862]: I0312 23:42:20.031728 2862 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 23:42:20.031846 kubelet[2862]: I0312 23:42:20.031826 2862 state_mem.go:75] "Updated machine memory state"
Mar 12 23:42:20.035451 kubelet[2862]: E0312 23:42:20.035420 2862 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 23:42:20.035604 kubelet[2862]: I0312 23:42:20.035575 2862 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 23:42:20.035637 kubelet[2862]: I0312 23:42:20.035594 2862 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 23:42:20.035908 kubelet[2862]: I0312 23:42:20.035886 2862 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 23:42:20.038917 kubelet[2862]: E0312 23:42:20.037748 2862 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 12 23:42:20.102053 kubelet[2862]: I0312 23:42:20.102005 2862 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.102156 kubelet[2862]: I0312 23:42:20.102122 2862 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.102533 kubelet[2862]: I0312 23:42:20.102466 2862 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.109040 kubelet[2862]: E0312 23:42:20.108970 2862 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.109620 kubelet[2862]: E0312 23:42:20.109337 2862 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-27aefdfc79\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.109620 kubelet[2862]: E0312 23:42:20.109582 2862 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.142592 kubelet[2862]: I0312 23:42:20.142534 2862 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.155901 kubelet[2862]: I0312 23:42:20.155848 2862 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.156020 kubelet[2862]: I0312 23:42:20.155941 2862 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.190904 kubelet[2862]: I0312 23:42:20.190798 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/259ce4b96e2ad7916ac0952ad444a9b0-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-27aefdfc79\" (UID: \"259ce4b96e2ad7916ac0952ad444a9b0\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191076 kubelet[2862]: I0312 23:42:20.190999 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3dadde19eb176ad149b114910369edcb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" (UID: \"3dadde19eb176ad149b114910369edcb\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191076 kubelet[2862]: I0312 23:42:20.191048 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191076 kubelet[2862]: I0312 23:42:20.191066 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191076 kubelet[2862]: I0312 23:42:20.191080 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191342 kubelet[2862]: I0312 23:42:20.191127 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3dadde19eb176ad149b114910369edcb-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" (UID: \"3dadde19eb176ad149b114910369edcb\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191342 kubelet[2862]: I0312 23:42:20.191146 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3dadde19eb176ad149b114910369edcb-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" (UID: \"3dadde19eb176ad149b114910369edcb\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191342 kubelet[2862]: I0312 23:42:20.191159 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.191342 kubelet[2862]: I0312 23:42:20.191202 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/615c97b4dc9204c05f0d693b702b0e0f-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-27aefdfc79\" (UID: \"615c97b4dc9204c05f0d693b702b0e0f\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:20.839065 kubelet[2862]: I0312 23:42:20.838977 2862 apiserver.go:52] "Watching apiserver"
Mar 12 23:42:20.854766 kubelet[2862]: I0312 23:42:20.854659 2862 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 12 23:42:21.015806 kubelet[2862]: I0312 23:42:21.015540 2862 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:21.015806 kubelet[2862]: I0312 23:42:21.015622 2862 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:21.025655 kubelet[2862]: E0312 23:42:21.024366 2862 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-27aefdfc79\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:21.025655 kubelet[2862]: E0312 23:42:21.024645 2862 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-27aefdfc79\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79"
Mar 12 23:42:21.037427 kubelet[2862]: I0312 23:42:21.037350 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-27aefdfc79" podStartSLOduration=3.037318771 podStartE2EDuration="3.037318771s" podCreationTimestamp="2026-03-12 23:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:42:21.036772408 +0000 UTC m=+1.250656553" watchObservedRunningTime="2026-03-12 23:42:21.037318771 +0000 UTC m=+1.251202916"
Mar 12 23:42:21.046057 kubelet[2862]: I0312 23:42:21.046009 2862
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-27aefdfc79" podStartSLOduration=2.045995613 podStartE2EDuration="2.045995613s" podCreationTimestamp="2026-03-12 23:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:42:21.045979893 +0000 UTC m=+1.259863998" watchObservedRunningTime="2026-03-12 23:42:21.045995613 +0000 UTC m=+1.259879758" Mar 12 23:42:21.060510 kubelet[2862]: I0312 23:42:21.060150 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-27aefdfc79" podStartSLOduration=3.060135242 podStartE2EDuration="3.060135242s" podCreationTimestamp="2026-03-12 23:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:42:21.057570509 +0000 UTC m=+1.271454654" watchObservedRunningTime="2026-03-12 23:42:21.060135242 +0000 UTC m=+1.274019347" Mar 12 23:42:26.331138 kubelet[2862]: I0312 23:42:26.331107 2862 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 12 23:42:26.331782 containerd[1626]: time="2026-03-12T23:42:26.331687238Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 12 23:42:26.332135 kubelet[2862]: I0312 23:42:26.331857 2862 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 12 23:42:27.234851 systemd[1]: Created slice kubepods-besteffort-podd9183f45_b6c6_41fd_8de1_4fc051b2b247.slice - libcontainer container kubepods-besteffort-podd9183f45_b6c6_41fd_8de1_4fc051b2b247.slice. 
Mar 12 23:42:27.334487 kubelet[2862]: I0312 23:42:27.334433 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d9183f45-b6c6-41fd-8de1-4fc051b2b247-kube-proxy\") pod \"kube-proxy-bmk9b\" (UID: \"d9183f45-b6c6-41fd-8de1-4fc051b2b247\") " pod="kube-system/kube-proxy-bmk9b"
Mar 12 23:42:27.334964 kubelet[2862]: I0312 23:42:27.334518 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9183f45-b6c6-41fd-8de1-4fc051b2b247-lib-modules\") pod \"kube-proxy-bmk9b\" (UID: \"d9183f45-b6c6-41fd-8de1-4fc051b2b247\") " pod="kube-system/kube-proxy-bmk9b"
Mar 12 23:42:27.334964 kubelet[2862]: I0312 23:42:27.334703 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d9183f45-b6c6-41fd-8de1-4fc051b2b247-xtables-lock\") pod \"kube-proxy-bmk9b\" (UID: \"d9183f45-b6c6-41fd-8de1-4fc051b2b247\") " pod="kube-system/kube-proxy-bmk9b"
Mar 12 23:42:27.334964 kubelet[2862]: I0312 23:42:27.334733 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmgl\" (UniqueName: \"kubernetes.io/projected/d9183f45-b6c6-41fd-8de1-4fc051b2b247-kube-api-access-kvmgl\") pod \"kube-proxy-bmk9b\" (UID: \"d9183f45-b6c6-41fd-8de1-4fc051b2b247\") " pod="kube-system/kube-proxy-bmk9b"
Mar 12 23:42:27.348733 systemd[1]: Created slice kubepods-besteffort-podf417acc2_68bd_4197_af5c_6d971908b99c.slice - libcontainer container kubepods-besteffort-podf417acc2_68bd_4197_af5c_6d971908b99c.slice.
Mar 12 23:42:27.436058 kubelet[2862]: I0312 23:42:27.435505 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxv9c\" (UniqueName: \"kubernetes.io/projected/f417acc2-68bd-4197-af5c-6d971908b99c-kube-api-access-lxv9c\") pod \"tigera-operator-6bf85f8dd-f7j7x\" (UID: \"f417acc2-68bd-4197-af5c-6d971908b99c\") " pod="tigera-operator/tigera-operator-6bf85f8dd-f7j7x"
Mar 12 23:42:27.436058 kubelet[2862]: I0312 23:42:27.435638 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f417acc2-68bd-4197-af5c-6d971908b99c-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-f7j7x\" (UID: \"f417acc2-68bd-4197-af5c-6d971908b99c\") " pod="tigera-operator/tigera-operator-6bf85f8dd-f7j7x"
Mar 12 23:42:27.545979 containerd[1626]: time="2026-03-12T23:42:27.545876093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bmk9b,Uid:d9183f45-b6c6-41fd-8de1-4fc051b2b247,Namespace:kube-system,Attempt:0,}"
Mar 12 23:42:27.565801 containerd[1626]: time="2026-03-12T23:42:27.565761110Z" level=info msg="connecting to shim bacea05e203522e2b4a848a0887cb32eebf592b93daa138bb6ab1bb06d507f49" address="unix:///run/containerd/s/2df5b00aadefdb1291838bb098e2305c0aaf18409d6be68695aba1c37c2e296e" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:27.599469 systemd[1]: Started cri-containerd-bacea05e203522e2b4a848a0887cb32eebf592b93daa138bb6ab1bb06d507f49.scope - libcontainer container bacea05e203522e2b4a848a0887cb32eebf592b93daa138bb6ab1bb06d507f49.
Mar 12 23:42:27.621153 containerd[1626]: time="2026-03-12T23:42:27.621112259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bmk9b,Uid:d9183f45-b6c6-41fd-8de1-4fc051b2b247,Namespace:kube-system,Attempt:0,} returns sandbox id \"bacea05e203522e2b4a848a0887cb32eebf592b93daa138bb6ab1bb06d507f49\""
Mar 12 23:42:27.625932 containerd[1626]: time="2026-03-12T23:42:27.625891522Z" level=info msg="CreateContainer within sandbox \"bacea05e203522e2b4a848a0887cb32eebf592b93daa138bb6ab1bb06d507f49\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 12 23:42:27.635311 containerd[1626]: time="2026-03-12T23:42:27.634395603Z" level=info msg="Container 100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:27.641993 containerd[1626]: time="2026-03-12T23:42:27.641952440Z" level=info msg="CreateContainer within sandbox \"bacea05e203522e2b4a848a0887cb32eebf592b93daa138bb6ab1bb06d507f49\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552\""
Mar 12 23:42:27.642580 containerd[1626]: time="2026-03-12T23:42:27.642503602Z" level=info msg="StartContainer for \"100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552\""
Mar 12 23:42:27.644135 containerd[1626]: time="2026-03-12T23:42:27.644103810Z" level=info msg="connecting to shim 100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552" address="unix:///run/containerd/s/2df5b00aadefdb1291838bb098e2305c0aaf18409d6be68695aba1c37c2e296e" protocol=ttrpc version=3
Mar 12 23:42:27.652947 containerd[1626]: time="2026-03-12T23:42:27.652713052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-f7j7x,Uid:f417acc2-68bd-4197-af5c-6d971908b99c,Namespace:tigera-operator,Attempt:0,}"
Mar 12 23:42:27.669483 systemd[1]: Started cri-containerd-100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552.scope - libcontainer container 100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552.
Mar 12 23:42:27.672564 containerd[1626]: time="2026-03-12T23:42:27.672383028Z" level=info msg="connecting to shim 0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4" address="unix:///run/containerd/s/2aa3b40f6586b8579ea303c76fd9f6699dd8b1e83a16d44c8e6c7ba1439b2fba" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:27.701472 systemd[1]: Started cri-containerd-0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4.scope - libcontainer container 0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4.
Mar 12 23:42:27.740737 containerd[1626]: time="2026-03-12T23:42:27.740698839Z" level=info msg="StartContainer for \"100f2ed0612514bb219a654824827713ef2fd6e9f24140874aae04f9561dd552\" returns successfully"
Mar 12 23:42:27.741934 containerd[1626]: time="2026-03-12T23:42:27.741855965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-f7j7x,Uid:f417acc2-68bd-4197-af5c-6d971908b99c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4\""
Mar 12 23:42:27.743798 containerd[1626]: time="2026-03-12T23:42:27.743770974Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 12 23:42:29.249856 kubelet[2862]: I0312 23:42:29.249738 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bmk9b" podStartSLOduration=2.249721886 podStartE2EDuration="2.249721886s" podCreationTimestamp="2026-03-12 23:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:42:28.04356495 +0000 UTC m=+8.257449055" watchObservedRunningTime="2026-03-12 23:42:29.249721886 +0000 UTC m=+9.463606031"
Mar 12 23:42:29.921758 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount383734937.mount: Deactivated successfully.
Mar 12 23:42:31.673133 containerd[1626]: time="2026-03-12T23:42:31.673074373Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:31.674821 containerd[1626]: time="2026-03-12T23:42:31.674786261Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565"
Mar 12 23:42:31.675231 containerd[1626]: time="2026-03-12T23:42:31.675204983Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:31.677411 containerd[1626]: time="2026-03-12T23:42:31.677372754Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:31.678311 containerd[1626]: time="2026-03-12T23:42:31.678285558Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 3.934483544s"
Mar 12 23:42:31.678351 containerd[1626]: time="2026-03-12T23:42:31.678313758Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\""
Mar 12 23:42:31.682771 containerd[1626]: time="2026-03-12T23:42:31.682733060Z" level=info msg="CreateContainer within sandbox \"0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 12 23:42:31.690293 containerd[1626]: time="2026-03-12T23:42:31.689794374Z" level=info msg="Container b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:31.697513 containerd[1626]: time="2026-03-12T23:42:31.697472011Z" level=info msg="CreateContainer within sandbox \"0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366\""
Mar 12 23:42:31.698505 containerd[1626]: time="2026-03-12T23:42:31.698471936Z" level=info msg="StartContainer for \"b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366\""
Mar 12 23:42:31.699363 containerd[1626]: time="2026-03-12T23:42:31.699336940Z" level=info msg="connecting to shim b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366" address="unix:///run/containerd/s/2aa3b40f6586b8579ea303c76fd9f6699dd8b1e83a16d44c8e6c7ba1439b2fba" protocol=ttrpc version=3
Mar 12 23:42:31.723596 systemd[1]: Started cri-containerd-b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366.scope - libcontainer container b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366.
Mar 12 23:42:31.750290 containerd[1626]: time="2026-03-12T23:42:31.750133307Z" level=info msg="StartContainer for \"b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366\" returns successfully"
Mar 12 23:42:32.050340 kubelet[2862]: I0312 23:42:32.050170 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-f7j7x" podStartSLOduration=1.114743096 podStartE2EDuration="5.050155964s" podCreationTimestamp="2026-03-12 23:42:27 +0000 UTC" firstStartedPulling="2026-03-12 23:42:27.743487013 +0000 UTC m=+7.957371158" lastFinishedPulling="2026-03-12 23:42:31.678899921 +0000 UTC m=+11.892784026" observedRunningTime="2026-03-12 23:42:32.049871122 +0000 UTC m=+12.263755267" watchObservedRunningTime="2026-03-12 23:42:32.050155964 +0000 UTC m=+12.264040109"
Mar 12 23:42:36.882403 sudo[1893]: pam_unix(sudo:session): session closed for user root
Mar 12 23:42:36.976432 sshd[1892]: Connection closed by 20.161.92.111 port 49308
Mar 12 23:42:36.977454 sshd-session[1889]: pam_unix(sshd:session): session closed for user core
Mar 12 23:42:36.983168 systemd-logind[1600]: Session 7 logged out. Waiting for processes to exit.
Mar 12 23:42:36.983734 systemd[1]: sshd@6-10.0.4.241:22-20.161.92.111:49308.service: Deactivated successfully.
Mar 12 23:42:36.985882 systemd[1]: session-7.scope: Deactivated successfully.
Mar 12 23:42:36.986103 systemd[1]: session-7.scope: Consumed 6.042s CPU time, 227.4M memory peak.
Mar 12 23:42:36.987925 systemd-logind[1600]: Removed session 7.
Mar 12 23:42:41.195458 systemd[1]: Created slice kubepods-besteffort-pod0865d8b6_52ff_4de2_b2a6_565656482020.slice - libcontainer container kubepods-besteffort-pod0865d8b6_52ff_4de2_b2a6_565656482020.slice.
Mar 12 23:42:41.243326 systemd[1]: Created slice kubepods-besteffort-podd0888567_7ad6_498e_a459_42bedca5636e.slice - libcontainer container kubepods-besteffort-podd0888567_7ad6_498e_a459_42bedca5636e.slice.
Mar 12 23:42:41.322281 kubelet[2862]: I0312 23:42:41.322217 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0865d8b6-52ff-4de2-b2a6-565656482020-tigera-ca-bundle\") pod \"calico-typha-7bd9f97c6c-tstpd\" (UID: \"0865d8b6-52ff-4de2-b2a6-565656482020\") " pod="calico-system/calico-typha-7bd9f97c6c-tstpd"
Mar 12 23:42:41.322281 kubelet[2862]: I0312 23:42:41.322275 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-nodeproc\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.322869 kubelet[2862]: I0312 23:42:41.322328 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0888567-7ad6-498e-a459-42bedca5636e-tigera-ca-bundle\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.322869 kubelet[2862]: I0312 23:42:41.322364 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-cni-net-dir\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.322869 kubelet[2862]: I0312 23:42:41.322383 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-var-lib-calico\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.322869 kubelet[2862]: I0312 23:42:41.322398 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svwv\" (UniqueName: \"kubernetes.io/projected/d0888567-7ad6-498e-a459-42bedca5636e-kube-api-access-4svwv\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.322869 kubelet[2862]: I0312 23:42:41.322415 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0865d8b6-52ff-4de2-b2a6-565656482020-typha-certs\") pod \"calico-typha-7bd9f97c6c-tstpd\" (UID: \"0865d8b6-52ff-4de2-b2a6-565656482020\") " pod="calico-system/calico-typha-7bd9f97c6c-tstpd"
Mar 12 23:42:41.323111 kubelet[2862]: I0312 23:42:41.322428 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-bpffs\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323111 kubelet[2862]: I0312 23:42:41.322578 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-flexvol-driver-host\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323111 kubelet[2862]: I0312 23:42:41.322625 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d0888567-7ad6-498e-a459-42bedca5636e-node-certs\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323111 kubelet[2862]: I0312 23:42:41.322673 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-cni-bin-dir\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323111 kubelet[2862]: I0312 23:42:41.322715 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-cni-log-dir\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323259 kubelet[2862]: I0312 23:42:41.323212 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-lib-modules\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323259 kubelet[2862]: I0312 23:42:41.323253 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-var-run-calico\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323322 kubelet[2862]: I0312 23:42:41.323286 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnf9\" (UniqueName: \"kubernetes.io/projected/0865d8b6-52ff-4de2-b2a6-565656482020-kube-api-access-tqnf9\") pod \"calico-typha-7bd9f97c6c-tstpd\" (UID: \"0865d8b6-52ff-4de2-b2a6-565656482020\") " pod="calico-system/calico-typha-7bd9f97c6c-tstpd"
Mar 12 23:42:41.323604 kubelet[2862]: I0312 23:42:41.323579 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-policysync\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323662 kubelet[2862]: I0312 23:42:41.323613 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-xtables-lock\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.323662 kubelet[2862]: I0312 23:42:41.323631 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d0888567-7ad6-498e-a459-42bedca5636e-sys-fs\") pod \"calico-node-hr4b9\" (UID: \"d0888567-7ad6-498e-a459-42bedca5636e\") " pod="calico-system/calico-node-hr4b9"
Mar 12 23:42:41.343998 kubelet[2862]: E0312 23:42:41.343759 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:42:41.424453 kubelet[2862]: I0312 23:42:41.424365 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/48bac47d-6b7b-4409-8577-545950ed7262-registration-dir\") pod \"csi-node-driver-f2x6w\" (UID: \"48bac47d-6b7b-4409-8577-545950ed7262\") " pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:42:41.425012 kubelet[2862]: I0312 23:42:41.424960 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/48bac47d-6b7b-4409-8577-545950ed7262-varrun\") pod \"csi-node-driver-f2x6w\" (UID: \"48bac47d-6b7b-4409-8577-545950ed7262\") " pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:42:41.425065 kubelet[2862]: I0312 23:42:41.425020 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48bac47d-6b7b-4409-8577-545950ed7262-kubelet-dir\") pod \"csi-node-driver-f2x6w\" (UID: \"48bac47d-6b7b-4409-8577-545950ed7262\") " pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:42:41.425065 kubelet[2862]: I0312 23:42:41.425038 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/48bac47d-6b7b-4409-8577-545950ed7262-socket-dir\") pod \"csi-node-driver-f2x6w\" (UID: \"48bac47d-6b7b-4409-8577-545950ed7262\") " pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:42:41.425065 kubelet[2862]: I0312 23:42:41.425055 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfzs7\" (UniqueName: \"kubernetes.io/projected/48bac47d-6b7b-4409-8577-545950ed7262-kube-api-access-xfzs7\") pod \"csi-node-driver-f2x6w\" (UID: \"48bac47d-6b7b-4409-8577-545950ed7262\") " pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:42:41.426927 kubelet[2862]: E0312 23:42:41.426898 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.426927 kubelet[2862]: W0312 23:42:41.426923 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.427032 kubelet[2862]: E0312 23:42:41.426948 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.427189 kubelet[2862]: E0312 23:42:41.427174 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.427189 kubelet[2862]: W0312 23:42:41.427188 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.427312 kubelet[2862]: E0312 23:42:41.427198 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.427921 kubelet[2862]: E0312 23:42:41.427841 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.427921 kubelet[2862]: W0312 23:42:41.427858 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.427921 kubelet[2862]: E0312 23:42:41.427872 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.431432 kubelet[2862]: E0312 23:42:41.431411 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.432295 kubelet[2862]: W0312 23:42:41.431545 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.432295 kubelet[2862]: E0312 23:42:41.431565 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.432528 kubelet[2862]: E0312 23:42:41.432515 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.432711 kubelet[2862]: W0312 23:42:41.432588 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.432799 kubelet[2862]: E0312 23:42:41.432772 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.437956 kubelet[2862]: E0312 23:42:41.437912 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.438035 kubelet[2862]: W0312 23:42:41.437953 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.438035 kubelet[2862]: E0312 23:42:41.438003 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.439282 kubelet[2862]: E0312 23:42:41.439241 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.439282 kubelet[2862]: W0312 23:42:41.439255 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.439390 kubelet[2862]: E0312 23:42:41.439360 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.500747 containerd[1626]: time="2026-03-12T23:42:41.500640850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bd9f97c6c-tstpd,Uid:0865d8b6-52ff-4de2-b2a6-565656482020,Namespace:calico-system,Attempt:0,}"
Mar 12 23:42:41.520441 containerd[1626]: time="2026-03-12T23:42:41.520398186Z" level=info msg="connecting to shim 3ddbd145cd2afcd17b9511601fdf23fd92289eb8360138e576a418220e529ce9" address="unix:///run/containerd/s/f6b8ef1dd63f47e7f6a3edcb58c3478d7626121ce3cebe5d91284b27ea94f6d9" namespace=k8s.io protocol=ttrpc version=3
Mar 12 23:42:41.525664 kubelet[2862]: E0312 23:42:41.525522 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.525664 kubelet[2862]: W0312 23:42:41.525543 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.525664 kubelet[2862]: E0312 23:42:41.525562 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.525983 kubelet[2862]: E0312 23:42:41.525853 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.525983 kubelet[2862]: W0312 23:42:41.525865 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.525983 kubelet[2862]: E0312 23:42:41.525874 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.526279 kubelet[2862]: E0312 23:42:41.526116 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.526279 kubelet[2862]: W0312 23:42:41.526130 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.526279 kubelet[2862]: E0312 23:42:41.526139 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.526613 kubelet[2862]: E0312 23:42:41.526424 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.526613 kubelet[2862]: W0312 23:42:41.526437 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.526613 kubelet[2862]: E0312 23:42:41.526446 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.526904 kubelet[2862]: E0312 23:42:41.526747 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.526904 kubelet[2862]: W0312 23:42:41.526759 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.526904 kubelet[2862]: E0312 23:42:41.526769 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:41.527061 kubelet[2862]: E0312 23:42:41.527048 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:41.527114 kubelet[2862]: W0312 23:42:41.527105 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:41.527169 kubelet[2862]: E0312 23:42:41.527158 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 12 23:42:41.527391 kubelet[2862]: E0312 23:42:41.527378 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.527473 kubelet[2862]: W0312 23:42:41.527460 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.527520 kubelet[2862]: E0312 23:42:41.527510 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.527727 kubelet[2862]: E0312 23:42:41.527714 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.527796 kubelet[2862]: W0312 23:42:41.527784 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.527850 kubelet[2862]: E0312 23:42:41.527840 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.528082 kubelet[2862]: E0312 23:42:41.528049 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.528082 kubelet[2862]: W0312 23:42:41.528061 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.528082 kubelet[2862]: E0312 23:42:41.528070 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.529077 kubelet[2862]: E0312 23:42:41.528996 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.529077 kubelet[2862]: W0312 23:42:41.529012 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.529077 kubelet[2862]: E0312 23:42:41.529022 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.529405 kubelet[2862]: E0312 23:42:41.529385 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.529542 kubelet[2862]: W0312 23:42:41.529475 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.529542 kubelet[2862]: E0312 23:42:41.529491 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.529829 kubelet[2862]: E0312 23:42:41.529763 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.529829 kubelet[2862]: W0312 23:42:41.529775 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.529829 kubelet[2862]: E0312 23:42:41.529783 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.530057 kubelet[2862]: E0312 23:42:41.530045 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.530127 kubelet[2862]: W0312 23:42:41.530116 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.530180 kubelet[2862]: E0312 23:42:41.530170 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.530405 kubelet[2862]: E0312 23:42:41.530393 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.530533 kubelet[2862]: W0312 23:42:41.530469 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.530533 kubelet[2862]: E0312 23:42:41.530484 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.531016 kubelet[2862]: E0312 23:42:41.530875 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.531016 kubelet[2862]: W0312 23:42:41.530889 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.531016 kubelet[2862]: E0312 23:42:41.530899 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.531257 kubelet[2862]: E0312 23:42:41.531187 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.531257 kubelet[2862]: W0312 23:42:41.531198 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.531257 kubelet[2862]: E0312 23:42:41.531208 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.531630 kubelet[2862]: E0312 23:42:41.531559 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.531630 kubelet[2862]: W0312 23:42:41.531572 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.531630 kubelet[2862]: E0312 23:42:41.531582 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.531944 kubelet[2862]: E0312 23:42:41.531925 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.532109 kubelet[2862]: W0312 23:42:41.531998 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.532109 kubelet[2862]: E0312 23:42:41.532013 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.532324 kubelet[2862]: E0312 23:42:41.532311 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.532498 kubelet[2862]: W0312 23:42:41.532429 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.532498 kubelet[2862]: E0312 23:42:41.532446 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.532890 kubelet[2862]: E0312 23:42:41.532876 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.532983 kubelet[2862]: W0312 23:42:41.532970 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.533033 kubelet[2862]: E0312 23:42:41.533023 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.533330 kubelet[2862]: E0312 23:42:41.533316 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.533643 kubelet[2862]: W0312 23:42:41.533504 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.533643 kubelet[2862]: E0312 23:42:41.533577 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.533953 kubelet[2862]: E0312 23:42:41.533870 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.533953 kubelet[2862]: W0312 23:42:41.533881 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.533953 kubelet[2862]: E0312 23:42:41.533891 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.534291 kubelet[2862]: E0312 23:42:41.534245 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.534291 kubelet[2862]: W0312 23:42:41.534257 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.534529 kubelet[2862]: E0312 23:42:41.534447 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.534765 kubelet[2862]: E0312 23:42:41.534750 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.534898 kubelet[2862]: W0312 23:42:41.534816 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.534898 kubelet[2862]: E0312 23:42:41.534832 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.535120 kubelet[2862]: E0312 23:42:41.535109 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.535190 kubelet[2862]: W0312 23:42:41.535178 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.535244 kubelet[2862]: E0312 23:42:41.535233 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:41.542461 systemd[1]: Started cri-containerd-3ddbd145cd2afcd17b9511601fdf23fd92289eb8360138e576a418220e529ce9.scope - libcontainer container 3ddbd145cd2afcd17b9511601fdf23fd92289eb8360138e576a418220e529ce9. Mar 12 23:42:41.546834 containerd[1626]: time="2026-03-12T23:42:41.546780834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hr4b9,Uid:d0888567-7ad6-498e-a459-42bedca5636e,Namespace:calico-system,Attempt:0,}" Mar 12 23:42:41.547529 kubelet[2862]: E0312 23:42:41.547506 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:41.547529 kubelet[2862]: W0312 23:42:41.547525 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:41.547746 kubelet[2862]: E0312 23:42:41.547543 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:41.570739 containerd[1626]: time="2026-03-12T23:42:41.570699231Z" level=info msg="connecting to shim 1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d" address="unix:///run/containerd/s/4fb4d9f1bc3900454a2270c68817154eec9656a1004480fdfd994fa7c5a6f31b" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:42:41.582058 containerd[1626]: time="2026-03-12T23:42:41.582008126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bd9f97c6c-tstpd,Uid:0865d8b6-52ff-4de2-b2a6-565656482020,Namespace:calico-system,Attempt:0,} returns sandbox id \"3ddbd145cd2afcd17b9511601fdf23fd92289eb8360138e576a418220e529ce9\"" Mar 12 23:42:41.583887 containerd[1626]: time="2026-03-12T23:42:41.583853534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 12 23:42:41.596586 systemd[1]: Started cri-containerd-1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d.scope - libcontainer container 1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d. Mar 12 23:42:41.620760 containerd[1626]: time="2026-03-12T23:42:41.620651113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hr4b9,Uid:d0888567-7ad6-498e-a459-42bedca5636e,Namespace:calico-system,Attempt:0,} returns sandbox id \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\"" Mar 12 23:42:43.001914 kubelet[2862]: E0312 23:42:43.001826 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:43.039969 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1471524225.mount: Deactivated successfully. 
Mar 12 23:42:43.738127 containerd[1626]: time="2026-03-12T23:42:43.737617590Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:43.738779 containerd[1626]: time="2026-03-12T23:42:43.738607475Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Mar 12 23:42:43.740551 containerd[1626]: time="2026-03-12T23:42:43.740495404Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:43.744068 containerd[1626]: time="2026-03-12T23:42:43.743821500Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:42:43.744462 containerd[1626]: time="2026-03-12T23:42:43.744442303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.160556008s" Mar 12 23:42:43.744517 containerd[1626]: time="2026-03-12T23:42:43.744468343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Mar 12 23:42:43.745846 containerd[1626]: time="2026-03-12T23:42:43.745812270Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 12 23:42:43.753699 containerd[1626]: time="2026-03-12T23:42:43.752605903Z" level=info msg="CreateContainer within sandbox \"3ddbd145cd2afcd17b9511601fdf23fd92289eb8360138e576a418220e529ce9\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 12 23:42:43.765355 containerd[1626]: time="2026-03-12T23:42:43.765312044Z" level=info msg="Container 27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:42:43.774513 containerd[1626]: time="2026-03-12T23:42:43.774457529Z" level=info msg="CreateContainer within sandbox \"3ddbd145cd2afcd17b9511601fdf23fd92289eb8360138e576a418220e529ce9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586\"" Mar 12 23:42:43.775344 containerd[1626]: time="2026-03-12T23:42:43.775321973Z" level=info msg="StartContainer for \"27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586\"" Mar 12 23:42:43.778403 containerd[1626]: time="2026-03-12T23:42:43.778359428Z" level=info msg="connecting to shim 27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586" address="unix:///run/containerd/s/f6b8ef1dd63f47e7f6a3edcb58c3478d7626121ce3cebe5d91284b27ea94f6d9" protocol=ttrpc version=3 Mar 12 23:42:43.798429 systemd[1]: Started cri-containerd-27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586.scope - libcontainer container 27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586. 
Mar 12 23:42:43.834636 containerd[1626]: time="2026-03-12T23:42:43.834599741Z" level=info msg="StartContainer for \"27e4184badf250301ed713530092eaa8a71822fc7e0ded807a416c220ace3586\" returns successfully" Mar 12 23:42:44.075128 kubelet[2862]: I0312 23:42:44.074965 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7bd9f97c6c-tstpd" podStartSLOduration=0.912897411 podStartE2EDuration="3.074948587s" podCreationTimestamp="2026-03-12 23:42:41 +0000 UTC" firstStartedPulling="2026-03-12 23:42:41.583619733 +0000 UTC m=+21.797503878" lastFinishedPulling="2026-03-12 23:42:43.745670909 +0000 UTC m=+23.959555054" observedRunningTime="2026-03-12 23:42:44.074778746 +0000 UTC m=+24.288662891" watchObservedRunningTime="2026-03-12 23:42:44.074948587 +0000 UTC m=+24.288832732" Mar 12 23:42:44.098712 kubelet[2862]: E0312 23:42:44.098665 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.098712 kubelet[2862]: W0312 23:42:44.098690 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.098712 kubelet[2862]: E0312 23:42:44.098708 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:44.098904 kubelet[2862]: E0312 23:42:44.098877 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.098937 kubelet[2862]: W0312 23:42:44.098888 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.098937 kubelet[2862]: E0312 23:42:44.098920 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:44.099103 kubelet[2862]: E0312 23:42:44.099064 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.099103 kubelet[2862]: W0312 23:42:44.099081 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.099103 kubelet[2862]: E0312 23:42:44.099095 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:44.099261 kubelet[2862]: E0312 23:42:44.099224 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.099261 kubelet[2862]: W0312 23:42:44.099232 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.099261 kubelet[2862]: E0312 23:42:44.099248 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:44.099458 kubelet[2862]: E0312 23:42:44.099431 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.099458 kubelet[2862]: W0312 23:42:44.099445 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.099507 kubelet[2862]: E0312 23:42:44.099466 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:44.099621 kubelet[2862]: E0312 23:42:44.099598 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.099621 kubelet[2862]: W0312 23:42:44.099619 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.099673 kubelet[2862]: E0312 23:42:44.099628 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:44.099759 kubelet[2862]: E0312 23:42:44.099747 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.099759 kubelet[2862]: W0312 23:42:44.099757 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.099815 kubelet[2862]: E0312 23:42:44.099765 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:44.099942 kubelet[2862]: E0312 23:42:44.099916 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.099942 kubelet[2862]: W0312 23:42:44.099932 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.099942 kubelet[2862]: E0312 23:42:44.099940 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:44.100091 kubelet[2862]: E0312 23:42:44.100079 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.100155 kubelet[2862]: W0312 23:42:44.100091 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.100155 kubelet[2862]: E0312 23:42:44.100111 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 23:42:44.100255 kubelet[2862]: E0312 23:42:44.100242 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.100294 kubelet[2862]: W0312 23:42:44.100260 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.100294 kubelet[2862]: E0312 23:42:44.100282 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 23:42:44.100510 kubelet[2862]: E0312 23:42:44.100496 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 23:42:44.100510 kubelet[2862]: W0312 23:42:44.100508 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 23:42:44.100568 kubelet[2862]: E0312 23:42:44.100517 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 12 23:42:44.100713 kubelet[2862]: E0312 23:42:44.100673 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.100748 kubelet[2862]: W0312 23:42:44.100715 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.100748 kubelet[2862]: E0312 23:42:44.100726 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.100942 kubelet[2862]: E0312 23:42:44.100900 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.100942 kubelet[2862]: W0312 23:42:44.100925 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.100998 kubelet[2862]: E0312 23:42:44.100934 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.101127 kubelet[2862]: E0312 23:42:44.101104 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.101127 kubelet[2862]: W0312 23:42:44.101124 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.101174 kubelet[2862]: E0312 23:42:44.101132 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.101311 kubelet[2862]: E0312 23:42:44.101299 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.101311 kubelet[2862]: W0312 23:42:44.101308 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.101368 kubelet[2862]: E0312 23:42:44.101316 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.146765 kubelet[2862]: E0312 23:42:44.146722 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.146765 kubelet[2862]: W0312 23:42:44.146745 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.146765 kubelet[2862]: E0312 23:42:44.146761 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.146969 kubelet[2862]: E0312 23:42:44.146954 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.146969 kubelet[2862]: W0312 23:42:44.146966 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.147018 kubelet[2862]: E0312 23:42:44.146975 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.147241 kubelet[2862]: E0312 23:42:44.147220 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.147344 kubelet[2862]: W0312 23:42:44.147240 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.147344 kubelet[2862]: E0312 23:42:44.147275 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.147508 kubelet[2862]: E0312 23:42:44.147482 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.147508 kubelet[2862]: W0312 23:42:44.147500 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.147508 kubelet[2862]: E0312 23:42:44.147508 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.147747 kubelet[2862]: E0312 23:42:44.147735 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.147747 kubelet[2862]: W0312 23:42:44.147747 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.147802 kubelet[2862]: E0312 23:42:44.147755 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.147962 kubelet[2862]: E0312 23:42:44.147948 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.147962 kubelet[2862]: W0312 23:42:44.147959 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.148016 kubelet[2862]: E0312 23:42:44.147968 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.148189 kubelet[2862]: E0312 23:42:44.148145 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.148189 kubelet[2862]: W0312 23:42:44.148167 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.148189 kubelet[2862]: E0312 23:42:44.148177 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.148382 kubelet[2862]: E0312 23:42:44.148365 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.148382 kubelet[2862]: W0312 23:42:44.148379 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.148437 kubelet[2862]: E0312 23:42:44.148390 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.148547 kubelet[2862]: E0312 23:42:44.148533 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.148547 kubelet[2862]: W0312 23:42:44.148544 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.148601 kubelet[2862]: E0312 23:42:44.148552 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.148713 kubelet[2862]: E0312 23:42:44.148702 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.148713 kubelet[2862]: W0312 23:42:44.148712 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.148760 kubelet[2862]: E0312 23:42:44.148720 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.148883 kubelet[2862]: E0312 23:42:44.148872 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.148927 kubelet[2862]: W0312 23:42:44.148883 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.148927 kubelet[2862]: E0312 23:42:44.148890 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.149188 kubelet[2862]: E0312 23:42:44.149152 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.149188 kubelet[2862]: W0312 23:42:44.149170 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.149188 kubelet[2862]: E0312 23:42:44.149183 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.149400 kubelet[2862]: E0312 23:42:44.149346 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.149400 kubelet[2862]: W0312 23:42:44.149359 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.149400 kubelet[2862]: E0312 23:42:44.149368 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.149560 kubelet[2862]: E0312 23:42:44.149530 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.149560 kubelet[2862]: W0312 23:42:44.149542 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.149560 kubelet[2862]: E0312 23:42:44.149551 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.149739 kubelet[2862]: E0312 23:42:44.149727 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.149739 kubelet[2862]: W0312 23:42:44.149738 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.149786 kubelet[2862]: E0312 23:42:44.149746 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.149936 kubelet[2862]: E0312 23:42:44.149921 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.149936 kubelet[2862]: W0312 23:42:44.149933 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.149990 kubelet[2862]: E0312 23:42:44.149942 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.150118 kubelet[2862]: E0312 23:42:44.150104 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.150118 kubelet[2862]: W0312 23:42:44.150117 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.150170 kubelet[2862]: E0312 23:42:44.150126 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:44.150300 kubelet[2862]: E0312 23:42:44.150287 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:44.150300 kubelet[2862]: W0312 23:42:44.150299 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:44.150353 kubelet[2862]: E0312 23:42:44.150309 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.002283 kubelet[2862]: E0312 23:42:45.002233 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:42:45.064473 kubelet[2862]: I0312 23:42:45.064446 2862 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:42:45.108383 kubelet[2862]: E0312 23:42:45.108345 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.108383 kubelet[2862]: W0312 23:42:45.108368 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.108383 kubelet[2862]: E0312 23:42:45.108387 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.108719 kubelet[2862]: E0312 23:42:45.108566 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.108719 kubelet[2862]: W0312 23:42:45.108575 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.108719 kubelet[2862]: E0312 23:42:45.108583 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.108791 kubelet[2862]: E0312 23:42:45.108731 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.108791 kubelet[2862]: W0312 23:42:45.108739 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.108791 kubelet[2862]: E0312 23:42:45.108750 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.108922 kubelet[2862]: E0312 23:42:45.108877 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.108922 kubelet[2862]: W0312 23:42:45.108888 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.108922 kubelet[2862]: E0312 23:42:45.108896 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109077 kubelet[2862]: E0312 23:42:45.109054 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109077 kubelet[2862]: W0312 23:42:45.109066 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109077 kubelet[2862]: E0312 23:42:45.109074 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109208 kubelet[2862]: E0312 23:42:45.109189 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109208 kubelet[2862]: W0312 23:42:45.109200 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109208 kubelet[2862]: E0312 23:42:45.109207 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109349 kubelet[2862]: E0312 23:42:45.109338 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109380 kubelet[2862]: W0312 23:42:45.109350 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109380 kubelet[2862]: E0312 23:42:45.109357 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109553 kubelet[2862]: E0312 23:42:45.109529 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109553 kubelet[2862]: W0312 23:42:45.109541 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109553 kubelet[2862]: E0312 23:42:45.109549 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109687 kubelet[2862]: E0312 23:42:45.109676 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109687 kubelet[2862]: W0312 23:42:45.109686 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109732 kubelet[2862]: E0312 23:42:45.109694 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109815 kubelet[2862]: E0312 23:42:45.109805 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109840 kubelet[2862]: W0312 23:42:45.109815 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109840 kubelet[2862]: E0312 23:42:45.109822 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.109945 kubelet[2862]: E0312 23:42:45.109936 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.109973 kubelet[2862]: W0312 23:42:45.109945 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.109973 kubelet[2862]: E0312 23:42:45.109952 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.110079 kubelet[2862]: E0312 23:42:45.110069 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.110101 kubelet[2862]: W0312 23:42:45.110078 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.110101 kubelet[2862]: E0312 23:42:45.110086 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.110226 kubelet[2862]: E0312 23:42:45.110215 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.110250 kubelet[2862]: W0312 23:42:45.110225 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.110250 kubelet[2862]: E0312 23:42:45.110233 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.110376 kubelet[2862]: E0312 23:42:45.110365 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.110397 kubelet[2862]: W0312 23:42:45.110375 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.110397 kubelet[2862]: E0312 23:42:45.110385 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.110515 kubelet[2862]: E0312 23:42:45.110506 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.110539 kubelet[2862]: W0312 23:42:45.110515 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.110539 kubelet[2862]: E0312 23:42:45.110522 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.155238 kubelet[2862]: E0312 23:42:45.155180 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.155238 kubelet[2862]: W0312 23:42:45.155226 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.155402 kubelet[2862]: E0312 23:42:45.155263 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.155543 kubelet[2862]: E0312 23:42:45.155530 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.155543 kubelet[2862]: W0312 23:42:45.155541 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.155596 kubelet[2862]: E0312 23:42:45.155549 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.155807 kubelet[2862]: E0312 23:42:45.155768 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.155807 kubelet[2862]: W0312 23:42:45.155790 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.155868 kubelet[2862]: E0312 23:42:45.155806 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.156001 kubelet[2862]: E0312 23:42:45.155979 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.156001 kubelet[2862]: W0312 23:42:45.155991 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.156001 kubelet[2862]: E0312 23:42:45.155999 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.156143 kubelet[2862]: E0312 23:42:45.156133 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.156143 kubelet[2862]: W0312 23:42:45.156143 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.156187 kubelet[2862]: E0312 23:42:45.156150 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.156316 kubelet[2862]: E0312 23:42:45.156306 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.156346 kubelet[2862]: W0312 23:42:45.156316 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.156346 kubelet[2862]: E0312 23:42:45.156324 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.156587 kubelet[2862]: E0312 23:42:45.156551 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.156587 kubelet[2862]: W0312 23:42:45.156574 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.156641 kubelet[2862]: E0312 23:42:45.156587 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.156762 kubelet[2862]: E0312 23:42:45.156750 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.156788 kubelet[2862]: W0312 23:42:45.156762 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.156788 kubelet[2862]: E0312 23:42:45.156770 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.156932 kubelet[2862]: E0312 23:42:45.156921 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.156958 kubelet[2862]: W0312 23:42:45.156931 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.156958 kubelet[2862]: E0312 23:42:45.156939 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.157081 kubelet[2862]: E0312 23:42:45.157070 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.157081 kubelet[2862]: W0312 23:42:45.157080 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.157123 kubelet[2862]: E0312 23:42:45.157087 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.157241 kubelet[2862]: E0312 23:42:45.157230 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.157275 kubelet[2862]: W0312 23:42:45.157240 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.157275 kubelet[2862]: E0312 23:42:45.157248 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.157416 kubelet[2862]: E0312 23:42:45.157402 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.157440 kubelet[2862]: W0312 23:42:45.157415 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.157440 kubelet[2862]: E0312 23:42:45.157425 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.157583 kubelet[2862]: E0312 23:42:45.157573 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.157605 kubelet[2862]: W0312 23:42:45.157583 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.157605 kubelet[2862]: E0312 23:42:45.157591 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.157739 kubelet[2862]: E0312 23:42:45.157728 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.157763 kubelet[2862]: W0312 23:42:45.157740 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.157763 kubelet[2862]: E0312 23:42:45.157747 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.157883 kubelet[2862]: E0312 23:42:45.157873 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.157907 kubelet[2862]: W0312 23:42:45.157883 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.157907 kubelet[2862]: E0312 23:42:45.157891 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.158057 kubelet[2862]: E0312 23:42:45.158044 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.158057 kubelet[2862]: W0312 23:42:45.158054 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.158109 kubelet[2862]: E0312 23:42:45.158062 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.158302 kubelet[2862]: E0312 23:42:45.158286 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.158339 kubelet[2862]: W0312 23:42:45.158301 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.158339 kubelet[2862]: E0312 23:42:45.158312 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.158521 kubelet[2862]: E0312 23:42:45.158507 2862 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 23:42:45.158521 kubelet[2862]: W0312 23:42:45.158520 2862 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 23:42:45.158565 kubelet[2862]: E0312 23:42:45.158530 2862 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 23:42:45.327578 containerd[1626]: time="2026-03-12T23:42:45.327525746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:45.329196 containerd[1626]: time="2026-03-12T23:42:45.329145674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682"
Mar 12 23:42:45.329889 containerd[1626]: time="2026-03-12T23:42:45.329844318Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:45.333391 containerd[1626]: time="2026-03-12T23:42:45.333352095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:45.334075 containerd[1626]: time="2026-03-12T23:42:45.334030858Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.588186748s" Mar 12 23:42:45.334105 containerd[1626]: time="2026-03-12T23:42:45.334090018Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Mar 12 23:42:45.337789 containerd[1626]: time="2026-03-12T23:42:45.337606155Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 12 23:42:45.345081 containerd[1626]: time="2026-03-12T23:42:45.344438428Z" level=info msg="Container 43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:42:45.351379 containerd[1626]: time="2026-03-12T23:42:45.351345342Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633\"" Mar 12 23:42:45.351783 containerd[1626]: time="2026-03-12T23:42:45.351745704Z" level=info msg="StartContainer for \"43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633\"" Mar 12 23:42:45.353455 containerd[1626]: time="2026-03-12T23:42:45.353402152Z" level=info msg="connecting to shim 43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633" address="unix:///run/containerd/s/4fb4d9f1bc3900454a2270c68817154eec9656a1004480fdfd994fa7c5a6f31b" protocol=ttrpc version=3 Mar 12 23:42:45.370517 systemd[1]: Started cri-containerd-43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633.scope - libcontainer container 
43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633. Mar 12 23:42:45.427943 containerd[1626]: time="2026-03-12T23:42:45.427907993Z" level=info msg="StartContainer for \"43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633\" returns successfully" Mar 12 23:42:45.438904 systemd[1]: cri-containerd-43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633.scope: Deactivated successfully. Mar 12 23:42:45.443287 containerd[1626]: time="2026-03-12T23:42:45.442729865Z" level=info msg="received container exit event container_id:\"43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633\" id:\"43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633\" pid:3539 exited_at:{seconds:1773358965 nanos:442310583}" Mar 12 23:42:45.461619 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-43b4fdc22e216abdeda1cd0982ec3b404377b2436eacaf4667f5c3b3ba646633-rootfs.mount: Deactivated successfully. Mar 12 23:42:47.002305 kubelet[2862]: E0312 23:42:47.002185 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:49.002136 kubelet[2862]: E0312 23:42:49.002089 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:50.077903 containerd[1626]: time="2026-03-12T23:42:50.077856489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 12 23:42:51.002369 kubelet[2862]: E0312 23:42:51.002248 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:53.001816 kubelet[2862]: E0312 23:42:53.001769 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:55.001764 kubelet[2862]: E0312 23:42:55.001710 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:57.002302 kubelet[2862]: E0312 23:42:57.002226 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262" Mar 12 23:42:58.557061 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1942251023.mount: Deactivated successfully. 
Mar 12 23:42:58.582786 containerd[1626]: time="2026-03-12T23:42:58.582735744Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:58.583714 containerd[1626]: time="2026-03-12T23:42:58.583649948Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674"
Mar 12 23:42:58.584796 containerd[1626]: time="2026-03-12T23:42:58.584763073Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:58.587140 containerd[1626]: time="2026-03-12T23:42:58.587107085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:42:58.587688 containerd[1626]: time="2026-03-12T23:42:58.587655927Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 8.509759078s"
Mar 12 23:42:58.587921 containerd[1626]: time="2026-03-12T23:42:58.587840608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\""
Mar 12 23:42:58.593345 containerd[1626]: time="2026-03-12T23:42:58.593306555Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 12 23:42:58.603312 containerd[1626]: time="2026-03-12T23:42:58.602843521Z" level=info msg="Container e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:42:58.613042 containerd[1626]: time="2026-03-12T23:42:58.612977330Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45\""
Mar 12 23:42:58.613575 containerd[1626]: time="2026-03-12T23:42:58.613548893Z" level=info msg="StartContainer for \"e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45\""
Mar 12 23:42:58.615709 containerd[1626]: time="2026-03-12T23:42:58.615683704Z" level=info msg="connecting to shim e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45" address="unix:///run/containerd/s/4fb4d9f1bc3900454a2270c68817154eec9656a1004480fdfd994fa7c5a6f31b" protocol=ttrpc version=3
Mar 12 23:42:58.638586 systemd[1]: Started cri-containerd-e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45.scope - libcontainer container e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45.
Mar 12 23:42:58.715838 containerd[1626]: time="2026-03-12T23:42:58.715799428Z" level=info msg="StartContainer for \"e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45\" returns successfully"
Mar 12 23:42:58.816052 systemd[1]: cri-containerd-e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45.scope: Deactivated successfully.
Mar 12 23:42:58.817161 containerd[1626]: time="2026-03-12T23:42:58.817101517Z" level=info msg="received container exit event container_id:\"e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45\" id:\"e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45\" pid:3601 exited_at:{seconds:1773358978 nanos:816938356}"
Mar 12 23:42:58.837684 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e484f2e8bc751a10c9cf909e6e0fbc6540644390cab649c1abd141f660349f45-rootfs.mount: Deactivated successfully.
Mar 12 23:42:59.002331 kubelet[2862]: E0312 23:42:59.002254 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:43:01.002360 kubelet[2862]: E0312 23:43:01.002308 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:43:03.001901 kubelet[2862]: E0312 23:43:03.001688 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:43:03.105609 containerd[1626]: time="2026-03-12T23:43:03.105556646Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 12 23:43:05.001376 kubelet[2862]: E0312 23:43:05.001305 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:43:06.953940 containerd[1626]: time="2026-03-12T23:43:06.953882798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:06.955314 containerd[1626]: time="2026-03-12T23:43:06.955260164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216"
Mar 12 23:43:06.956387 containerd[1626]: time="2026-03-12T23:43:06.956347690Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:06.959052 containerd[1626]: time="2026-03-12T23:43:06.958995502Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:06.959748 containerd[1626]: time="2026-03-12T23:43:06.959718146Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 3.8541217s"
Mar 12 23:43:06.959784 containerd[1626]: time="2026-03-12T23:43:06.959747386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\""
Mar 12 23:43:06.962853 containerd[1626]: time="2026-03-12T23:43:06.962804841Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 12 23:43:06.972183 containerd[1626]: time="2026-03-12T23:43:06.971716203Z" level=info msg="Container 8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:43:06.982074 containerd[1626]: time="2026-03-12T23:43:06.982034853Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55\""
Mar 12 23:43:06.982679 containerd[1626]: time="2026-03-12T23:43:06.982660376Z" level=info msg="StartContainer for \"8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55\""
Mar 12 23:43:06.984378 containerd[1626]: time="2026-03-12T23:43:06.984350384Z" level=info msg="connecting to shim 8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55" address="unix:///run/containerd/s/4fb4d9f1bc3900454a2270c68817154eec9656a1004480fdfd994fa7c5a6f31b" protocol=ttrpc version=3
Mar 12 23:43:07.002437 kubelet[2862]: E0312 23:43:07.002396 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:43:07.011622 systemd[1]: Started cri-containerd-8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55.scope - libcontainer container 8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55.
Mar 12 23:43:07.087572 containerd[1626]: time="2026-03-12T23:43:07.087532720Z" level=info msg="StartContainer for \"8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55\" returns successfully"
Mar 12 23:43:07.853493 kubelet[2862]: I0312 23:43:07.852374 2862 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:43:08.407071 containerd[1626]: time="2026-03-12T23:43:08.407027624Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 12 23:43:08.409703 systemd[1]: cri-containerd-8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55.scope: Deactivated successfully.
Mar 12 23:43:08.410519 systemd[1]: cri-containerd-8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55.scope: Consumed 505ms CPU time, 193.8M memory peak, 171.3M written to disk.
Mar 12 23:43:08.411864 containerd[1626]: time="2026-03-12T23:43:08.411832087Z" level=info msg="received container exit event container_id:\"8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55\" id:\"8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55\" pid:3658 exited_at:{seconds:1773358988 nanos:411647687}"
Mar 12 23:43:08.430787 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8109fd0dec66a3de74d6b74e42e61f3ed00ca8bce4b6c601c263e912b5200b55-rootfs.mount: Deactivated successfully.
Mar 12 23:43:08.511263 kubelet[2862]: I0312 23:43:08.510692 2862 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Mar 12 23:43:09.006418 systemd[1]: Created slice kubepods-besteffort-pod48bac47d_6b7b_4409_8577_545950ed7262.slice - libcontainer container kubepods-besteffort-pod48bac47d_6b7b_4409_8577_545950ed7262.slice.
Mar 12 23:43:09.244838 containerd[1626]: time="2026-03-12T23:43:09.244766852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f2x6w,Uid:48bac47d-6b7b-4409-8577-545950ed7262,Namespace:calico-system,Attempt:0,}"
Mar 12 23:43:09.640821 systemd[1]: Created slice kubepods-burstable-podf1b551ed_be1a_4d83_8897_81bad277f0a5.slice - libcontainer container kubepods-burstable-podf1b551ed_be1a_4d83_8897_81bad277f0a5.slice.
Mar 12 23:43:09.649246 systemd[1]: Created slice kubepods-burstable-podc5ce2203_0c58_4658_847f_ad2515051d13.slice - libcontainer container kubepods-burstable-podc5ce2203_0c58_4658_847f_ad2515051d13.slice.
Mar 12 23:43:09.657251 systemd[1]: Created slice kubepods-besteffort-podb6528280_8831_4c3e_bbd8_e6089ff36a89.slice - libcontainer container kubepods-besteffort-podb6528280_8831_4c3e_bbd8_e6089ff36a89.slice.
Mar 12 23:43:09.726942 kubelet[2862]: I0312 23:43:09.726858 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-backend-key-pair\") pod \"whisker-65485cdcf6-wd67l\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " pod="calico-system/whisker-65485cdcf6-wd67l"
Mar 12 23:43:09.726942 kubelet[2862]: I0312 23:43:09.726924 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5ce2203-0c58-4658-847f-ad2515051d13-config-volume\") pod \"coredns-674b8bbfcf-2k8vr\" (UID: \"c5ce2203-0c58-4658-847f-ad2515051d13\") " pod="kube-system/coredns-674b8bbfcf-2k8vr"
Mar 12 23:43:09.727583 kubelet[2862]: I0312 23:43:09.727000 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-nginx-config\") pod \"whisker-65485cdcf6-wd67l\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " pod="calico-system/whisker-65485cdcf6-wd67l"
Mar 12 23:43:09.727583 kubelet[2862]: I0312 23:43:09.727030 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-ca-bundle\") pod \"whisker-65485cdcf6-wd67l\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " pod="calico-system/whisker-65485cdcf6-wd67l"
Mar 12 23:43:09.727583 kubelet[2862]: I0312 23:43:09.727059 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq76l\" (UniqueName: \"kubernetes.io/projected/b6528280-8831-4c3e-bbd8-e6089ff36a89-kube-api-access-kq76l\") pod \"whisker-65485cdcf6-wd67l\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " pod="calico-system/whisker-65485cdcf6-wd67l"
Mar 12 23:43:09.727583 kubelet[2862]: I0312 23:43:09.727143 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1b551ed-be1a-4d83-8897-81bad277f0a5-config-volume\") pod \"coredns-674b8bbfcf-579rg\" (UID: \"f1b551ed-be1a-4d83-8897-81bad277f0a5\") " pod="kube-system/coredns-674b8bbfcf-579rg"
Mar 12 23:43:09.727583 kubelet[2862]: I0312 23:43:09.727202 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnn76\" (UniqueName: \"kubernetes.io/projected/c5ce2203-0c58-4658-847f-ad2515051d13-kube-api-access-vnn76\") pod \"coredns-674b8bbfcf-2k8vr\" (UID: \"c5ce2203-0c58-4658-847f-ad2515051d13\") " pod="kube-system/coredns-674b8bbfcf-2k8vr"
Mar 12 23:43:09.727697 kubelet[2862]: I0312 23:43:09.727323 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gkl\" (UniqueName: \"kubernetes.io/projected/f1b551ed-be1a-4d83-8897-81bad277f0a5-kube-api-access-v9gkl\") pod \"coredns-674b8bbfcf-579rg\" (UID: \"f1b551ed-be1a-4d83-8897-81bad277f0a5\") " pod="kube-system/coredns-674b8bbfcf-579rg"
Mar 12 23:43:09.761218 systemd[1]: Created slice kubepods-besteffort-podb9aa069c_90dc_42a9_86a8_579388459807.slice - libcontainer container kubepods-besteffort-podb9aa069c_90dc_42a9_86a8_579388459807.slice.
Mar 12 23:43:09.828739 kubelet[2862]: I0312 23:43:09.828591 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b9aa069c-90dc-42a9-86a8-579388459807-calico-apiserver-certs\") pod \"calico-apiserver-5c57dc4894-2g59f\" (UID: \"b9aa069c-90dc-42a9-86a8-579388459807\") " pod="calico-system/calico-apiserver-5c57dc4894-2g59f"
Mar 12 23:43:09.828739 kubelet[2862]: I0312 23:43:09.828673 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnsfk\" (UniqueName: \"kubernetes.io/projected/b9aa069c-90dc-42a9-86a8-579388459807-kube-api-access-mnsfk\") pod \"calico-apiserver-5c57dc4894-2g59f\" (UID: \"b9aa069c-90dc-42a9-86a8-579388459807\") " pod="calico-system/calico-apiserver-5c57dc4894-2g59f"
Mar 12 23:43:09.898726 systemd[1]: Created slice kubepods-besteffort-pod8044de53_0cc8_430b_8aec_3b24194fa940.slice - libcontainer container kubepods-besteffort-pod8044de53_0cc8_430b_8aec_3b24194fa940.slice.
Mar 12 23:43:09.906721 systemd[1]: Created slice kubepods-besteffort-pod447005ab_12f7_4c87_b94d_836d35cb7afb.slice - libcontainer container kubepods-besteffort-pod447005ab_12f7_4c87_b94d_836d35cb7afb.slice.
Mar 12 23:43:09.914851 systemd[1]: Created slice kubepods-besteffort-pod0d9f6974_521e_4d9d_a269_abe3fcc76140.slice - libcontainer container kubepods-besteffort-pod0d9f6974_521e_4d9d_a269_abe3fcc76140.slice.
Mar 12 23:43:09.951904 containerd[1626]: time="2026-03-12T23:43:09.951827932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-579rg,Uid:f1b551ed-be1a-4d83-8897-81bad277f0a5,Namespace:kube-system,Attempt:0,}"
Mar 12 23:43:09.955623 containerd[1626]: time="2026-03-12T23:43:09.955537189Z" level=error msg="Failed to destroy network for sandbox \"0b842721f9da84238fcb51ed99ff6b633ca0aaf6ff4a3b6efd867850eed5c398\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:09.956599 containerd[1626]: time="2026-03-12T23:43:09.956574394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2k8vr,Uid:c5ce2203-0c58-4658-847f-ad2515051d13,Namespace:kube-system,Attempt:0,}"
Mar 12 23:43:09.958960 containerd[1626]: time="2026-03-12T23:43:09.958869205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f2x6w,Uid:48bac47d-6b7b-4409-8577-545950ed7262,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b842721f9da84238fcb51ed99ff6b633ca0aaf6ff4a3b6efd867850eed5c398\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:09.959216 kubelet[2862]: E0312 23:43:09.959151 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b842721f9da84238fcb51ed99ff6b633ca0aaf6ff4a3b6efd867850eed5c398\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:09.959960 kubelet[2862]: E0312 23:43:09.959465 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b842721f9da84238fcb51ed99ff6b633ca0aaf6ff4a3b6efd867850eed5c398\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:43:09.959960 kubelet[2862]: E0312 23:43:09.959493 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b842721f9da84238fcb51ed99ff6b633ca0aaf6ff4a3b6efd867850eed5c398\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-f2x6w"
Mar 12 23:43:09.959960 kubelet[2862]: E0312 23:43:09.959545 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-f2x6w_calico-system(48bac47d-6b7b-4409-8577-545950ed7262)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-f2x6w_calico-system(48bac47d-6b7b-4409-8577-545950ed7262)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b842721f9da84238fcb51ed99ff6b633ca0aaf6ff4a3b6efd867850eed5c398\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-f2x6w" podUID="48bac47d-6b7b-4409-8577-545950ed7262"
Mar 12 23:43:09.960647 containerd[1626]: time="2026-03-12T23:43:09.959786050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65485cdcf6-wd67l,Uid:b6528280-8831-4c3e-bbd8-e6089ff36a89,Namespace:calico-system,Attempt:0,}"
Mar 12 23:43:10.016972 containerd[1626]: time="2026-03-12T23:43:10.016901204Z" level=error msg="Failed to destroy network for sandbox \"1d1757f83562ee7f951d61b356017b7defc9dcddf93b70b585032f3cbff47308\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.016972 containerd[1626]: time="2026-03-12T23:43:10.016929965Z" level=error msg="Failed to destroy network for sandbox \"e75ddfd6413d5ce27b80f32e9c70759be8dae396e1ce716abdd727b0d2445d41\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.018686 containerd[1626]: time="2026-03-12T23:43:10.018648773Z" level=error msg="Failed to destroy network for sandbox \"c9624e683b68cc00301299373a6b420eabe864fc5744202c85893327931f1238\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.020122 containerd[1626]: time="2026-03-12T23:43:10.020025619Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65485cdcf6-wd67l,Uid:b6528280-8831-4c3e-bbd8-e6089ff36a89,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d1757f83562ee7f951d61b356017b7defc9dcddf93b70b585032f3cbff47308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.020310 kubelet[2862]: E0312 23:43:10.020247 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d1757f83562ee7f951d61b356017b7defc9dcddf93b70b585032f3cbff47308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.020397 kubelet[2862]: E0312 23:43:10.020355 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d1757f83562ee7f951d61b356017b7defc9dcddf93b70b585032f3cbff47308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65485cdcf6-wd67l"
Mar 12 23:43:10.020460 kubelet[2862]: E0312 23:43:10.020422 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d1757f83562ee7f951d61b356017b7defc9dcddf93b70b585032f3cbff47308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65485cdcf6-wd67l"
Mar 12 23:43:10.020808 kubelet[2862]: E0312 23:43:10.020534 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65485cdcf6-wd67l_calico-system(b6528280-8831-4c3e-bbd8-e6089ff36a89)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65485cdcf6-wd67l_calico-system(b6528280-8831-4c3e-bbd8-e6089ff36a89)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d1757f83562ee7f951d61b356017b7defc9dcddf93b70b585032f3cbff47308\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65485cdcf6-wd67l" podUID="b6528280-8831-4c3e-bbd8-e6089ff36a89"
Mar 12 23:43:10.021451 containerd[1626]: time="2026-03-12T23:43:10.021230745Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-579rg,Uid:f1b551ed-be1a-4d83-8897-81bad277f0a5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e75ddfd6413d5ce27b80f32e9c70759be8dae396e1ce716abdd727b0d2445d41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.021884 kubelet[2862]: E0312 23:43:10.021647 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e75ddfd6413d5ce27b80f32e9c70759be8dae396e1ce716abdd727b0d2445d41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.021884 kubelet[2862]: E0312 23:43:10.021681 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e75ddfd6413d5ce27b80f32e9c70759be8dae396e1ce716abdd727b0d2445d41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-579rg"
Mar 12 23:43:10.021884 kubelet[2862]: E0312 23:43:10.021696 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e75ddfd6413d5ce27b80f32e9c70759be8dae396e1ce716abdd727b0d2445d41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-579rg"
Mar 12 23:43:10.021989 kubelet[2862]: E0312 23:43:10.021726 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-579rg_kube-system(f1b551ed-be1a-4d83-8897-81bad277f0a5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-579rg_kube-system(f1b551ed-be1a-4d83-8897-81bad277f0a5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e75ddfd6413d5ce27b80f32e9c70759be8dae396e1ce716abdd727b0d2445d41\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-579rg" podUID="f1b551ed-be1a-4d83-8897-81bad277f0a5"
Mar 12 23:43:10.023096 containerd[1626]: time="2026-03-12T23:43:10.023043154Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2k8vr,Uid:c5ce2203-0c58-4658-847f-ad2515051d13,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9624e683b68cc00301299373a6b420eabe864fc5744202c85893327931f1238\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.023385 kubelet[2862]: E0312 23:43:10.023221 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9624e683b68cc00301299373a6b420eabe864fc5744202c85893327931f1238\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 23:43:10.023385 kubelet[2862]: E0312 23:43:10.023256 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9624e683b68cc00301299373a6b420eabe864fc5744202c85893327931f1238\":
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2k8vr" Mar 12 23:43:10.023385 kubelet[2862]: E0312 23:43:10.023285 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9624e683b68cc00301299373a6b420eabe864fc5744202c85893327931f1238\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2k8vr" Mar 12 23:43:10.023487 kubelet[2862]: E0312 23:43:10.023324 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-2k8vr_kube-system(c5ce2203-0c58-4658-847f-ad2515051d13)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-2k8vr_kube-system(c5ce2203-0c58-4658-847f-ad2515051d13)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c9624e683b68cc00301299373a6b420eabe864fc5744202c85893327931f1238\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-2k8vr" podUID="c5ce2203-0c58-4658-847f-ad2515051d13" Mar 12 23:43:10.031075 kubelet[2862]: I0312 23:43:10.030993 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsfm\" (UniqueName: \"kubernetes.io/projected/8044de53-0cc8-430b-8aec-3b24194fa940-kube-api-access-wvsfm\") pod \"calico-apiserver-5c57dc4894-t58wb\" (UID: \"8044de53-0cc8-430b-8aec-3b24194fa940\") " pod="calico-system/calico-apiserver-5c57dc4894-t58wb" Mar 12 23:43:10.031075 kubelet[2862]: I0312 
23:43:10.031038 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/447005ab-12f7-4c87-b94d-836d35cb7afb-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-4t7tf\" (UID: \"447005ab-12f7-4c87-b94d-836d35cb7afb\") " pod="calico-system/goldmane-5b85766d88-4t7tf" Mar 12 23:43:10.031202 kubelet[2862]: I0312 23:43:10.031140 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/447005ab-12f7-4c87-b94d-836d35cb7afb-goldmane-key-pair\") pod \"goldmane-5b85766d88-4t7tf\" (UID: \"447005ab-12f7-4c87-b94d-836d35cb7afb\") " pod="calico-system/goldmane-5b85766d88-4t7tf" Mar 12 23:43:10.031310 kubelet[2862]: I0312 23:43:10.031242 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447005ab-12f7-4c87-b94d-836d35cb7afb-config\") pod \"goldmane-5b85766d88-4t7tf\" (UID: \"447005ab-12f7-4c87-b94d-836d35cb7afb\") " pod="calico-system/goldmane-5b85766d88-4t7tf" Mar 12 23:43:10.031422 kubelet[2862]: I0312 23:43:10.031391 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d9f6974-521e-4d9d-a269-abe3fcc76140-tigera-ca-bundle\") pod \"calico-kube-controllers-7746b7cf7f-wsv4z\" (UID: \"0d9f6974-521e-4d9d-a269-abe3fcc76140\") " pod="calico-system/calico-kube-controllers-7746b7cf7f-wsv4z" Mar 12 23:43:10.031506 kubelet[2862]: I0312 23:43:10.031462 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q4mz\" (UniqueName: \"kubernetes.io/projected/0d9f6974-521e-4d9d-a269-abe3fcc76140-kube-api-access-8q4mz\") pod \"calico-kube-controllers-7746b7cf7f-wsv4z\" (UID: \"0d9f6974-521e-4d9d-a269-abe3fcc76140\") " 
pod="calico-system/calico-kube-controllers-7746b7cf7f-wsv4z" Mar 12 23:43:10.031580 kubelet[2862]: I0312 23:43:10.031549 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8044de53-0cc8-430b-8aec-3b24194fa940-calico-apiserver-certs\") pod \"calico-apiserver-5c57dc4894-t58wb\" (UID: \"8044de53-0cc8-430b-8aec-3b24194fa940\") " pod="calico-system/calico-apiserver-5c57dc4894-t58wb" Mar 12 23:43:10.031644 kubelet[2862]: I0312 23:43:10.031610 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5djpl\" (UniqueName: \"kubernetes.io/projected/447005ab-12f7-4c87-b94d-836d35cb7afb-kube-api-access-5djpl\") pod \"goldmane-5b85766d88-4t7tf\" (UID: \"447005ab-12f7-4c87-b94d-836d35cb7afb\") " pod="calico-system/goldmane-5b85766d88-4t7tf" Mar 12 23:43:10.064081 containerd[1626]: time="2026-03-12T23:43:10.064004191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-2g59f,Uid:b9aa069c-90dc-42a9-86a8-579388459807,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:10.108515 containerd[1626]: time="2026-03-12T23:43:10.108472045Z" level=error msg="Failed to destroy network for sandbox \"960b44d8f54c4fc2734ad7a9a60c0f4cd28fdec3a8c07b6f9364211dc03a7661\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.110516 containerd[1626]: time="2026-03-12T23:43:10.110471854Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-2g59f,Uid:b9aa069c-90dc-42a9-86a8-579388459807,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"960b44d8f54c4fc2734ad7a9a60c0f4cd28fdec3a8c07b6f9364211dc03a7661\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.110751 kubelet[2862]: E0312 23:43:10.110712 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"960b44d8f54c4fc2734ad7a9a60c0f4cd28fdec3a8c07b6f9364211dc03a7661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.110802 kubelet[2862]: E0312 23:43:10.110772 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"960b44d8f54c4fc2734ad7a9a60c0f4cd28fdec3a8c07b6f9364211dc03a7661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5c57dc4894-2g59f" Mar 12 23:43:10.110802 kubelet[2862]: E0312 23:43:10.110791 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"960b44d8f54c4fc2734ad7a9a60c0f4cd28fdec3a8c07b6f9364211dc03a7661\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5c57dc4894-2g59f" Mar 12 23:43:10.110873 kubelet[2862]: E0312 23:43:10.110836 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c57dc4894-2g59f_calico-system(b9aa069c-90dc-42a9-86a8-579388459807)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c57dc4894-2g59f_calico-system(b9aa069c-90dc-42a9-86a8-579388459807)\\\": rpc 
error: code = Unknown desc = failed to setup network for sandbox \\\"960b44d8f54c4fc2734ad7a9a60c0f4cd28fdec3a8c07b6f9364211dc03a7661\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5c57dc4894-2g59f" podUID="b9aa069c-90dc-42a9-86a8-579388459807" Mar 12 23:43:10.147959 containerd[1626]: time="2026-03-12T23:43:10.147601793Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 12 23:43:10.160691 containerd[1626]: time="2026-03-12T23:43:10.160596735Z" level=info msg="Container b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:10.173156 containerd[1626]: time="2026-03-12T23:43:10.173112755Z" level=info msg="CreateContainer within sandbox \"1e153a138eb87b5b6a06eefe5db395ff5327da61dbd40b55dbe103b46d107e0d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189\"" Mar 12 23:43:10.173967 containerd[1626]: time="2026-03-12T23:43:10.173919719Z" level=info msg="StartContainer for \"b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189\"" Mar 12 23:43:10.175583 containerd[1626]: time="2026-03-12T23:43:10.175555767Z" level=info msg="connecting to shim b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189" address="unix:///run/containerd/s/4fb4d9f1bc3900454a2270c68817154eec9656a1004480fdfd994fa7c5a6f31b" protocol=ttrpc version=3 Mar 12 23:43:10.201594 systemd[1]: Started cri-containerd-b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189.scope - libcontainer container b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189. 
Mar 12 23:43:10.202661 containerd[1626]: time="2026-03-12T23:43:10.202614697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-t58wb,Uid:8044de53-0cc8-430b-8aec-3b24194fa940,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:10.212511 containerd[1626]: time="2026-03-12T23:43:10.212479785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-4t7tf,Uid:447005ab-12f7-4c87-b94d-836d35cb7afb,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:10.219491 containerd[1626]: time="2026-03-12T23:43:10.219443298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7746b7cf7f-wsv4z,Uid:0d9f6974-521e-4d9d-a269-abe3fcc76140,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:10.264726 containerd[1626]: time="2026-03-12T23:43:10.264672476Z" level=error msg="Failed to destroy network for sandbox \"de044da7570ba35c69720be5f15cd5ff75a98e853a048c0eba9bb7086b080118\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.266132 containerd[1626]: time="2026-03-12T23:43:10.266083362Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-t58wb,Uid:8044de53-0cc8-430b-8aec-3b24194fa940,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"de044da7570ba35c69720be5f15cd5ff75a98e853a048c0eba9bb7086b080118\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.266427 kubelet[2862]: E0312 23:43:10.266385 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de044da7570ba35c69720be5f15cd5ff75a98e853a048c0eba9bb7086b080118\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.266489 kubelet[2862]: E0312 23:43:10.266447 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de044da7570ba35c69720be5f15cd5ff75a98e853a048c0eba9bb7086b080118\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5c57dc4894-t58wb" Mar 12 23:43:10.266489 kubelet[2862]: E0312 23:43:10.266466 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de044da7570ba35c69720be5f15cd5ff75a98e853a048c0eba9bb7086b080118\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5c57dc4894-t58wb" Mar 12 23:43:10.266570 kubelet[2862]: E0312 23:43:10.266537 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c57dc4894-t58wb_calico-system(8044de53-0cc8-430b-8aec-3b24194fa940)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c57dc4894-t58wb_calico-system(8044de53-0cc8-430b-8aec-3b24194fa940)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de044da7570ba35c69720be5f15cd5ff75a98e853a048c0eba9bb7086b080118\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5c57dc4894-t58wb" 
podUID="8044de53-0cc8-430b-8aec-3b24194fa940" Mar 12 23:43:10.275571 containerd[1626]: time="2026-03-12T23:43:10.275527128Z" level=info msg="StartContainer for \"b776e39ac560500b9c99e988cfd3f9391bec7fa65ebc5ed56fe3c9f2759df189\" returns successfully" Mar 12 23:43:10.282247 containerd[1626]: time="2026-03-12T23:43:10.282198440Z" level=error msg="Failed to destroy network for sandbox \"79f1687a09715db72f4a54d6addb4d425e46e7ed47b5339e5565a51a30d24ad8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.284498 containerd[1626]: time="2026-03-12T23:43:10.284427451Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-4t7tf,Uid:447005ab-12f7-4c87-b94d-836d35cb7afb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f1687a09715db72f4a54d6addb4d425e46e7ed47b5339e5565a51a30d24ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.284662 kubelet[2862]: E0312 23:43:10.284633 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f1687a09715db72f4a54d6addb4d425e46e7ed47b5339e5565a51a30d24ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.284709 kubelet[2862]: E0312 23:43:10.284682 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f1687a09715db72f4a54d6addb4d425e46e7ed47b5339e5565a51a30d24ad8\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-4t7tf" Mar 12 23:43:10.284743 kubelet[2862]: E0312 23:43:10.284710 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f1687a09715db72f4a54d6addb4d425e46e7ed47b5339e5565a51a30d24ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-4t7tf" Mar 12 23:43:10.284792 kubelet[2862]: E0312 23:43:10.284754 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-4t7tf_calico-system(447005ab-12f7-4c87-b94d-836d35cb7afb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-4t7tf_calico-system(447005ab-12f7-4c87-b94d-836d35cb7afb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"79f1687a09715db72f4a54d6addb4d425e46e7ed47b5339e5565a51a30d24ad8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-4t7tf" podUID="447005ab-12f7-4c87-b94d-836d35cb7afb" Mar 12 23:43:10.286434 containerd[1626]: time="2026-03-12T23:43:10.286376660Z" level=error msg="Failed to destroy network for sandbox \"505847ea46eb7ab9247faf99afb9098ffef7e8fda237af1c55c261c844ec0501\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.287917 containerd[1626]: time="2026-03-12T23:43:10.287863627Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7746b7cf7f-wsv4z,Uid:0d9f6974-521e-4d9d-a269-abe3fcc76140,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"505847ea46eb7ab9247faf99afb9098ffef7e8fda237af1c55c261c844ec0501\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.288494 kubelet[2862]: E0312 23:43:10.288421 2862 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"505847ea46eb7ab9247faf99afb9098ffef7e8fda237af1c55c261c844ec0501\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 23:43:10.288992 kubelet[2862]: E0312 23:43:10.288589 2862 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"505847ea46eb7ab9247faf99afb9098ffef7e8fda237af1c55c261c844ec0501\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7746b7cf7f-wsv4z" Mar 12 23:43:10.288992 kubelet[2862]: E0312 23:43:10.288616 2862 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"505847ea46eb7ab9247faf99afb9098ffef7e8fda237af1c55c261c844ec0501\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7746b7cf7f-wsv4z" Mar 12 23:43:10.288992 kubelet[2862]: E0312 
23:43:10.288661 2862 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7746b7cf7f-wsv4z_calico-system(0d9f6974-521e-4d9d-a269-abe3fcc76140)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7746b7cf7f-wsv4z_calico-system(0d9f6974-521e-4d9d-a269-abe3fcc76140)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"505847ea46eb7ab9247faf99afb9098ffef7e8fda237af1c55c261c844ec0501\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7746b7cf7f-wsv4z" podUID="0d9f6974-521e-4d9d-a269-abe3fcc76140" Mar 12 23:43:10.535632 kubelet[2862]: I0312 23:43:10.535527 2862 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-nginx-config\") pod \"b6528280-8831-4c3e-bbd8-e6089ff36a89\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " Mar 12 23:43:10.535632 kubelet[2862]: I0312 23:43:10.535570 2862 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq76l\" (UniqueName: \"kubernetes.io/projected/b6528280-8831-4c3e-bbd8-e6089ff36a89-kube-api-access-kq76l\") pod \"b6528280-8831-4c3e-bbd8-e6089ff36a89\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " Mar 12 23:43:10.535632 kubelet[2862]: I0312 23:43:10.535601 2862 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-backend-key-pair\") pod \"b6528280-8831-4c3e-bbd8-e6089ff36a89\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " Mar 12 23:43:10.535632 kubelet[2862]: I0312 23:43:10.535618 2862 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-ca-bundle\") pod \"b6528280-8831-4c3e-bbd8-e6089ff36a89\" (UID: \"b6528280-8831-4c3e-bbd8-e6089ff36a89\") " Mar 12 23:43:10.536151 kubelet[2862]: I0312 23:43:10.535996 2862 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "b6528280-8831-4c3e-bbd8-e6089ff36a89" (UID: "b6528280-8831-4c3e-bbd8-e6089ff36a89"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:43:10.536209 kubelet[2862]: I0312 23:43:10.536177 2862 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "b6528280-8831-4c3e-bbd8-e6089ff36a89" (UID: "b6528280-8831-4c3e-bbd8-e6089ff36a89"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 23:43:10.538801 kubelet[2862]: I0312 23:43:10.538743 2862 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6528280-8831-4c3e-bbd8-e6089ff36a89-kube-api-access-kq76l" (OuterVolumeSpecName: "kube-api-access-kq76l") pod "b6528280-8831-4c3e-bbd8-e6089ff36a89" (UID: "b6528280-8831-4c3e-bbd8-e6089ff36a89"). InnerVolumeSpecName "kube-api-access-kq76l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 23:43:10.539236 kubelet[2862]: I0312 23:43:10.539190 2862 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "b6528280-8831-4c3e-bbd8-e6089ff36a89" (UID: "b6528280-8831-4c3e-bbd8-e6089ff36a89"). 
InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 23:43:10.636253 kubelet[2862]: I0312 23:43:10.636173 2862 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-27aefdfc79\" DevicePath \"\"" Mar 12 23:43:10.636253 kubelet[2862]: I0312 23:43:10.636211 2862 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-whisker-ca-bundle\") on node \"ci-4459-2-4-n-27aefdfc79\" DevicePath \"\"" Mar 12 23:43:10.636253 kubelet[2862]: I0312 23:43:10.636221 2862 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b6528280-8831-4c3e-bbd8-e6089ff36a89-nginx-config\") on node \"ci-4459-2-4-n-27aefdfc79\" DevicePath \"\"" Mar 12 23:43:10.636253 kubelet[2862]: I0312 23:43:10.636229 2862 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kq76l\" (UniqueName: \"kubernetes.io/projected/b6528280-8831-4c3e-bbd8-e6089ff36a89-kube-api-access-kq76l\") on node \"ci-4459-2-4-n-27aefdfc79\" DevicePath \"\"" Mar 12 23:43:10.840333 systemd[1]: run-netns-cni\x2ddc167e2a\x2db2fe\x2d3e0a\x2dbb11\x2d25c7eac2168e.mount: Deactivated successfully. Mar 12 23:43:10.840420 systemd[1]: var-lib-kubelet-pods-b6528280\x2d8831\x2d4c3e\x2dbbd8\x2de6089ff36a89-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkq76l.mount: Deactivated successfully. Mar 12 23:43:10.840469 systemd[1]: var-lib-kubelet-pods-b6528280\x2d8831\x2d4c3e\x2dbbd8\x2de6089ff36a89-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 12 23:43:11.139544 systemd[1]: Removed slice kubepods-besteffort-podb6528280_8831_4c3e_bbd8_e6089ff36a89.slice - libcontainer container kubepods-besteffort-podb6528280_8831_4c3e_bbd8_e6089ff36a89.slice. Mar 12 23:43:11.155947 kubelet[2862]: I0312 23:43:11.155885 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hr4b9" podStartSLOduration=4.817260488 podStartE2EDuration="30.155869559s" podCreationTimestamp="2026-03-12 23:42:41 +0000 UTC" firstStartedPulling="2026-03-12 23:42:41.621697838 +0000 UTC m=+21.835581983" lastFinishedPulling="2026-03-12 23:43:06.960306909 +0000 UTC m=+47.174191054" observedRunningTime="2026-03-12 23:43:11.155488077 +0000 UTC m=+51.369372222" watchObservedRunningTime="2026-03-12 23:43:11.155869559 +0000 UTC m=+51.369753704" Mar 12 23:43:11.219645 systemd[1]: Created slice kubepods-besteffort-poda3f1a6f3_d1cd_408f_9c36_5940b9d68410.slice - libcontainer container kubepods-besteffort-poda3f1a6f3_d1cd_408f_9c36_5940b9d68410.slice. 
Mar 12 23:43:11.341000 kubelet[2862]: I0312 23:43:11.340926 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a3f1a6f3-d1cd-408f-9c36-5940b9d68410-nginx-config\") pod \"whisker-7549b86665-qcgm9\" (UID: \"a3f1a6f3-d1cd-408f-9c36-5940b9d68410\") " pod="calico-system/whisker-7549b86665-qcgm9" Mar 12 23:43:11.341148 kubelet[2862]: I0312 23:43:11.341050 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a3f1a6f3-d1cd-408f-9c36-5940b9d68410-whisker-backend-key-pair\") pod \"whisker-7549b86665-qcgm9\" (UID: \"a3f1a6f3-d1cd-408f-9c36-5940b9d68410\") " pod="calico-system/whisker-7549b86665-qcgm9" Mar 12 23:43:11.341148 kubelet[2862]: I0312 23:43:11.341128 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f1a6f3-d1cd-408f-9c36-5940b9d68410-whisker-ca-bundle\") pod \"whisker-7549b86665-qcgm9\" (UID: \"a3f1a6f3-d1cd-408f-9c36-5940b9d68410\") " pod="calico-system/whisker-7549b86665-qcgm9" Mar 12 23:43:11.341259 kubelet[2862]: I0312 23:43:11.341203 2862 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fz4s\" (UniqueName: \"kubernetes.io/projected/a3f1a6f3-d1cd-408f-9c36-5940b9d68410-kube-api-access-5fz4s\") pod \"whisker-7549b86665-qcgm9\" (UID: \"a3f1a6f3-d1cd-408f-9c36-5940b9d68410\") " pod="calico-system/whisker-7549b86665-qcgm9" Mar 12 23:43:11.522802 containerd[1626]: time="2026-03-12T23:43:11.522686939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7549b86665-qcgm9,Uid:a3f1a6f3-d1cd-408f-9c36-5940b9d68410,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:11.692075 systemd-networkd[1428]: caliac86fbb0833: Link UP Mar 12 23:43:11.692589 systemd-networkd[1428]: 
caliac86fbb0833: Gained carrier Mar 12 23:43:11.709517 containerd[1626]: 2026-03-12 23:43:11.546 [ERROR][4057] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 23:43:11.709517 containerd[1626]: 2026-03-12 23:43:11.570 [INFO][4057] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0 whisker-7549b86665- calico-system a3f1a6f3-d1cd-408f-9c36-5940b9d68410 957 0 2026-03-12 23:43:11 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7549b86665 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 whisker-7549b86665-qcgm9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliac86fbb0833 [] [] }} ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-" Mar 12 23:43:11.709517 containerd[1626]: 2026-03-12 23:43:11.570 [INFO][4057] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 23:43:11.709517 containerd[1626]: 2026-03-12 23:43:11.625 [INFO][4113] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" HandleID="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Workload="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 
23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.636 [INFO][4113] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" HandleID="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Workload="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000378c00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"whisker-7549b86665-qcgm9", "timestamp":"2026-03-12 23:43:11.62582092 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003851e0)} Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.636 [INFO][4113] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.636 [INFO][4113] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.636 [INFO][4113] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.642 [INFO][4113] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.647 [INFO][4113] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.655 [INFO][4113] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.657 [INFO][4113] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709730 containerd[1626]: 2026-03-12 23:43:11.660 [INFO][4113] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.660 [INFO][4113] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.662 [INFO][4113] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46 Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.667 [INFO][4113] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.673 [INFO][4113] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.36.1/26] block=192.168.36.0/26 handle="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.673 [INFO][4113] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.1/26] handle="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.673 [INFO][4113] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:11.709909 containerd[1626]: 2026-03-12 23:43:11.673 [INFO][4113] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.1/26] IPv6=[] ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" HandleID="k8s-pod-network.7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Workload="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 23:43:11.710028 containerd[1626]: 2026-03-12 23:43:11.678 [INFO][4057] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0", GenerateName:"whisker-7549b86665-", Namespace:"calico-system", SelfLink:"", UID:"a3f1a6f3-d1cd-408f-9c36-5940b9d68410", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7549b86665", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"whisker-7549b86665-qcgm9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.36.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliac86fbb0833", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:11.710028 containerd[1626]: 2026-03-12 23:43:11.678 [INFO][4057] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.1/32] ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 23:43:11.710095 containerd[1626]: 2026-03-12 23:43:11.679 [INFO][4057] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac86fbb0833 ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 23:43:11.710095 containerd[1626]: 2026-03-12 23:43:11.691 [INFO][4057] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 23:43:11.710147 containerd[1626]: 2026-03-12 23:43:11.692 [INFO][4057] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0", GenerateName:"whisker-7549b86665-", Namespace:"calico-system", SelfLink:"", UID:"a3f1a6f3-d1cd-408f-9c36-5940b9d68410", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 43, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7549b86665", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46", Pod:"whisker-7549b86665-qcgm9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.36.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliac86fbb0833", MAC:"ea:73:ff:40:93:c3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:11.710197 containerd[1626]: 2026-03-12 23:43:11.703 [INFO][4057] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" Namespace="calico-system" Pod="whisker-7549b86665-qcgm9" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-whisker--7549b86665--qcgm9-eth0" Mar 12 23:43:11.740782 containerd[1626]: time="2026-03-12T23:43:11.740723438Z" level=info msg="connecting to shim 7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46" address="unix:///run/containerd/s/1b56dd6fd6f8f11d3bacd8020cafa4b602b9206e5ca6bf1305132cb13d5110d1" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:11.773454 systemd[1]: Started cri-containerd-7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46.scope - libcontainer container 7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46. Mar 12 23:43:11.816523 containerd[1626]: time="2026-03-12T23:43:11.816476366Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7549b86665-qcgm9,Uid:a3f1a6f3-d1cd-408f-9c36-5940b9d68410,Namespace:calico-system,Attempt:0,} returns sandbox id \"7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46\"" Mar 12 23:43:11.819172 containerd[1626]: time="2026-03-12T23:43:11.818848897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 12 23:43:12.008608 kubelet[2862]: I0312 23:43:12.008568 2862 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6528280-8831-4c3e-bbd8-e6089ff36a89" path="/var/lib/kubelet/pods/b6528280-8831-4c3e-bbd8-e6089ff36a89/volumes" Mar 12 23:43:12.215746 systemd-networkd[1428]: vxlan.calico: Link UP Mar 12 23:43:12.215753 systemd-networkd[1428]: vxlan.calico: Gained carrier Mar 12 23:43:13.266245 systemd-networkd[1428]: caliac86fbb0833: Gained IPv6LL Mar 12 23:43:13.393417 systemd-networkd[1428]: vxlan.calico: Gained IPv6LL Mar 12 23:43:13.472251 containerd[1626]: time="2026-03-12T23:43:13.472204603Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Mar 12 23:43:13.474422 containerd[1626]: time="2026-03-12T23:43:13.474383014Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Mar 12 23:43:13.475213 containerd[1626]: time="2026-03-12T23:43:13.475176898Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:13.477604 containerd[1626]: time="2026-03-12T23:43:13.477556149Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:13.478244 containerd[1626]: time="2026-03-12T23:43:13.478207592Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.659325935s" Mar 12 23:43:13.478296 containerd[1626]: time="2026-03-12T23:43:13.478250953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Mar 12 23:43:13.482218 containerd[1626]: time="2026-03-12T23:43:13.482187532Z" level=info msg="CreateContainer within sandbox \"7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 12 23:43:13.488217 containerd[1626]: time="2026-03-12T23:43:13.488103760Z" level=info msg="Container 48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:13.496511 containerd[1626]: time="2026-03-12T23:43:13.496419521Z" 
level=info msg="CreateContainer within sandbox \"7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163\"" Mar 12 23:43:13.497243 containerd[1626]: time="2026-03-12T23:43:13.497216685Z" level=info msg="StartContainer for \"48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163\"" Mar 12 23:43:13.498344 containerd[1626]: time="2026-03-12T23:43:13.498309970Z" level=info msg="connecting to shim 48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163" address="unix:///run/containerd/s/1b56dd6fd6f8f11d3bacd8020cafa4b602b9206e5ca6bf1305132cb13d5110d1" protocol=ttrpc version=3 Mar 12 23:43:13.524572 systemd[1]: Started cri-containerd-48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163.scope - libcontainer container 48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163. Mar 12 23:43:13.559078 containerd[1626]: time="2026-03-12T23:43:13.558922744Z" level=info msg="StartContainer for \"48972ec6281e0d3cf4371ffcec845d2481760755bd16f54f546ca5496b5d6163\" returns successfully" Mar 12 23:43:13.559983 containerd[1626]: time="2026-03-12T23:43:13.559949829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 12 23:43:15.732694 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2218074913.mount: Deactivated successfully. 
Mar 12 23:43:15.755747 containerd[1626]: time="2026-03-12T23:43:15.755695995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:15.756983 containerd[1626]: time="2026-03-12T23:43:15.756959081Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Mar 12 23:43:15.758230 containerd[1626]: time="2026-03-12T23:43:15.758174007Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:15.760282 containerd[1626]: time="2026-03-12T23:43:15.760224057Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:15.760987 containerd[1626]: time="2026-03-12T23:43:15.760858140Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.200879151s" Mar 12 23:43:15.760987 containerd[1626]: time="2026-03-12T23:43:15.760890660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Mar 12 23:43:15.766074 containerd[1626]: time="2026-03-12T23:43:15.766037764Z" level=info msg="CreateContainer within sandbox \"7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 12 23:43:15.773920 
containerd[1626]: time="2026-03-12T23:43:15.773475720Z" level=info msg="Container 279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:15.781587 containerd[1626]: time="2026-03-12T23:43:15.781553678Z" level=info msg="CreateContainer within sandbox \"7108151296cd4187a52612dc0c08f2f726bcd7c33e119a1c9bda3c7d65e0ed46\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435\"" Mar 12 23:43:15.782954 containerd[1626]: time="2026-03-12T23:43:15.782921285Z" level=info msg="StartContainer for \"279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435\"" Mar 12 23:43:15.784138 containerd[1626]: time="2026-03-12T23:43:15.783975810Z" level=info msg="connecting to shim 279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435" address="unix:///run/containerd/s/1b56dd6fd6f8f11d3bacd8020cafa4b602b9206e5ca6bf1305132cb13d5110d1" protocol=ttrpc version=3 Mar 12 23:43:15.803483 systemd[1]: Started cri-containerd-279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435.scope - libcontainer container 279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435. 
Mar 12 23:43:15.841623 containerd[1626]: time="2026-03-12T23:43:15.841585724Z" level=info msg="StartContainer for \"279d78e8a1dbb493ae1c6cf0e5203e4377f13900c09f3fcefddc370f6fbb4435\" returns successfully" Mar 12 23:43:21.002824 containerd[1626]: time="2026-03-12T23:43:21.002652577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-t58wb,Uid:8044de53-0cc8-430b-8aec-3b24194fa940,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:21.004129 containerd[1626]: time="2026-03-12T23:43:21.002827977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2k8vr,Uid:c5ce2203-0c58-4658-847f-ad2515051d13,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:21.114603 systemd-networkd[1428]: calidc81405ffb1: Link UP Mar 12 23:43:21.115164 systemd-networkd[1428]: calidc81405ffb1: Gained carrier Mar 12 23:43:21.124305 kubelet[2862]: I0312 23:43:21.123766 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7549b86665-qcgm9" podStartSLOduration=6.179888992 podStartE2EDuration="10.123751244s" podCreationTimestamp="2026-03-12 23:43:11 +0000 UTC" firstStartedPulling="2026-03-12 23:43:11.818645816 +0000 UTC m=+52.032529961" lastFinishedPulling="2026-03-12 23:43:15.762508068 +0000 UTC m=+55.976392213" observedRunningTime="2026-03-12 23:43:16.161779891 +0000 UTC m=+56.375663996" watchObservedRunningTime="2026-03-12 23:43:21.123751244 +0000 UTC m=+61.337635389" Mar 12 23:43:21.129549 containerd[1626]: 2026-03-12 23:43:21.048 [INFO][4512] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0 calico-apiserver-5c57dc4894- calico-system 8044de53-0cc8-430b-8aec-3b24194fa940 904 0 2026-03-12 23:42:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c57dc4894 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 calico-apiserver-5c57dc4894-t58wb eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calidc81405ffb1 [] [] }} ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-" Mar 12 23:43:21.129549 containerd[1626]: 2026-03-12 23:43:21.048 [INFO][4512] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.129549 containerd[1626]: 2026-03-12 23:43:21.071 [INFO][4530] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" HandleID="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.081 [INFO][4530] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" HandleID="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ea4d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"calico-apiserver-5c57dc4894-t58wb", "timestamp":"2026-03-12 23:43:21.071079268 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000438000)} Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.081 [INFO][4530] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.081 [INFO][4530] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.081 [INFO][4530] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.084 [INFO][4530] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.089 [INFO][4530] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.093 [INFO][4530] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.095 [INFO][4530] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129738 containerd[1626]: 2026-03-12 23:43:21.097 [INFO][4530] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.097 [INFO][4530] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.099 [INFO][4530] ipam/ipam.go 1806: 
Creating new handle: k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.103 [INFO][4530] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.109 [INFO][4530] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.2/26] block=192.168.36.0/26 handle="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.109 [INFO][4530] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.2/26] handle="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.110 [INFO][4530] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 23:43:21.129933 containerd[1626]: 2026-03-12 23:43:21.110 [INFO][4530] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.2/26] IPv6=[] ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" HandleID="k8s-pod-network.dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.130101 containerd[1626]: 2026-03-12 23:43:21.112 [INFO][4512] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0", GenerateName:"calico-apiserver-5c57dc4894-", Namespace:"calico-system", SelfLink:"", UID:"8044de53-0cc8-430b-8aec-3b24194fa940", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c57dc4894", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"calico-apiserver-5c57dc4894-t58wb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.36.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calidc81405ffb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:21.130150 containerd[1626]: 2026-03-12 23:43:21.112 [INFO][4512] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.2/32] ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.130150 containerd[1626]: 2026-03-12 23:43:21.112 [INFO][4512] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc81405ffb1 ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.130150 containerd[1626]: 2026-03-12 23:43:21.114 [INFO][4512] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.130208 containerd[1626]: 2026-03-12 23:43:21.115 [INFO][4512] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0", GenerateName:"calico-apiserver-5c57dc4894-", Namespace:"calico-system", SelfLink:"", UID:"8044de53-0cc8-430b-8aec-3b24194fa940", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c57dc4894", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb", Pod:"calico-apiserver-5c57dc4894-t58wb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calidc81405ffb1", MAC:"0e:ec:79:a6:f3:37", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:21.130258 containerd[1626]: 2026-03-12 23:43:21.125 [INFO][4512] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-t58wb" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--t58wb-eth0" Mar 12 23:43:21.155239 containerd[1626]: time="2026-03-12T23:43:21.155176796Z" level=info 
msg="connecting to shim dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb" address="unix:///run/containerd/s/d947cee0aa9e4b90c7805af0bfb5655cca3bee960d29b41a6458dcb6b8950cb7" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:21.177442 systemd[1]: Started cri-containerd-dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb.scope - libcontainer container dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb. Mar 12 23:43:21.218227 containerd[1626]: time="2026-03-12T23:43:21.218180701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-t58wb,Uid:8044de53-0cc8-430b-8aec-3b24194fa940,Namespace:calico-system,Attempt:0,} returns sandbox id \"dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb\"" Mar 12 23:43:21.219806 containerd[1626]: time="2026-03-12T23:43:21.219768709Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 23:43:21.223355 systemd-networkd[1428]: cali151789f054f: Link UP Mar 12 23:43:21.223705 systemd-networkd[1428]: cali151789f054f: Gained carrier Mar 12 23:43:21.237715 containerd[1626]: 2026-03-12 23:43:21.046 [INFO][4499] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0 coredns-674b8bbfcf- kube-system c5ce2203-0c58-4658-847f-ad2515051d13 902 0 2026-03-12 23:42:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 coredns-674b8bbfcf-2k8vr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali151789f054f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" 
WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-" Mar 12 23:43:21.237715 containerd[1626]: 2026-03-12 23:43:21.047 [INFO][4499] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.237715 containerd[1626]: 2026-03-12 23:43:21.072 [INFO][4528] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" HandleID="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Workload="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.087 [INFO][4528] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" HandleID="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Workload="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a0450), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"coredns-674b8bbfcf-2k8vr", "timestamp":"2026-03-12 23:43:21.072877517 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40005da000)} Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.087 [INFO][4528] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.110 [INFO][4528] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.110 [INFO][4528] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.186 [INFO][4528] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.192 [INFO][4528] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.197 [INFO][4528] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.199 [INFO][4528] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.237907 containerd[1626]: 2026-03-12 23:43:21.202 [INFO][4528] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.202 [INFO][4528] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.204 [INFO][4528] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273 Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.209 [INFO][4528] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" 
host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.217 [INFO][4528] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.3/26] block=192.168.36.0/26 handle="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.217 [INFO][4528] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.3/26] handle="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.217 [INFO][4528] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:21.238089 containerd[1626]: 2026-03-12 23:43:21.217 [INFO][4528] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.3/26] IPv6=[] ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" HandleID="k8s-pod-network.26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Workload="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.238222 containerd[1626]: 2026-03-12 23:43:21.220 [INFO][4499] cni-plugin/k8s.go 418: Populated endpoint ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c5ce2203-0c58-4658-847f-ad2515051d13", ResourceVersion:"902", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"coredns-674b8bbfcf-2k8vr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali151789f054f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:21.238222 containerd[1626]: 2026-03-12 23:43:21.220 [INFO][4499] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.3/32] ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.238222 containerd[1626]: 2026-03-12 23:43:21.220 [INFO][4499] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali151789f054f ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" 
WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.238222 containerd[1626]: 2026-03-12 23:43:21.223 [INFO][4499] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.238222 containerd[1626]: 2026-03-12 23:43:21.224 [INFO][4499] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c5ce2203-0c58-4658-847f-ad2515051d13", ResourceVersion:"902", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273", Pod:"coredns-674b8bbfcf-2k8vr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.3/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali151789f054f", MAC:"46:59:a2:c5:bd:aa", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:21.238222 containerd[1626]: 2026-03-12 23:43:21.234 [INFO][4499] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" Namespace="kube-system" Pod="coredns-674b8bbfcf-2k8vr" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--2k8vr-eth0" Mar 12 23:43:21.265187 containerd[1626]: time="2026-03-12T23:43:21.264851088Z" level=info msg="connecting to shim 26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273" address="unix:///run/containerd/s/0bd483e46fc15a3e66428049cbce6ea38cba78d15f628c2bc3656f2c188317aa" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:21.289439 systemd[1]: Started cri-containerd-26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273.scope - libcontainer container 26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273. 
Mar 12 23:43:21.319856 containerd[1626]: time="2026-03-12T23:43:21.319819994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2k8vr,Uid:c5ce2203-0c58-4658-847f-ad2515051d13,Namespace:kube-system,Attempt:0,} returns sandbox id \"26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273\"" Mar 12 23:43:21.326627 containerd[1626]: time="2026-03-12T23:43:21.326593667Z" level=info msg="CreateContainer within sandbox \"26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:43:21.334958 containerd[1626]: time="2026-03-12T23:43:21.334381305Z" level=info msg="Container e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:21.343344 containerd[1626]: time="2026-03-12T23:43:21.343305228Z" level=info msg="CreateContainer within sandbox \"26aaacd149b436ec82417cd322cabe4b2ece7f38857925ffcdd2551c4aa7f273\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4\"" Mar 12 23:43:21.344322 containerd[1626]: time="2026-03-12T23:43:21.343969991Z" level=info msg="StartContainer for \"e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4\"" Mar 12 23:43:21.344987 containerd[1626]: time="2026-03-12T23:43:21.344957356Z" level=info msg="connecting to shim e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4" address="unix:///run/containerd/s/0bd483e46fc15a3e66428049cbce6ea38cba78d15f628c2bc3656f2c188317aa" protocol=ttrpc version=3 Mar 12 23:43:21.367440 systemd[1]: Started cri-containerd-e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4.scope - libcontainer container e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4. 
Mar 12 23:43:21.391688 containerd[1626]: time="2026-03-12T23:43:21.391644022Z" level=info msg="StartContainer for \"e965ad22ccdab954350763623c72591237edaa9db6bf560363baa34a8e84b5e4\" returns successfully" Mar 12 23:43:22.003039 containerd[1626]: time="2026-03-12T23:43:22.002939666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7746b7cf7f-wsv4z,Uid:0d9f6974-521e-4d9d-a269-abe3fcc76140,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:22.003447 containerd[1626]: time="2026-03-12T23:43:22.003414788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f2x6w,Uid:48bac47d-6b7b-4409-8577-545950ed7262,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:22.121241 systemd-networkd[1428]: cali6cb905fb7b9: Link UP Mar 12 23:43:22.121690 systemd-networkd[1428]: cali6cb905fb7b9: Gained carrier Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.045 [INFO][4725] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0 calico-kube-controllers-7746b7cf7f- calico-system 0d9f6974-521e-4d9d-a269-abe3fcc76140 906 0 2026-03-12 23:42:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7746b7cf7f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 calico-kube-controllers-7746b7cf7f-wsv4z eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6cb905fb7b9 [] [] }} ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 
23:43:22.046 [INFO][4725] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.073 [INFO][4748] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" HandleID="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.089 [INFO][4748] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" HandleID="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137ba0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"calico-kube-controllers-7746b7cf7f-wsv4z", "timestamp":"2026-03-12 23:43:22.073766769 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001966e0)} Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.089 [INFO][4748] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.089 [INFO][4748] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.089 [INFO][4748] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.091 [INFO][4748] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.096 [INFO][4748] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.100 [INFO][4748] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.102 [INFO][4748] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.104 [INFO][4748] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.104 [INFO][4748] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.105 [INFO][4748] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.110 [INFO][4748] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.116 [INFO][4748] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.36.4/26] block=192.168.36.0/26 handle="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.116 [INFO][4748] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.4/26] handle="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.116 [INFO][4748] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:22.135295 containerd[1626]: 2026-03-12 23:43:22.116 [INFO][4748] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.4/26] IPv6=[] ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" HandleID="k8s-pod-network.dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.135930 containerd[1626]: 2026-03-12 23:43:22.119 [INFO][4725] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0", GenerateName:"calico-kube-controllers-7746b7cf7f-", Namespace:"calico-system", SelfLink:"", UID:"0d9f6974-521e-4d9d-a269-abe3fcc76140", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7746b7cf7f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"calico-kube-controllers-7746b7cf7f-wsv4z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6cb905fb7b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:22.135930 containerd[1626]: 2026-03-12 23:43:22.119 [INFO][4725] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.4/32] ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.135930 containerd[1626]: 2026-03-12 23:43:22.119 [INFO][4725] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6cb905fb7b9 ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.135930 containerd[1626]: 2026-03-12 23:43:22.122 [INFO][4725] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.135930 containerd[1626]: 2026-03-12 23:43:22.122 [INFO][4725] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0", GenerateName:"calico-kube-controllers-7746b7cf7f-", Namespace:"calico-system", SelfLink:"", UID:"0d9f6974-521e-4d9d-a269-abe3fcc76140", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7746b7cf7f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac", Pod:"calico-kube-controllers-7746b7cf7f-wsv4z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.4/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6cb905fb7b9", MAC:"26:c0:72:67:c7:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:22.135930 containerd[1626]: 2026-03-12 23:43:22.131 [INFO][4725] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" Namespace="calico-system" Pod="calico-kube-controllers-7746b7cf7f-wsv4z" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--kube--controllers--7746b7cf7f--wsv4z-eth0" Mar 12 23:43:22.161062 containerd[1626]: time="2026-03-12T23:43:22.161020592Z" level=info msg="connecting to shim dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac" address="unix:///run/containerd/s/abb6aebbdfb5ee51e3661f542a4c8afa79ccb7aa4532512df1e3920a1ba936a5" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:22.187616 systemd[1]: Started cri-containerd-dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac.scope - libcontainer container dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac. 
Mar 12 23:43:22.202684 kubelet[2862]: I0312 23:43:22.201974 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-2k8vr" podStartSLOduration=55.201956831 podStartE2EDuration="55.201956831s" podCreationTimestamp="2026-03-12 23:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:22.182805538 +0000 UTC m=+62.396689683" watchObservedRunningTime="2026-03-12 23:43:22.201956831 +0000 UTC m=+62.415840976" Mar 12 23:43:22.236372 systemd-networkd[1428]: cali71fec178bca: Link UP Mar 12 23:43:22.237700 systemd-networkd[1428]: cali71fec178bca: Gained carrier Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.044 [INFO][4719] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0 csi-node-driver- calico-system 48bac47d-6b7b-4409-8577-545950ed7262 742 0 2026-03-12 23:42:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 csi-node-driver-f2x6w eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali71fec178bca [] [] }} ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.045 [INFO][4719] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" 
WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.073 [INFO][4750] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" HandleID="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Workload="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.089 [INFO][4750] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" HandleID="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Workload="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c9a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"csi-node-driver-f2x6w", "timestamp":"2026-03-12 23:43:22.073786049 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003d1a20)} Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.089 [INFO][4750] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.116 [INFO][4750] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.116 [INFO][4750] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.192 [INFO][4750] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.199 [INFO][4750] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.207 [INFO][4750] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.212 [INFO][4750] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.216 [INFO][4750] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.216 [INFO][4750] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.219 [INFO][4750] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927 Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.224 [INFO][4750] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.230 [INFO][4750] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.36.5/26] block=192.168.36.0/26 handle="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.230 [INFO][4750] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.5/26] handle="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.230 [INFO][4750] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:22.257620 containerd[1626]: 2026-03-12 23:43:22.231 [INFO][4750] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.5/26] IPv6=[] ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" HandleID="k8s-pod-network.0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Workload="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.258087 containerd[1626]: 2026-03-12 23:43:22.233 [INFO][4719] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"48bac47d-6b7b-4409-8577-545950ed7262", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"csi-node-driver-f2x6w", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali71fec178bca", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:22.258087 containerd[1626]: 2026-03-12 23:43:22.233 [INFO][4719] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.5/32] ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.258087 containerd[1626]: 2026-03-12 23:43:22.233 [INFO][4719] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71fec178bca ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.258087 containerd[1626]: 2026-03-12 23:43:22.239 [INFO][4719] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.258087 containerd[1626]: 2026-03-12 
23:43:22.242 [INFO][4719] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"48bac47d-6b7b-4409-8577-545950ed7262", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927", Pod:"csi-node-driver-f2x6w", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali71fec178bca", MAC:"86:be:cf:5a:55:f4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:22.258087 containerd[1626]: 2026-03-12 23:43:22.251 
[INFO][4719] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" Namespace="calico-system" Pod="csi-node-driver-f2x6w" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-csi--node--driver--f2x6w-eth0" Mar 12 23:43:22.262120 containerd[1626]: time="2026-03-12T23:43:22.262063922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7746b7cf7f-wsv4z,Uid:0d9f6974-521e-4d9d-a269-abe3fcc76140,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac\"" Mar 12 23:43:22.283115 containerd[1626]: time="2026-03-12T23:43:22.282730542Z" level=info msg="connecting to shim 0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927" address="unix:///run/containerd/s/c0e6c1b18fa7b8fb43ef6d920ff867f24993ee5398cb6ee6fa44d037cdfe1b5d" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:22.303507 systemd[1]: Started cri-containerd-0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927.scope - libcontainer container 0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927. 
Mar 12 23:43:22.326352 containerd[1626]: time="2026-03-12T23:43:22.326244593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-f2x6w,Uid:48bac47d-6b7b-4409-8577-545950ed7262,Namespace:calico-system,Attempt:0,} returns sandbox id \"0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927\"" Mar 12 23:43:22.801496 systemd-networkd[1428]: calidc81405ffb1: Gained IPv6LL Mar 12 23:43:23.002569 containerd[1626]: time="2026-03-12T23:43:23.002507356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-4t7tf,Uid:447005ab-12f7-4c87-b94d-836d35cb7afb,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:23.002668 containerd[1626]: time="2026-03-12T23:43:23.002507996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-2g59f,Uid:b9aa069c-90dc-42a9-86a8-579388459807,Namespace:calico-system,Attempt:0,}" Mar 12 23:43:23.115567 systemd-networkd[1428]: calic4fd2178c64: Link UP Mar 12 23:43:23.117436 systemd-networkd[1428]: calic4fd2178c64: Gained carrier Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.043 [INFO][4909] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0 goldmane-5b85766d88- calico-system 447005ab-12f7-4c87-b94d-836d35cb7afb 905 0 2026-03-12 23:42:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 goldmane-5b85766d88-4t7tf eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic4fd2178c64 [] [] }} ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-" Mar 12 
23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.043 [INFO][4909] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.067 [INFO][4940] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" HandleID="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Workload="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.082 [INFO][4940] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" HandleID="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Workload="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ebe90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"goldmane-5b85766d88-4t7tf", "timestamp":"2026-03-12 23:43:23.067670704 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400054a000)} Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.082 [INFO][4940] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.083 [INFO][4940] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.083 [INFO][4940] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.085 [INFO][4940] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.089 [INFO][4940] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.094 [INFO][4940] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.095 [INFO][4940] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.097 [INFO][4940] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.098 [INFO][4940] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.099 [INFO][4940] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240 Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.103 [INFO][4940] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.110 [INFO][4940] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.36.6/26] block=192.168.36.0/26 handle="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.110 [INFO][4940] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.6/26] handle="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.110 [INFO][4940] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:23.130752 containerd[1626]: 2026-03-12 23:43:23.110 [INFO][4940] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.6/26] IPv6=[] ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" HandleID="k8s-pod-network.8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Workload="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.132190 containerd[1626]: 2026-03-12 23:43:23.111 [INFO][4909] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"447005ab-12f7-4c87-b94d-836d35cb7afb", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"goldmane-5b85766d88-4t7tf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic4fd2178c64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:23.132190 containerd[1626]: 2026-03-12 23:43:23.111 [INFO][4909] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.6/32] ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.132190 containerd[1626]: 2026-03-12 23:43:23.111 [INFO][4909] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic4fd2178c64 ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.132190 containerd[1626]: 2026-03-12 23:43:23.116 [INFO][4909] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.132190 containerd[1626]: 2026-03-12 23:43:23.120 [INFO][4909] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"447005ab-12f7-4c87-b94d-836d35cb7afb", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240", Pod:"goldmane-5b85766d88-4t7tf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic4fd2178c64", MAC:"52:eb:5e:53:f4:fb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:23.132190 containerd[1626]: 2026-03-12 23:43:23.128 [INFO][4909] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" Namespace="calico-system" Pod="goldmane-5b85766d88-4t7tf" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-goldmane--5b85766d88--4t7tf-eth0" Mar 12 23:43:23.156944 containerd[1626]: time="2026-03-12T23:43:23.156814365Z" level=info msg="connecting to shim 8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240" address="unix:///run/containerd/s/51b5cd3fcd5734646fa3df587470b9e5c09d2ab424c7e75bb13b3af838928b6f" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:23.186468 systemd[1]: Started cri-containerd-8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240.scope - libcontainer container 8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240. Mar 12 23:43:23.231395 systemd-networkd[1428]: cali3f1dd03ffe2: Link UP Mar 12 23:43:23.231848 systemd-networkd[1428]: cali3f1dd03ffe2: Gained carrier Mar 12 23:43:23.239447 containerd[1626]: time="2026-03-12T23:43:23.239408195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-4t7tf,Uid:447005ab-12f7-4c87-b94d-836d35cb7afb,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240\"" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.049 [INFO][4920] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0 calico-apiserver-5c57dc4894- calico-system b9aa069c-90dc-42a9-86a8-579388459807 903 0 2026-03-12 23:42:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c57dc4894 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 calico-apiserver-5c57dc4894-2g59f eth0 calico-apiserver [] [] [kns.calico-system 
ksa.calico-system.calico-apiserver] cali3f1dd03ffe2 [] [] }} ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.050 [INFO][4920] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.074 [INFO][4946] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" HandleID="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.084 [INFO][4946] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" HandleID="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001197d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"calico-apiserver-5c57dc4894-2g59f", "timestamp":"2026-03-12 23:43:23.074836417 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", 
Namespace:(*v1.Namespace)(0x4000496580)} Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.085 [INFO][4946] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.110 [INFO][4946] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.110 [INFO][4946] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.186 [INFO][4946] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.193 [INFO][4946] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.198 [INFO][4946] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.201 [INFO][4946] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.205 [INFO][4946] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.205 [INFO][4946] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.207 [INFO][4946] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295 Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.212 
[INFO][4946] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.36.0/26 handle="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.224 [INFO][4946] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.7/26] block=192.168.36.0/26 handle="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.224 [INFO][4946] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.7/26] handle="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.224 [INFO][4946] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:23.249238 containerd[1626]: 2026-03-12 23:43:23.224 [INFO][4946] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.7/26] IPv6=[] ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" HandleID="k8s-pod-network.325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Workload="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 23:43:23.249898 containerd[1626]: 2026-03-12 23:43:23.228 [INFO][4920] cni-plugin/k8s.go 418: Populated endpoint ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0", GenerateName:"calico-apiserver-5c57dc4894-", Namespace:"calico-system", SelfLink:"", 
UID:"b9aa069c-90dc-42a9-86a8-579388459807", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c57dc4894", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"calico-apiserver-5c57dc4894-2g59f", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3f1dd03ffe2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:23.249898 containerd[1626]: 2026-03-12 23:43:23.228 [INFO][4920] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.7/32] ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 23:43:23.249898 containerd[1626]: 2026-03-12 23:43:23.228 [INFO][4920] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f1dd03ffe2 ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 
23:43:23.249898 containerd[1626]: 2026-03-12 23:43:23.232 [INFO][4920] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 23:43:23.249898 containerd[1626]: 2026-03-12 23:43:23.232 [INFO][4920] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0", GenerateName:"calico-apiserver-5c57dc4894-", Namespace:"calico-system", SelfLink:"", UID:"b9aa069c-90dc-42a9-86a8-579388459807", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c57dc4894", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295", Pod:"calico-apiserver-5c57dc4894-2g59f", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3f1dd03ffe2", MAC:"9e:a0:e8:db:96:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:23.249898 containerd[1626]: 2026-03-12 23:43:23.244 [INFO][4920] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" Namespace="calico-system" Pod="calico-apiserver-5c57dc4894-2g59f" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-calico--apiserver--5c57dc4894--2g59f-eth0" Mar 12 23:43:23.249554 systemd-networkd[1428]: cali151789f054f: Gained IPv6LL Mar 12 23:43:23.285671 containerd[1626]: time="2026-03-12T23:43:23.285309372Z" level=info msg="connecting to shim 325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295" address="unix:///run/containerd/s/1d52fe11e1cfb9b83ecb6468dd233077d0cd3aec83b53518d327027466c6422d" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:23.310483 systemd[1]: Started cri-containerd-325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295.scope - libcontainer container 325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295. 
Mar 12 23:43:23.346883 containerd[1626]: time="2026-03-12T23:43:23.346833823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c57dc4894-2g59f,Uid:b9aa069c-90dc-42a9-86a8-579388459807,Namespace:calico-system,Attempt:0,} returns sandbox id \"325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295\"" Mar 12 23:43:23.761386 systemd-networkd[1428]: cali6cb905fb7b9: Gained IPv6LL Mar 12 23:43:23.883734 containerd[1626]: time="2026-03-12T23:43:23.883693019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:23.884917 containerd[1626]: time="2026-03-12T23:43:23.884884664Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Mar 12 23:43:23.886292 containerd[1626]: time="2026-03-12T23:43:23.886226871Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:23.889303 containerd[1626]: time="2026-03-12T23:43:23.889050285Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:23.889637 containerd[1626]: time="2026-03-12T23:43:23.889604127Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 2.669794418s" Mar 12 23:43:23.889637 containerd[1626]: time="2026-03-12T23:43:23.889636567Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image 
reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Mar 12 23:43:23.890365 systemd-networkd[1428]: cali71fec178bca: Gained IPv6LL Mar 12 23:43:23.891157 containerd[1626]: time="2026-03-12T23:43:23.891128175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 12 23:43:23.895048 containerd[1626]: time="2026-03-12T23:43:23.895011673Z" level=info msg="CreateContainer within sandbox \"dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 23:43:23.904831 containerd[1626]: time="2026-03-12T23:43:23.904786201Z" level=info msg="Container cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:23.913004 containerd[1626]: time="2026-03-12T23:43:23.912936160Z" level=info msg="CreateContainer within sandbox \"dada8a749b608edae9d3bf93d19aa5a1ee49e8792e7112a9a61e0aa2a8345cfb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78\"" Mar 12 23:43:23.913669 containerd[1626]: time="2026-03-12T23:43:23.913640244Z" level=info msg="StartContainer for \"cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78\"" Mar 12 23:43:23.915888 containerd[1626]: time="2026-03-12T23:43:23.915184171Z" level=info msg="connecting to shim cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78" address="unix:///run/containerd/s/d947cee0aa9e4b90c7805af0bfb5655cca3bee960d29b41a6458dcb6b8950cb7" protocol=ttrpc version=3 Mar 12 23:43:23.937497 systemd[1]: Started cri-containerd-cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78.scope - libcontainer container cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78. 
Mar 12 23:43:23.970686 containerd[1626]: time="2026-03-12T23:43:23.970647399Z" level=info msg="StartContainer for \"cc07335cd7e4041e8ee4e5e767b0e9f1359c1c7ca03eb68bee9a2bbf18278e78\" returns successfully" Mar 12 23:43:24.186995 kubelet[2862]: I0312 23:43:24.186900 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5c57dc4894-t58wb" podStartSLOduration=41.51541182 podStartE2EDuration="44.186882326s" podCreationTimestamp="2026-03-12 23:42:40 +0000 UTC" firstStartedPulling="2026-03-12 23:43:21.219507068 +0000 UTC m=+61.433391213" lastFinishedPulling="2026-03-12 23:43:23.890977574 +0000 UTC m=+64.104861719" observedRunningTime="2026-03-12 23:43:24.185894081 +0000 UTC m=+64.399778186" watchObservedRunningTime="2026-03-12 23:43:24.186882326 +0000 UTC m=+64.400766471" Mar 12 23:43:24.977531 systemd-networkd[1428]: cali3f1dd03ffe2: Gained IPv6LL Mar 12 23:43:25.003130 containerd[1626]: time="2026-03-12T23:43:25.003086195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-579rg,Uid:f1b551ed-be1a-4d83-8897-81bad277f0a5,Namespace:kube-system,Attempt:0,}" Mar 12 23:43:25.105940 systemd-networkd[1428]: calic4fd2178c64: Gained IPv6LL Mar 12 23:43:25.108596 systemd-networkd[1428]: cali230034ade62: Link UP Mar 12 23:43:25.108794 systemd-networkd[1428]: cali230034ade62: Gained carrier Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.040 [INFO][5152] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0 coredns-674b8bbfcf- kube-system f1b551ed-be1a-4d83-8897-81bad277f0a5 898 0 2026-03-12 23:42:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-27aefdfc79 coredns-674b8bbfcf-579rg eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali230034ade62 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.040 [INFO][5152] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.063 [INFO][5166] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" HandleID="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Workload="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.072 [INFO][5166] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" HandleID="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Workload="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000365bf0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-27aefdfc79", "pod":"coredns-674b8bbfcf-579rg", "timestamp":"2026-03-12 23:43:25.063560047 +0000 UTC"}, Hostname:"ci-4459-2-4-n-27aefdfc79", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40006a6420)} Mar 12 
23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.072 [INFO][5166] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.073 [INFO][5166] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.073 [INFO][5166] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-27aefdfc79' Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.075 [INFO][5166] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.079 [INFO][5166] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.084 [INFO][5166] ipam/ipam.go 526: Trying affinity for 192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.086 [INFO][5166] ipam/ipam.go 160: Attempting to load block cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.089 [INFO][5166] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.36.0/26 host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.089 [INFO][5166] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.36.0/26 handle="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.090 [INFO][5166] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.095 [INFO][5166] ipam/ipam.go 1272: Writing block in order to 
claim IPs block=192.168.36.0/26 handle="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.102 [INFO][5166] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.36.8/26] block=192.168.36.0/26 handle="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.102 [INFO][5166] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.36.8/26] handle="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" host="ci-4459-2-4-n-27aefdfc79" Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.102 [INFO][5166] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 23:43:25.140450 containerd[1626]: 2026-03-12 23:43:25.102 [INFO][5166] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.36.8/26] IPv6=[] ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" HandleID="k8s-pod-network.6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Workload="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.141391 containerd[1626]: 2026-03-12 23:43:25.104 [INFO][5152] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"f1b551ed-be1a-4d83-8897-81bad277f0a5", ResourceVersion:"898", Generation:0, CreationTimestamp:time.Date(2026, 
time.March, 12, 23, 42, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"", Pod:"coredns-674b8bbfcf-579rg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali230034ade62", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:25.141391 containerd[1626]: 2026-03-12 23:43:25.104 [INFO][5152] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.8/32] ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.141391 containerd[1626]: 2026-03-12 23:43:25.104 [INFO][5152] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali230034ade62 ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" 
Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.141391 containerd[1626]: 2026-03-12 23:43:25.106 [INFO][5152] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.141391 containerd[1626]: 2026-03-12 23:43:25.109 [INFO][5152] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"f1b551ed-be1a-4d83-8897-81bad277f0a5", ResourceVersion:"898", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 23, 42, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-27aefdfc79", ContainerID:"6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab", Pod:"coredns-674b8bbfcf-579rg", Endpoint:"eth0", 
ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali230034ade62", MAC:"6e:15:a0:82:47:56", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 23:43:25.141391 containerd[1626]: 2026-03-12 23:43:25.133 [INFO][5152] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" Namespace="kube-system" Pod="coredns-674b8bbfcf-579rg" WorkloadEndpoint="ci--4459--2--4--n--27aefdfc79-k8s-coredns--674b8bbfcf--579rg-eth0" Mar 12 23:43:25.169958 containerd[1626]: time="2026-03-12T23:43:25.169870682Z" level=info msg="connecting to shim 6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab" address="unix:///run/containerd/s/8a475532ffe2f96ac66b8882fdf6ff1c1f30d35e8a183bd5ac1584a0a4cb2215" namespace=k8s.io protocol=ttrpc version=3 Mar 12 23:43:25.181096 kubelet[2862]: I0312 23:43:25.181062 2862 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 23:43:25.192463 systemd[1]: Started cri-containerd-6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab.scope - libcontainer container 6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab. 
Mar 12 23:43:25.233051 containerd[1626]: time="2026-03-12T23:43:25.232924147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-579rg,Uid:f1b551ed-be1a-4d83-8897-81bad277f0a5,Namespace:kube-system,Attempt:0,} returns sandbox id \"6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab\"" Mar 12 23:43:25.245789 containerd[1626]: time="2026-03-12T23:43:25.245726809Z" level=info msg="CreateContainer within sandbox \"6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 23:43:25.258144 containerd[1626]: time="2026-03-12T23:43:25.258098429Z" level=info msg="Container a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:25.268474 containerd[1626]: time="2026-03-12T23:43:25.267855476Z" level=info msg="CreateContainer within sandbox \"6699f99d48d3b30fb9b542acf2e7a2a3539d31042595194943a8b5b545239eab\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d\"" Mar 12 23:43:25.268772 containerd[1626]: time="2026-03-12T23:43:25.268732760Z" level=info msg="StartContainer for \"a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d\"" Mar 12 23:43:25.270027 containerd[1626]: time="2026-03-12T23:43:25.269957006Z" level=info msg="connecting to shim a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d" address="unix:///run/containerd/s/8a475532ffe2f96ac66b8882fdf6ff1c1f30d35e8a183bd5ac1584a0a4cb2215" protocol=ttrpc version=3 Mar 12 23:43:25.290453 systemd[1]: Started cri-containerd-a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d.scope - libcontainer container a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d. 
Mar 12 23:43:25.326731 containerd[1626]: time="2026-03-12T23:43:25.326619680Z" level=info msg="StartContainer for \"a875c537a706abe85e18476eccb308ab38b9400e8ae35e37b68c90d95e7b786d\" returns successfully" Mar 12 23:43:26.217882 kubelet[2862]: I0312 23:43:26.217531 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-579rg" podStartSLOduration=59.217513511 podStartE2EDuration="59.217513511s" podCreationTimestamp="2026-03-12 23:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 23:43:26.198172737 +0000 UTC m=+66.412056882" watchObservedRunningTime="2026-03-12 23:43:26.217513511 +0000 UTC m=+66.431397656" Mar 12 23:43:26.806161 containerd[1626]: time="2026-03-12T23:43:26.805479715Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:26.807047 containerd[1626]: time="2026-03-12T23:43:26.807016363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Mar 12 23:43:26.808637 containerd[1626]: time="2026-03-12T23:43:26.808586570Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:26.810895 containerd[1626]: time="2026-03-12T23:43:26.810847301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:26.811410 containerd[1626]: time="2026-03-12T23:43:26.811371904Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", 
repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 2.920206569s" Mar 12 23:43:26.811410 containerd[1626]: time="2026-03-12T23:43:26.811406344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Mar 12 23:43:26.812462 containerd[1626]: time="2026-03-12T23:43:26.812440029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 12 23:43:26.822308 containerd[1626]: time="2026-03-12T23:43:26.821480753Z" level=info msg="CreateContainer within sandbox \"dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 12 23:43:26.832936 containerd[1626]: time="2026-03-12T23:43:26.832882088Z" level=info msg="Container ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:26.842258 containerd[1626]: time="2026-03-12T23:43:26.842196053Z" level=info msg="CreateContainer within sandbox \"dd9d66a2440074714c367a58f56d00dd015532160aa2d1b784efa477e79791ac\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0\"" Mar 12 23:43:26.842795 containerd[1626]: time="2026-03-12T23:43:26.842720696Z" level=info msg="StartContainer for \"ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0\"" Mar 12 23:43:26.844067 containerd[1626]: time="2026-03-12T23:43:26.844036662Z" level=info msg="connecting to shim ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0" address="unix:///run/containerd/s/abb6aebbdfb5ee51e3661f542a4c8afa79ccb7aa4532512df1e3920a1ba936a5" protocol=ttrpc version=3 Mar 12 23:43:26.867611 systemd[1]: Started 
cri-containerd-ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0.scope - libcontainer container ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0. Mar 12 23:43:26.902564 containerd[1626]: time="2026-03-12T23:43:26.902472585Z" level=info msg="StartContainer for \"ffc4fdfe55109f8fefd7e82700917030799eb522526647a9d0f79ebd8c88ccd0\" returns successfully" Mar 12 23:43:27.089650 systemd-networkd[1428]: cali230034ade62: Gained IPv6LL Mar 12 23:43:27.201027 kubelet[2862]: I0312 23:43:27.200949 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7746b7cf7f-wsv4z" podStartSLOduration=41.652037049 podStartE2EDuration="46.200935749s" podCreationTimestamp="2026-03-12 23:42:41 +0000 UTC" firstStartedPulling="2026-03-12 23:43:22.263382288 +0000 UTC m=+62.477266433" lastFinishedPulling="2026-03-12 23:43:26.812280988 +0000 UTC m=+67.026165133" observedRunningTime="2026-03-12 23:43:27.199825903 +0000 UTC m=+67.413710048" watchObservedRunningTime="2026-03-12 23:43:27.200935749 +0000 UTC m=+67.414819894" Mar 12 23:43:28.381043 containerd[1626]: time="2026-03-12T23:43:28.380982258Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:28.382078 containerd[1626]: time="2026-03-12T23:43:28.382051023Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Mar 12 23:43:28.383914 containerd[1626]: time="2026-03-12T23:43:28.383862072Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:28.386283 containerd[1626]: time="2026-03-12T23:43:28.386219403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:28.387148 containerd[1626]: time="2026-03-12T23:43:28.387104408Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.574550818s" Mar 12 23:43:28.387148 containerd[1626]: time="2026-03-12T23:43:28.387140928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Mar 12 23:43:28.388593 containerd[1626]: time="2026-03-12T23:43:28.388565255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 12 23:43:28.391744 containerd[1626]: time="2026-03-12T23:43:28.391706030Z" level=info msg="CreateContainer within sandbox \"0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 12 23:43:28.404292 containerd[1626]: time="2026-03-12T23:43:28.403428527Z" level=info msg="Container a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7: CDI devices from CRI Config.CDIDevices: []" Mar 12 23:43:28.412801 containerd[1626]: time="2026-03-12T23:43:28.412762132Z" level=info msg="CreateContainer within sandbox \"0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7\"" Mar 12 23:43:28.413484 containerd[1626]: time="2026-03-12T23:43:28.413463495Z" level=info msg="StartContainer for \"a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7\"" Mar 12 23:43:28.416153 containerd[1626]: time="2026-03-12T23:43:28.416128308Z" level=info msg="connecting to shim 
a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7" address="unix:///run/containerd/s/c0e6c1b18fa7b8fb43ef6d920ff867f24993ee5398cb6ee6fa44d037cdfe1b5d" protocol=ttrpc version=3 Mar 12 23:43:28.439418 systemd[1]: Started cri-containerd-a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7.scope - libcontainer container a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7. Mar 12 23:43:28.499429 containerd[1626]: time="2026-03-12T23:43:28.499366511Z" level=info msg="StartContainer for \"a5f84d37f6b9749257ea6999b0e4d7e7a65fcce81d29acc6051fb25be68159c7\" returns successfully" Mar 12 23:43:30.594694 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3667521333.mount: Deactivated successfully. Mar 12 23:43:30.815884 containerd[1626]: time="2026-03-12T23:43:30.815835038Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:30.817445 containerd[1626]: time="2026-03-12T23:43:30.817405926Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Mar 12 23:43:30.818354 containerd[1626]: time="2026-03-12T23:43:30.818319570Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:30.821019 containerd[1626]: time="2026-03-12T23:43:30.820972743Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 23:43:30.821788 containerd[1626]: time="2026-03-12T23:43:30.821640866Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", 
repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 2.433039651s"
Mar 12 23:43:30.821788 containerd[1626]: time="2026-03-12T23:43:30.821675626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\""
Mar 12 23:43:30.822807 containerd[1626]: time="2026-03-12T23:43:30.822783552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 12 23:43:30.825799 containerd[1626]: time="2026-03-12T23:43:30.825654526Z" level=info msg="CreateContainer within sandbox \"8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 12 23:43:30.833832 containerd[1626]: time="2026-03-12T23:43:30.833783885Z" level=info msg="Container 71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:43:30.843971 containerd[1626]: time="2026-03-12T23:43:30.843930534Z" level=info msg="CreateContainer within sandbox \"8c58d6f0d7123e92db2def43d8fa0e56222c70024b3bc81058ecf968ca49e240\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881\""
Mar 12 23:43:30.844675 containerd[1626]: time="2026-03-12T23:43:30.844646457Z" level=info msg="StartContainer for \"71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881\""
Mar 12 23:43:30.845999 containerd[1626]: time="2026-03-12T23:43:30.845906863Z" level=info msg="connecting to shim 71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881" address="unix:///run/containerd/s/51b5cd3fcd5734646fa3df587470b9e5c09d2ab424c7e75bb13b3af838928b6f" protocol=ttrpc version=3
Mar 12 23:43:30.867511 systemd[1]: Started cri-containerd-71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881.scope - libcontainer container 71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881.
Mar 12 23:43:30.902465 containerd[1626]: time="2026-03-12T23:43:30.902426777Z" level=info msg="StartContainer for \"71a590a93683f47aaec382af31fe8a6f30004b3ce178069d8092bb22f1b3c881\" returns successfully"
Mar 12 23:43:31.187246 containerd[1626]: time="2026-03-12T23:43:31.186994831Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:31.187818 containerd[1626]: time="2026-03-12T23:43:31.187795115Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 12 23:43:31.193315 containerd[1626]: time="2026-03-12T23:43:31.193281182Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 370.44527ms"
Mar 12 23:43:31.193315 containerd[1626]: time="2026-03-12T23:43:31.193316502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\""
Mar 12 23:43:31.194714 containerd[1626]: time="2026-03-12T23:43:31.194404587Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 12 23:43:31.198199 containerd[1626]: time="2026-03-12T23:43:31.198164045Z" level=info msg="CreateContainer within sandbox \"325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 12 23:43:31.211840 containerd[1626]: time="2026-03-12T23:43:31.211790631Z" level=info msg="Container d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:43:31.226044 containerd[1626]: time="2026-03-12T23:43:31.225998740Z" level=info msg="CreateContainer within sandbox \"325c4e0b720cf3c62ece568cf347b4fc5964b6851917480eb15aef21b1ef0295\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc\""
Mar 12 23:43:31.226851 containerd[1626]: time="2026-03-12T23:43:31.226824344Z" level=info msg="StartContainer for \"d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc\""
Mar 12 23:43:31.228495 containerd[1626]: time="2026-03-12T23:43:31.228356471Z" level=info msg="connecting to shim d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc" address="unix:///run/containerd/s/1d52fe11e1cfb9b83ecb6468dd233077d0cd3aec83b53518d327027466c6422d" protocol=ttrpc version=3
Mar 12 23:43:31.242759 systemd[1]: Started cri-containerd-d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc.scope - libcontainer container d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc.
Mar 12 23:43:31.280598 containerd[1626]: time="2026-03-12T23:43:31.280258042Z" level=info msg="StartContainer for \"d1bf68f21d9ad8090dad441f590e2d5ff8a00e021bbd8be89b2dc34ad85e46bc\" returns successfully"
Mar 12 23:43:31.306528 kubelet[2862]: I0312 23:43:31.306451 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-4t7tf" podStartSLOduration=43.726289866 podStartE2EDuration="51.306432568s" podCreationTimestamp="2026-03-12 23:42:40 +0000 UTC" firstStartedPulling="2026-03-12 23:43:23.242193288 +0000 UTC m=+63.456077433" lastFinishedPulling="2026-03-12 23:43:30.82233603 +0000 UTC m=+71.036220135" observedRunningTime="2026-03-12 23:43:31.216385493 +0000 UTC m=+71.430269638" watchObservedRunningTime="2026-03-12 23:43:31.306432568 +0000 UTC m=+71.520316713"
Mar 12 23:43:32.219945 kubelet[2862]: I0312 23:43:32.218978 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5c57dc4894-2g59f" podStartSLOduration=44.372820062 podStartE2EDuration="52.21896194s" podCreationTimestamp="2026-03-12 23:42:40 +0000 UTC" firstStartedPulling="2026-03-12 23:43:23.347858427 +0000 UTC m=+63.561742572" lastFinishedPulling="2026-03-12 23:43:31.194000345 +0000 UTC m=+71.407884450" observedRunningTime="2026-03-12 23:43:32.218120416 +0000 UTC m=+72.432004561" watchObservedRunningTime="2026-03-12 23:43:32.21896194 +0000 UTC m=+72.432846085"
Mar 12 23:43:33.209700 kubelet[2862]: I0312 23:43:33.209668 2862 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:43:33.373461 containerd[1626]: time="2026-03-12T23:43:33.373412886Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:33.374728 containerd[1626]: time="2026-03-12T23:43:33.374697612Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291"
Mar 12 23:43:33.375541 containerd[1626]: time="2026-03-12T23:43:33.375488056Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:33.378240 containerd[1626]: time="2026-03-12T23:43:33.378184069Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 23:43:33.379084 containerd[1626]: time="2026-03-12T23:43:33.378995233Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 2.184555686s"
Mar 12 23:43:33.379084 containerd[1626]: time="2026-03-12T23:43:33.379029153Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\""
Mar 12 23:43:33.382398 containerd[1626]: time="2026-03-12T23:43:33.382327889Z" level=info msg="CreateContainer within sandbox \"0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 12 23:43:33.392321 containerd[1626]: time="2026-03-12T23:43:33.391611854Z" level=info msg="Container fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:43:33.396709 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3779339974.mount: Deactivated successfully.
Mar 12 23:43:33.401782 containerd[1626]: time="2026-03-12T23:43:33.401735023Z" level=info msg="CreateContainer within sandbox \"0bdbfd934828c36e29925950e8fefe1d8d6aacbbf9568f36229084b8ec23d927\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658\""
Mar 12 23:43:33.402465 containerd[1626]: time="2026-03-12T23:43:33.402438386Z" level=info msg="StartContainer for \"fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658\""
Mar 12 23:43:33.404235 containerd[1626]: time="2026-03-12T23:43:33.404191475Z" level=info msg="connecting to shim fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658" address="unix:///run/containerd/s/c0e6c1b18fa7b8fb43ef6d920ff867f24993ee5398cb6ee6fa44d037cdfe1b5d" protocol=ttrpc version=3
Mar 12 23:43:33.426517 systemd[1]: Started cri-containerd-fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658.scope - libcontainer container fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658.
Mar 12 23:43:33.498220 containerd[1626]: time="2026-03-12T23:43:33.498077249Z" level=info msg="StartContainer for \"fe8df55f6a7acd4807c8108668431b93cda517fa3014fff09c4edb2483206658\" returns successfully"
Mar 12 23:43:34.079779 kubelet[2862]: I0312 23:43:34.079752 2862 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 12 23:43:34.079779 kubelet[2862]: I0312 23:43:34.079782 2862 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 12 23:43:34.226832 kubelet[2862]: I0312 23:43:34.226765 2862 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-f2x6w" podStartSLOduration=42.174521217 podStartE2EDuration="53.226745734s" podCreationTimestamp="2026-03-12 23:42:41 +0000 UTC" firstStartedPulling="2026-03-12 23:43:22.327448479 +0000 UTC m=+62.541332624" lastFinishedPulling="2026-03-12 23:43:33.379672996 +0000 UTC m=+73.593557141" observedRunningTime="2026-03-12 23:43:34.226057851 +0000 UTC m=+74.439941996" watchObservedRunningTime="2026-03-12 23:43:34.226745734 +0000 UTC m=+74.440629879"
Mar 12 23:43:35.223369 kubelet[2862]: I0312 23:43:35.223186 2862 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:43:49.754556 kubelet[2862]: I0312 23:43:49.754506 2862 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 23:44:27.327898 update_engine[1603]: I20260312 23:44:27.327538 1603 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 12 23:44:27.327898 update_engine[1603]: I20260312 23:44:27.327589 1603 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 12 23:44:27.327898 update_engine[1603]: I20260312 23:44:27.327812 1603 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 12 23:44:27.328371 update_engine[1603]: I20260312 23:44:27.328161 1603 omaha_request_params.cc:62] Current group set to stable
Mar 12 23:44:27.328371 update_engine[1603]: I20260312 23:44:27.328253 1603 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 12 23:44:27.328371 update_engine[1603]: I20260312 23:44:27.328262 1603 update_attempter.cc:643] Scheduling an action processor start.
Mar 12 23:44:27.328371 update_engine[1603]: I20260312 23:44:27.328307 1603 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 12 23:44:27.328371 update_engine[1603]: I20260312 23:44:27.328333 1603 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 12 23:44:27.328488 update_engine[1603]: I20260312 23:44:27.328377 1603 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 12 23:44:27.328488 update_engine[1603]: I20260312 23:44:27.328384 1603 omaha_request_action.cc:272] Request:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]:
Mar 12 23:44:27.328488 update_engine[1603]: I20260312 23:44:27.328390 1603 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 12 23:44:27.329158 locksmithd[1650]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 12 23:44:27.330289 update_engine[1603]: I20260312 23:44:27.330096 1603 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 12 23:44:27.330815 update_engine[1603]: I20260312 23:44:27.330770 1603 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 12 23:44:27.339346 update_engine[1603]: E20260312 23:44:27.339236 1603 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 12 23:44:27.339518 update_engine[1603]: I20260312 23:44:27.339447 1603 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 12 23:44:37.283209 update_engine[1603]: I20260312 23:44:37.283088 1603 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 12 23:44:37.283209 update_engine[1603]: I20260312 23:44:37.283188 1603 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 12 23:44:37.283885 update_engine[1603]: I20260312 23:44:37.283679 1603 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 12 23:44:37.288688 update_engine[1603]: E20260312 23:44:37.288621 1603 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 12 23:44:37.288790 update_engine[1603]: I20260312 23:44:37.288727 1603 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Mar 12 23:44:47.283719 update_engine[1603]: I20260312 23:44:47.283339 1603 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 12 23:44:47.283719 update_engine[1603]: I20260312 23:44:47.283480 1603 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 12 23:44:47.284291 update_engine[1603]: I20260312 23:44:47.283957 1603 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 12 23:44:47.290237 update_engine[1603]: E20260312 23:44:47.290189 1603 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 12 23:44:47.290295 update_engine[1603]: I20260312 23:44:47.290277 1603 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Mar 12 23:44:57.281715 update_engine[1603]: I20260312 23:44:57.281612 1603 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 12 23:44:57.282109 update_engine[1603]: I20260312 23:44:57.281733 1603 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 12 23:44:57.282357 update_engine[1603]: I20260312 23:44:57.282322 1603 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 12 23:44:57.287909 update_engine[1603]: E20260312 23:44:57.287791 1603 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 12 23:44:57.288014 update_engine[1603]: I20260312 23:44:57.287959 1603 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 12 23:44:57.288014 update_engine[1603]: I20260312 23:44:57.287969 1603 omaha_request_action.cc:617] Omaha request response:
Mar 12 23:44:57.288060 update_engine[1603]: E20260312 23:44:57.288048 1603 omaha_request_action.cc:636] Omaha request network transfer failed.
Mar 12 23:44:57.288079 update_engine[1603]: I20260312 23:44:57.288061 1603 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Mar 12 23:44:57.288079 update_engine[1603]: I20260312 23:44:57.288066 1603 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 12 23:44:57.288079 update_engine[1603]: I20260312 23:44:57.288070 1603 update_attempter.cc:306] Processing Done.
Mar 12 23:44:57.288131 update_engine[1603]: E20260312 23:44:57.288084 1603 update_attempter.cc:619] Update failed.
Mar 12 23:44:57.288131 update_engine[1603]: I20260312 23:44:57.288089 1603 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Mar 12 23:44:57.288131 update_engine[1603]: I20260312 23:44:57.288092 1603 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Mar 12 23:44:57.288131 update_engine[1603]: I20260312 23:44:57.288097 1603 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Mar 12 23:44:57.288204 update_engine[1603]: I20260312 23:44:57.288163 1603 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 12 23:44:57.288204 update_engine[1603]: I20260312 23:44:57.288185 1603 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 12 23:44:57.288204 update_engine[1603]: I20260312 23:44:57.288190 1603 omaha_request_action.cc:272] Request:
Mar 12 23:44:57.288204 update_engine[1603]:
Mar 12 23:44:57.288204 update_engine[1603]:
Mar 12 23:44:57.288204 update_engine[1603]:
Mar 12 23:44:57.288204 update_engine[1603]:
Mar 12 23:44:57.288204 update_engine[1603]:
Mar 12 23:44:57.288204 update_engine[1603]:
Mar 12 23:44:57.288204 update_engine[1603]: I20260312 23:44:57.288196 1603 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 12 23:44:57.288399 update_engine[1603]: I20260312 23:44:57.288213 1603 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 12 23:44:57.288522 update_engine[1603]: I20260312 23:44:57.288482 1603 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 12 23:44:57.288623 locksmithd[1650]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 12 23:44:57.294978 update_engine[1603]: E20260312 23:44:57.294930 1603 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295001 1603 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295008 1603 omaha_request_action.cc:617] Omaha request response:
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295014 1603 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295018 1603 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295023 1603 update_attempter.cc:306] Processing Done.
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295029 1603 update_attempter.cc:310] Error event sent.
Mar 12 23:44:57.295048 update_engine[1603]: I20260312 23:44:57.295037 1603 update_check_scheduler.cc:74] Next update check in 41m11s
Mar 12 23:44:57.295353 locksmithd[1650]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 12 23:45:43.561607 systemd[1]: Started sshd@7-10.0.4.241:22-20.161.92.111:53662.service - OpenSSH per-connection server daemon (20.161.92.111:53662).
Mar 12 23:45:44.066493 sshd[6102]: Accepted publickey for core from 20.161.92.111 port 53662 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:45:44.067831 sshd-session[6102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:45:44.071624 systemd-logind[1600]: New session 8 of user core.
Mar 12 23:45:44.086651 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 12 23:45:44.426307 sshd[6105]: Connection closed by 20.161.92.111 port 53662
Mar 12 23:45:44.426491 sshd-session[6102]: pam_unix(sshd:session): session closed for user core
Mar 12 23:45:44.429979 systemd[1]: sshd@7-10.0.4.241:22-20.161.92.111:53662.service: Deactivated successfully.
Mar 12 23:45:44.431616 systemd[1]: session-8.scope: Deactivated successfully.
Mar 12 23:45:44.434151 systemd-logind[1600]: Session 8 logged out. Waiting for processes to exit.
Mar 12 23:45:44.435284 systemd-logind[1600]: Removed session 8.
Mar 12 23:45:49.533681 systemd[1]: Started sshd@8-10.0.4.241:22-20.161.92.111:53672.service - OpenSSH per-connection server daemon (20.161.92.111:53672).
Mar 12 23:45:50.055982 sshd[6145]: Accepted publickey for core from 20.161.92.111 port 53672 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:45:50.057352 sshd-session[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:45:50.061921 systemd-logind[1600]: New session 9 of user core.
Mar 12 23:45:50.071443 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 12 23:45:50.402777 sshd[6148]: Connection closed by 20.161.92.111 port 53672
Mar 12 23:45:50.403376 sshd-session[6145]: pam_unix(sshd:session): session closed for user core
Mar 12 23:45:50.407526 systemd-logind[1600]: Session 9 logged out. Waiting for processes to exit.
Mar 12 23:45:50.407950 systemd[1]: sshd@8-10.0.4.241:22-20.161.92.111:53672.service: Deactivated successfully.
Mar 12 23:45:50.409823 systemd[1]: session-9.scope: Deactivated successfully.
Mar 12 23:45:50.411248 systemd-logind[1600]: Removed session 9.
Mar 12 23:45:55.508257 systemd[1]: Started sshd@9-10.0.4.241:22-20.161.92.111:41590.service - OpenSSH per-connection server daemon (20.161.92.111:41590).
Mar 12 23:45:56.031256 sshd[6173]: Accepted publickey for core from 20.161.92.111 port 41590 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:45:56.032736 sshd-session[6173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:45:56.037218 systemd-logind[1600]: New session 10 of user core.
Mar 12 23:45:56.043428 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 12 23:45:56.377617 sshd[6176]: Connection closed by 20.161.92.111 port 41590
Mar 12 23:45:56.378161 sshd-session[6173]: pam_unix(sshd:session): session closed for user core
Mar 12 23:45:56.381744 systemd[1]: sshd@9-10.0.4.241:22-20.161.92.111:41590.service: Deactivated successfully.
Mar 12 23:45:56.383512 systemd[1]: session-10.scope: Deactivated successfully.
Mar 12 23:45:56.384291 systemd-logind[1600]: Session 10 logged out. Waiting for processes to exit.
Mar 12 23:45:56.386726 systemd-logind[1600]: Removed session 10.
Mar 12 23:46:01.486844 systemd[1]: Started sshd@10-10.0.4.241:22-20.161.92.111:55472.service - OpenSSH per-connection server daemon (20.161.92.111:55472).
Mar 12 23:46:02.010164 sshd[6237]: Accepted publickey for core from 20.161.92.111 port 55472 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:02.011646 sshd-session[6237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:02.016843 systemd-logind[1600]: New session 11 of user core.
Mar 12 23:46:02.027511 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 12 23:46:02.364236 sshd[6240]: Connection closed by 20.161.92.111 port 55472
Mar 12 23:46:02.364692 sshd-session[6237]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:02.368812 systemd[1]: sshd@10-10.0.4.241:22-20.161.92.111:55472.service: Deactivated successfully.
Mar 12 23:46:02.370544 systemd[1]: session-11.scope: Deactivated successfully.
Mar 12 23:46:02.371476 systemd-logind[1600]: Session 11 logged out. Waiting for processes to exit.
Mar 12 23:46:02.372996 systemd-logind[1600]: Removed session 11.
Mar 12 23:46:07.470967 systemd[1]: Started sshd@11-10.0.4.241:22-20.161.92.111:55484.service - OpenSSH per-connection server daemon (20.161.92.111:55484).
Mar 12 23:46:07.986305 sshd[6280]: Accepted publickey for core from 20.161.92.111 port 55484 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:07.987427 sshd-session[6280]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:07.991919 systemd-logind[1600]: New session 12 of user core.
Mar 12 23:46:08.007450 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 12 23:46:08.331854 sshd[6283]: Connection closed by 20.161.92.111 port 55484
Mar 12 23:46:08.332420 sshd-session[6280]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:08.336074 systemd[1]: sshd@11-10.0.4.241:22-20.161.92.111:55484.service: Deactivated successfully.
Mar 12 23:46:08.338030 systemd[1]: session-12.scope: Deactivated successfully.
Mar 12 23:46:08.338747 systemd-logind[1600]: Session 12 logged out. Waiting for processes to exit.
Mar 12 23:46:08.340062 systemd-logind[1600]: Removed session 12.
Mar 12 23:46:08.440379 systemd[1]: Started sshd@12-10.0.4.241:22-20.161.92.111:55496.service - OpenSSH per-connection server daemon (20.161.92.111:55496).
Mar 12 23:46:08.960773 sshd[6297]: Accepted publickey for core from 20.161.92.111 port 55496 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:08.962184 sshd-session[6297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:08.966404 systemd-logind[1600]: New session 13 of user core.
Mar 12 23:46:08.974591 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 12 23:46:09.330510 sshd[6300]: Connection closed by 20.161.92.111 port 55496
Mar 12 23:46:09.329915 sshd-session[6297]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:09.333765 systemd[1]: sshd@12-10.0.4.241:22-20.161.92.111:55496.service: Deactivated successfully.
Mar 12 23:46:09.335512 systemd[1]: session-13.scope: Deactivated successfully.
Mar 12 23:46:09.336181 systemd-logind[1600]: Session 13 logged out. Waiting for processes to exit.
Mar 12 23:46:09.337216 systemd-logind[1600]: Removed session 13.
Mar 12 23:46:09.434811 systemd[1]: Started sshd@13-10.0.4.241:22-20.161.92.111:55508.service - OpenSSH per-connection server daemon (20.161.92.111:55508).
Mar 12 23:46:09.950727 sshd[6312]: Accepted publickey for core from 20.161.92.111 port 55508 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:09.952550 sshd-session[6312]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:09.956618 systemd-logind[1600]: New session 14 of user core.
Mar 12 23:46:09.965420 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 12 23:46:10.297339 sshd[6315]: Connection closed by 20.161.92.111 port 55508
Mar 12 23:46:10.297737 sshd-session[6312]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:10.301879 systemd[1]: sshd@13-10.0.4.241:22-20.161.92.111:55508.service: Deactivated successfully.
Mar 12 23:46:10.303658 systemd[1]: session-14.scope: Deactivated successfully.
Mar 12 23:46:10.304330 systemd-logind[1600]: Session 14 logged out. Waiting for processes to exit.
Mar 12 23:46:10.305340 systemd-logind[1600]: Removed session 14.
Mar 12 23:46:15.406749 systemd[1]: Started sshd@14-10.0.4.241:22-20.161.92.111:46694.service - OpenSSH per-connection server daemon (20.161.92.111:46694).
Mar 12 23:46:15.922585 sshd[6376]: Accepted publickey for core from 20.161.92.111 port 46694 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:15.923938 sshd-session[6376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:15.929064 systemd-logind[1600]: New session 15 of user core.
Mar 12 23:46:15.941468 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 12 23:46:16.264537 sshd[6382]: Connection closed by 20.161.92.111 port 46694
Mar 12 23:46:16.265029 sshd-session[6376]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:16.269481 systemd-logind[1600]: Session 15 logged out. Waiting for processes to exit.
Mar 12 23:46:16.269851 systemd[1]: sshd@14-10.0.4.241:22-20.161.92.111:46694.service: Deactivated successfully.
Mar 12 23:46:16.271696 systemd[1]: session-15.scope: Deactivated successfully.
Mar 12 23:46:16.273368 systemd-logind[1600]: Removed session 15.
Mar 12 23:46:16.372871 systemd[1]: Started sshd@15-10.0.4.241:22-20.161.92.111:46708.service - OpenSSH per-connection server daemon (20.161.92.111:46708).
Mar 12 23:46:16.881933 sshd[6396]: Accepted publickey for core from 20.161.92.111 port 46708 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:16.883245 sshd-session[6396]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:16.887135 systemd-logind[1600]: New session 16 of user core.
Mar 12 23:46:16.896473 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 12 23:46:17.272316 sshd[6399]: Connection closed by 20.161.92.111 port 46708
Mar 12 23:46:17.273035 sshd-session[6396]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:17.276532 systemd[1]: sshd@15-10.0.4.241:22-20.161.92.111:46708.service: Deactivated successfully.
Mar 12 23:46:17.278205 systemd[1]: session-16.scope: Deactivated successfully.
Mar 12 23:46:17.279420 systemd-logind[1600]: Session 16 logged out. Waiting for processes to exit.
Mar 12 23:46:17.281345 systemd-logind[1600]: Removed session 16.
Mar 12 23:46:17.384249 systemd[1]: Started sshd@16-10.0.4.241:22-20.161.92.111:46716.service - OpenSSH per-connection server daemon (20.161.92.111:46716).
Mar 12 23:46:17.904313 sshd[6411]: Accepted publickey for core from 20.161.92.111 port 46716 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:17.905795 sshd-session[6411]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:17.909893 systemd-logind[1600]: New session 17 of user core.
Mar 12 23:46:17.919437 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 12 23:46:18.765314 sshd[6414]: Connection closed by 20.161.92.111 port 46716
Mar 12 23:46:18.765793 sshd-session[6411]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:18.769807 systemd[1]: sshd@16-10.0.4.241:22-20.161.92.111:46716.service: Deactivated successfully.
Mar 12 23:46:18.771610 systemd[1]: session-17.scope: Deactivated successfully.
Mar 12 23:46:18.772348 systemd-logind[1600]: Session 17 logged out. Waiting for processes to exit.
Mar 12 23:46:18.773542 systemd-logind[1600]: Removed session 17.
Mar 12 23:46:18.870946 systemd[1]: Started sshd@17-10.0.4.241:22-20.161.92.111:46718.service - OpenSSH per-connection server daemon (20.161.92.111:46718).
Mar 12 23:46:19.386397 sshd[6441]: Accepted publickey for core from 20.161.92.111 port 46718 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:19.387339 sshd-session[6441]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:19.391011 systemd-logind[1600]: New session 18 of user core.
Mar 12 23:46:19.403592 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 12 23:46:19.829962 sshd[6444]: Connection closed by 20.161.92.111 port 46718
Mar 12 23:46:19.829695 sshd-session[6441]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:19.833428 systemd[1]: sshd@17-10.0.4.241:22-20.161.92.111:46718.service: Deactivated successfully.
Mar 12 23:46:19.835150 systemd[1]: session-18.scope: Deactivated successfully.
Mar 12 23:46:19.835883 systemd-logind[1600]: Session 18 logged out. Waiting for processes to exit.
Mar 12 23:46:19.836921 systemd-logind[1600]: Removed session 18.
Mar 12 23:46:19.942993 systemd[1]: Started sshd@18-10.0.4.241:22-20.161.92.111:46732.service - OpenSSH per-connection server daemon (20.161.92.111:46732).
Mar 12 23:46:20.462411 sshd[6460]: Accepted publickey for core from 20.161.92.111 port 46732 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:20.464200 sshd-session[6460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:20.468370 systemd-logind[1600]: New session 19 of user core.
Mar 12 23:46:20.474430 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 12 23:46:20.815951 sshd[6465]: Connection closed by 20.161.92.111 port 46732
Mar 12 23:46:20.815786 sshd-session[6460]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:20.820187 systemd[1]: sshd@18-10.0.4.241:22-20.161.92.111:46732.service: Deactivated successfully.
Mar 12 23:46:20.821959 systemd[1]: session-19.scope: Deactivated successfully.
Mar 12 23:46:20.822648 systemd-logind[1600]: Session 19 logged out. Waiting for processes to exit.
Mar 12 23:46:20.823692 systemd-logind[1600]: Removed session 19.
Mar 12 23:46:25.921497 systemd[1]: Started sshd@19-10.0.4.241:22-20.161.92.111:43732.service - OpenSSH per-connection server daemon (20.161.92.111:43732).
Mar 12 23:46:26.435622 sshd[6510]: Accepted publickey for core from 20.161.92.111 port 43732 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:26.436981 sshd-session[6510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:26.440923 systemd-logind[1600]: New session 20 of user core.
Mar 12 23:46:26.449568 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 12 23:46:26.779442 sshd[6513]: Connection closed by 20.161.92.111 port 43732
Mar 12 23:46:26.779789 sshd-session[6510]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:26.784124 systemd[1]: sshd@19-10.0.4.241:22-20.161.92.111:43732.service: Deactivated successfully.
Mar 12 23:46:26.785915 systemd[1]: session-20.scope: Deactivated successfully.
Mar 12 23:46:26.786646 systemd-logind[1600]: Session 20 logged out. Waiting for processes to exit.
Mar 12 23:46:26.788007 systemd-logind[1600]: Removed session 20.
Mar 12 23:46:31.894957 systemd[1]: Started sshd@20-10.0.4.241:22-20.161.92.111:44470.service - OpenSSH per-connection server daemon (20.161.92.111:44470).
Mar 12 23:46:32.418958 sshd[6576]: Accepted publickey for core from 20.161.92.111 port 44470 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:32.420145 sshd-session[6576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:32.423944 systemd-logind[1600]: New session 21 of user core.
Mar 12 23:46:32.439454 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 12 23:46:32.759337 sshd[6579]: Connection closed by 20.161.92.111 port 44470
Mar 12 23:46:32.759663 sshd-session[6576]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:32.764422 systemd[1]: sshd@20-10.0.4.241:22-20.161.92.111:44470.service: Deactivated successfully.
Mar 12 23:46:32.766144 systemd[1]: session-21.scope: Deactivated successfully.
Mar 12 23:46:32.766912 systemd-logind[1600]: Session 21 logged out. Waiting for processes to exit.
Mar 12 23:46:32.768046 systemd-logind[1600]: Removed session 21.
Mar 12 23:46:37.869323 systemd[1]: Started sshd@21-10.0.4.241:22-20.161.92.111:44472.service - OpenSSH per-connection server daemon (20.161.92.111:44472).
Mar 12 23:46:38.373321 sshd[6593]: Accepted publickey for core from 20.161.92.111 port 44472 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:38.374199 sshd-session[6593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:38.378178 systemd-logind[1600]: New session 22 of user core.
Mar 12 23:46:38.384470 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 12 23:46:38.715444 sshd[6596]: Connection closed by 20.161.92.111 port 44472
Mar 12 23:46:38.715582 sshd-session[6593]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:38.719373 systemd[1]: sshd@21-10.0.4.241:22-20.161.92.111:44472.service: Deactivated successfully.
Mar 12 23:46:38.721144 systemd[1]: session-22.scope: Deactivated successfully.
Mar 12 23:46:38.721903 systemd-logind[1600]: Session 22 logged out. Waiting for processes to exit.
Mar 12 23:46:38.722928 systemd-logind[1600]: Removed session 22.
Mar 12 23:46:43.823480 systemd[1]: Started sshd@22-10.0.4.241:22-20.161.92.111:50580.service - OpenSSH per-connection server daemon (20.161.92.111:50580).
Mar 12 23:46:44.341526 sshd[6635]: Accepted publickey for core from 20.161.92.111 port 50580 ssh2: RSA SHA256:CzqfIjRsOnOnt75sejx5+PE6sq3hgmkcABrykNq+0wU
Mar 12 23:46:44.342866 sshd-session[6635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 23:46:44.347021 systemd-logind[1600]: New session 23 of user core.
Mar 12 23:46:44.354449 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 12 23:46:44.684451 sshd[6638]: Connection closed by 20.161.92.111 port 50580
Mar 12 23:46:44.684876 sshd-session[6635]: pam_unix(sshd:session): session closed for user core
Mar 12 23:46:44.688576 systemd[1]: sshd@22-10.0.4.241:22-20.161.92.111:50580.service: Deactivated successfully.
Mar 12 23:46:44.690396 systemd[1]: session-23.scope: Deactivated successfully.
Mar 12 23:46:44.691105 systemd-logind[1600]: Session 23 logged out. Waiting for processes to exit.
Mar 12 23:46:44.692149 systemd-logind[1600]: Removed session 23.
Mar 12 23:47:15.316892 kubelet[2862]: E0312 23:47:15.316554 2862 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.4.241:41136->10.0.4.145:2379: read: connection timed out"
Mar 12 23:47:15.612398 containerd[1626]: time="2026-03-12T23:47:15.612184284Z" level=warning msg="container event discarded" container=e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410 type=CONTAINER_CREATED_EVENT
Mar 12 23:47:15.612398 containerd[1626]: time="2026-03-12T23:47:15.612309405Z" level=warning msg="container event discarded" container=e2382cee28f6064de03d3ba643ec6e13729c8d91edf90fba505aa92be9627410 type=CONTAINER_STARTED_EVENT
Mar 12 23:47:15.647667 containerd[1626]: time="2026-03-12T23:47:15.647584936Z" level=warning msg="container event discarded" container=b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7 type=CONTAINER_CREATED_EVENT
Mar 12 23:47:15.647783 containerd[1626]: time="2026-03-12T23:47:15.647661736Z" level=warning msg="container event discarded" container=6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d type=CONTAINER_CREATED_EVENT
Mar 12 23:47:15.647783 containerd[1626]: time="2026-03-12T23:47:15.647696176Z" level=warning msg="container event discarded" container=6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d type=CONTAINER_STARTED_EVENT
Mar 12 23:47:15.678798 containerd[1626]: time="2026-03-12T23:47:15.678720846Z" level=warning msg="container event discarded" container=71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95 type=CONTAINER_CREATED_EVENT
Mar 12 23:47:15.678798 containerd[1626]: time="2026-03-12T23:47:15.678751886Z" level=warning msg="container event discarded" container=1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974 type=CONTAINER_CREATED_EVENT
Mar 12 23:47:15.678798 containerd[1626]: time="2026-03-12T23:47:15.678762646Z" level=warning msg="container event discarded" container=1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974 type=CONTAINER_STARTED_EVENT
Mar 12 23:47:15.701049 containerd[1626]: time="2026-03-12T23:47:15.700990794Z" level=warning msg="container event discarded" container=a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5 type=CONTAINER_CREATED_EVENT
Mar 12 23:47:15.722316 containerd[1626]: time="2026-03-12T23:47:15.722238697Z" level=warning msg="container event discarded" container=b2e4e004dd7356383f5cdd4893b7e0d3050311009eb2e2e59412174097f6fae7 type=CONTAINER_STARTED_EVENT
Mar 12 23:47:15.764597 containerd[1626]: time="2026-03-12T23:47:15.764491341Z" level=warning msg="container event discarded" container=71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95 type=CONTAINER_STARTED_EVENT
Mar 12 23:47:15.764597 containerd[1626]: time="2026-03-12T23:47:15.764549941Z" level=warning msg="container event discarded" container=a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5 type=CONTAINER_STARTED_EVENT
Mar 12 23:47:16.401100 systemd[1]: cri-containerd-b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366.scope: Deactivated successfully.
Mar 12 23:47:16.401441 systemd[1]: cri-containerd-b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366.scope: Consumed 19.087s CPU time, 117.3M memory peak.
Mar 12 23:47:16.402622 containerd[1626]: time="2026-03-12T23:47:16.402508028Z" level=info msg="received container exit event container_id:\"b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366\" id:\"b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366\" pid:3195 exit_status:1 exited_at:{seconds:1773359236 nanos:401845025}"
Mar 12 23:47:16.407683 systemd[1]: cri-containerd-71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95.scope: Deactivated successfully.
Mar 12 23:47:16.407984 systemd[1]: cri-containerd-71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95.scope: Consumed 4.880s CPU time, 63.9M memory peak.
Mar 12 23:47:16.410053 containerd[1626]: time="2026-03-12T23:47:16.409980824Z" level=info msg="received container exit event container_id:\"71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95\" id:\"71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95\" pid:2705 exit_status:1 exited_at:{seconds:1773359236 nanos:409764663}"
Mar 12 23:47:16.430340 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366-rootfs.mount: Deactivated successfully.
Mar 12 23:47:16.436321 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95-rootfs.mount: Deactivated successfully.
Mar 12 23:47:16.679081 kubelet[2862]: I0312 23:47:16.678991 2862 scope.go:117] "RemoveContainer" containerID="b7eca3215fcbf7f54d4788fd642c1f9e926d493b40c05b6831c82d524c233366"
Mar 12 23:47:16.681582 containerd[1626]: time="2026-03-12T23:47:16.681523578Z" level=info msg="CreateContainer within sandbox \"0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 12 23:47:16.682425 kubelet[2862]: I0312 23:47:16.681979 2862 scope.go:117] "RemoveContainer" containerID="71de903b1edff8743cc3ba72e862c87b86e715b2a0be6c779ce5a30a48f31e95"
Mar 12 23:47:16.683529 containerd[1626]: time="2026-03-12T23:47:16.683432907Z" level=info msg="CreateContainer within sandbox \"6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 12 23:47:16.690203 containerd[1626]: time="2026-03-12T23:47:16.689711777Z" level=info msg="Container 875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:47:16.696557 containerd[1626]: time="2026-03-12T23:47:16.696511010Z" level=info msg="Container b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:47:16.700709 containerd[1626]: time="2026-03-12T23:47:16.700665950Z" level=info msg="CreateContainer within sandbox \"0374537b4517941097c78f208873e71cbaa6ee00065b807a7c7e6617b913ffc4\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354\""
Mar 12 23:47:16.701708 containerd[1626]: time="2026-03-12T23:47:16.701398354Z" level=info msg="StartContainer for \"875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354\""
Mar 12 23:47:16.703007 containerd[1626]: time="2026-03-12T23:47:16.702973762Z" level=info msg="connecting to shim 875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354" address="unix:///run/containerd/s/2aa3b40f6586b8579ea303c76fd9f6699dd8b1e83a16d44c8e6c7ba1439b2fba" protocol=ttrpc version=3
Mar 12 23:47:16.705118 containerd[1626]: time="2026-03-12T23:47:16.705054732Z" level=info msg="CreateContainer within sandbox \"6fdc19b1715658186aa06e3a20d90c5f24dfac03c4eca2b5b49903e8d44d964d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54\""
Mar 12 23:47:16.705664 containerd[1626]: time="2026-03-12T23:47:16.705644414Z" level=info msg="StartContainer for \"b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54\""
Mar 12 23:47:16.706894 containerd[1626]: time="2026-03-12T23:47:16.706860140Z" level=info msg="connecting to shim b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54" address="unix:///run/containerd/s/c2fbe77d15511d197e932da34b2a13ba989d460dc39f0ab7adaf9ba343f5ccc1" protocol=ttrpc version=3
Mar 12 23:47:16.723447 systemd[1]: Started cri-containerd-875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354.scope - libcontainer container 875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354.
Mar 12 23:47:16.726808 systemd[1]: Started cri-containerd-b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54.scope - libcontainer container b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54.
Mar 12 23:47:16.756212 containerd[1626]: time="2026-03-12T23:47:16.756055618Z" level=info msg="StartContainer for \"875ccec57f6b950e6671c16432d9b37ad927e43733eb737ad5eedfedb3050354\" returns successfully"
Mar 12 23:47:16.771922 containerd[1626]: time="2026-03-12T23:47:16.771865615Z" level=info msg="StartContainer for \"b5a2d032abc136cbcecce361d86968b663bf1b8be577c6dfbde0fc28e0e52e54\" returns successfully"
Mar 12 23:47:19.987613 kubelet[2862]: E0312 23:47:19.987376 2862 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.4.241:40904->10.0.4.145:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-27aefdfc79.189c3cd2d8aaf162 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-27aefdfc79,UID:3dadde19eb176ad149b114910369edcb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-27aefdfc79,},FirstTimestamp:2026-03-12 23:47:09.55301309 +0000 UTC m=+289.766897235,LastTimestamp:2026-03-12 23:47:09.55301309 +0000 UTC m=+289.766897235,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-27aefdfc79,}"
Mar 12 23:47:20.364589 systemd[1]: cri-containerd-a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5.scope: Deactivated successfully.
Mar 12 23:47:20.364880 systemd[1]: cri-containerd-a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5.scope: Consumed 4.799s CPU time, 22.8M memory peak.
Mar 12 23:47:20.365768 containerd[1626]: time="2026-03-12T23:47:20.365735722Z" level=info msg="received container exit event container_id:\"a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5\" id:\"a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5\" pid:2727 exit_status:1 exited_at:{seconds:1773359240 nanos:365501881}"
Mar 12 23:47:20.384555 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5-rootfs.mount: Deactivated successfully.
Mar 12 23:47:20.698616 kubelet[2862]: I0312 23:47:20.698303 2862 scope.go:117] "RemoveContainer" containerID="a118123cf683ea343e8ba043ca7039894d00cb3363865e6afe3f6d35ebadefc5"
Mar 12 23:47:20.700043 containerd[1626]: time="2026-03-12T23:47:20.699926659Z" level=info msg="CreateContainer within sandbox \"1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 12 23:47:20.707303 containerd[1626]: time="2026-03-12T23:47:20.706936773Z" level=info msg="Container a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2: CDI devices from CRI Config.CDIDevices: []"
Mar 12 23:47:20.718642 containerd[1626]: time="2026-03-12T23:47:20.718585789Z" level=info msg="CreateContainer within sandbox \"1dacb3a2b14eca1b128c9c4c23edaae15ef09944261e021c472d521a9e507974\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2\""
Mar 12 23:47:20.719377 containerd[1626]: time="2026-03-12T23:47:20.719348153Z" level=info msg="StartContainer for \"a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2\""
Mar 12 23:47:20.720436 containerd[1626]: time="2026-03-12T23:47:20.720395958Z" level=info msg="connecting to shim a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2" address="unix:///run/containerd/s/0d49336a1967a7ea33df0f9953ab76eb5eba8f826e37be83a80468c0921e10c3" protocol=ttrpc version=3
Mar 12 23:47:20.745472 systemd[1]: Started cri-containerd-a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2.scope - libcontainer container a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2.
Mar 12 23:47:20.781730 containerd[1626]: time="2026-03-12T23:47:20.781680094Z" level=info msg="StartContainer for \"a9059afbf3f52b20cc348b615ea3eaf79181d725d1cf5d22e5417a7752affcd2\" returns successfully"
Mar 12 23:47:23.654351 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec