Apr 16 23:25:52.761621 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Apr 16 23:25:52.761651 kernel: Linux version 6.12.81-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Apr 16 22:10:49 -00 2026
Apr 16 23:25:52.761664 kernel: KASLR enabled
Apr 16 23:25:52.761672 kernel: efi: EFI v2.7 by EDK II
Apr 16 23:25:52.761679 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Apr 16 23:25:52.761684 kernel: random: crng init done
Apr 16 23:25:52.761691 kernel: secureboot: Secure boot disabled
Apr 16 23:25:52.761696 kernel: ACPI: Early table checksum verification disabled
Apr 16 23:25:52.761702 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Apr 16 23:25:52.761708 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Apr 16 23:25:52.761716 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761722 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761727 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761743 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761750 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761756 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761765 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761771 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761777 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761783 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 16 23:25:52.761789 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Apr 16 23:25:52.761794 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Apr 16 23:25:52.761800 kernel: ACPI: Use ACPI SPCR as default console: Yes
Apr 16 23:25:52.761806 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Apr 16 23:25:52.761812 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Apr 16 23:25:52.761818 kernel: Zone ranges:
Apr 16 23:25:52.761825 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Apr 16 23:25:52.761831 kernel: DMA32 empty
Apr 16 23:25:52.761837 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Apr 16 23:25:52.761843 kernel: Device empty
Apr 16 23:25:52.761849 kernel: Movable zone start for each node
Apr 16 23:25:52.761855 kernel: Early memory node ranges
Apr 16 23:25:52.761861 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Apr 16 23:25:52.761867 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Apr 16 23:25:52.761873 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Apr 16 23:25:52.761879 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Apr 16 23:25:52.761885 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Apr 16 23:25:52.761891 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Apr 16 23:25:52.761898 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Apr 16 23:25:52.761904 kernel: psci: probing for conduit method from ACPI.
Apr 16 23:25:52.761912 kernel: psci: PSCIv1.3 detected in firmware.
Apr 16 23:25:52.761919 kernel: psci: Using standard PSCI v0.2 function IDs
Apr 16 23:25:52.761925 kernel: psci: Trusted OS migration not required
Apr 16 23:25:52.761933 kernel: psci: SMC Calling Convention v1.1
Apr 16 23:25:52.761939 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Apr 16 23:25:52.761945 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Apr 16 23:25:52.761952 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Apr 16 23:25:52.761958 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Apr 16 23:25:52.761964 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Apr 16 23:25:52.761971 kernel: percpu: Embedded 33 pages/cpu s97752 r8192 d29224 u135168
Apr 16 23:25:52.761977 kernel: pcpu-alloc: s97752 r8192 d29224 u135168 alloc=33*4096
Apr 16 23:25:52.761984 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Apr 16 23:25:52.761990 kernel: Detected PIPT I-cache on CPU0
Apr 16 23:25:52.761996 kernel: CPU features: detected: GIC system register CPU interface
Apr 16 23:25:52.762002 kernel: CPU features: detected: Spectre-v4
Apr 16 23:25:52.762010 kernel: CPU features: detected: Spectre-BHB
Apr 16 23:25:52.762016 kernel: CPU features: kernel page table isolation forced ON by KASLR
Apr 16 23:25:52.762023 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Apr 16 23:25:52.762029 kernel: CPU features: detected: ARM erratum 1418040
Apr 16 23:25:52.762035 kernel: CPU features: detected: SSBS not fully self-synchronizing
Apr 16 23:25:52.762042 kernel: alternatives: applying boot alternatives
Apr 16 23:25:52.762049 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=c4961845f9869114226296d88644496bf9e4629823927a5e8ae22de79f1c7b59
Apr 16 23:25:52.762056 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Apr 16 23:25:52.762062 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Apr 16 23:25:52.762069 kernel: Fallback order for Node 0: 0
Apr 16 23:25:52.762076 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Apr 16 23:25:52.762082 kernel: Policy zone: Normal
Apr 16 23:25:52.762089 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 16 23:25:52.762095 kernel: software IO TLB: area num 4.
Apr 16 23:25:52.762101 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Apr 16 23:25:52.762108 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Apr 16 23:25:52.762114 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 16 23:25:52.762121 kernel: rcu: RCU event tracing is enabled.
Apr 16 23:25:52.762128 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Apr 16 23:25:52.762134 kernel: Trampoline variant of Tasks RCU enabled.
Apr 16 23:25:52.762140 kernel: Tracing variant of Tasks RCU enabled.
Apr 16 23:25:52.762147 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 16 23:25:52.762154 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Apr 16 23:25:52.762161 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Apr 16 23:25:52.762168 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Apr 16 23:25:52.762174 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Apr 16 23:25:52.762180 kernel: GICv3: 256 SPIs implemented
Apr 16 23:25:52.762186 kernel: GICv3: 0 Extended SPIs implemented
Apr 16 23:25:52.762193 kernel: Root IRQ handler: gic_handle_irq
Apr 16 23:25:52.762199 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Apr 16 23:25:52.762205 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Apr 16 23:25:52.762212 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Apr 16 23:25:52.762218 kernel: ITS [mem 0x08080000-0x0809ffff]
Apr 16 23:25:52.762224 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Apr 16 23:25:52.762232 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Apr 16 23:25:52.762239 kernel: GICv3: using LPI property table @0x0000000100130000
Apr 16 23:25:52.762245 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Apr 16 23:25:52.762251 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 16 23:25:52.762258 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 23:25:52.762264 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Apr 16 23:25:52.762270 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Apr 16 23:25:52.762277 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Apr 16 23:25:52.762283 kernel: arm-pv: using stolen time PV
Apr 16 23:25:52.762290 kernel: Console: colour dummy device 80x25
Apr 16 23:25:52.762298 kernel: ACPI: Core revision 20240827
Apr 16 23:25:52.762304 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Apr 16 23:25:52.762311 kernel: pid_max: default: 32768 minimum: 301
Apr 16 23:25:52.762318 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Apr 16 23:25:52.762324 kernel: landlock: Up and running.
Apr 16 23:25:52.762331 kernel: SELinux: Initializing.
Apr 16 23:25:52.762337 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 16 23:25:52.762344 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 16 23:25:52.762351 kernel: rcu: Hierarchical SRCU implementation.
Apr 16 23:25:52.762357 kernel: rcu: Max phase no-delay instances is 400.
Apr 16 23:25:52.762365 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Apr 16 23:25:52.762372 kernel: Remapping and enabling EFI services.
Apr 16 23:25:52.762378 kernel: smp: Bringing up secondary CPUs ...
Apr 16 23:25:52.762385 kernel: Detected PIPT I-cache on CPU1
Apr 16 23:25:52.762391 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Apr 16 23:25:52.762398 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Apr 16 23:25:52.762405 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 23:25:52.762411 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Apr 16 23:25:52.762418 kernel: Detected PIPT I-cache on CPU2
Apr 16 23:25:52.762430 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Apr 16 23:25:52.762437 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Apr 16 23:25:52.762444 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 23:25:52.762452 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Apr 16 23:25:52.762459 kernel: Detected PIPT I-cache on CPU3
Apr 16 23:25:52.762466 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Apr 16 23:25:52.762473 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Apr 16 23:25:52.762483 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 16 23:25:52.762494 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Apr 16 23:25:52.762506 kernel: smp: Brought up 1 node, 4 CPUs
Apr 16 23:25:52.762516 kernel: SMP: Total of 4 processors activated.
Apr 16 23:25:52.762525 kernel: CPU: All CPU(s) started at EL1
Apr 16 23:25:52.762532 kernel: CPU features: detected: 32-bit EL0 Support
Apr 16 23:25:52.762539 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Apr 16 23:25:52.762546 kernel: CPU features: detected: Common not Private translations
Apr 16 23:25:52.762554 kernel: CPU features: detected: CRC32 instructions
Apr 16 23:25:52.762561 kernel: CPU features: detected: Enhanced Virtualization Traps
Apr 16 23:25:52.762569 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Apr 16 23:25:52.762577 kernel: CPU features: detected: LSE atomic instructions
Apr 16 23:25:52.762584 kernel: CPU features: detected: Privileged Access Never
Apr 16 23:25:52.762591 kernel: CPU features: detected: RAS Extension Support
Apr 16 23:25:52.762598 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Apr 16 23:25:52.762606 kernel: alternatives: applying system-wide alternatives
Apr 16 23:25:52.762612 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Apr 16 23:25:52.762620 kernel: Memory: 16297296K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 39552K init, 1038K bss, 457136K reserved, 16384K cma-reserved)
Apr 16 23:25:52.762627 kernel: devtmpfs: initialized
Apr 16 23:25:52.762636 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 16 23:25:52.762643 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Apr 16 23:25:52.762650 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Apr 16 23:25:52.762657 kernel: 0 pages in range for non-PLT usage
Apr 16 23:25:52.762664 kernel: 508384 pages in range for PLT usage
Apr 16 23:25:52.762671 kernel: pinctrl core: initialized pinctrl subsystem
Apr 16 23:25:52.762678 kernel: SMBIOS 3.0.0 present.
Apr 16 23:25:52.762685 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Apr 16 23:25:52.762692 kernel: DMI: Memory slots populated: 1/1
Apr 16 23:25:52.762700 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 16 23:25:52.762707 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Apr 16 23:25:52.762715 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Apr 16 23:25:52.762722 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Apr 16 23:25:52.762735 kernel: audit: initializing netlink subsys (disabled)
Apr 16 23:25:52.762757 kernel: audit: type=2000 audit(0.044:1): state=initialized audit_enabled=0 res=1
Apr 16 23:25:52.762765 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 16 23:25:52.762772 kernel: cpuidle: using governor menu
Apr 16 23:25:52.762779 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Apr 16 23:25:52.762788 kernel: ASID allocator initialised with 32768 entries
Apr 16 23:25:52.762795 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 16 23:25:52.762802 kernel: Serial: AMBA PL011 UART driver
Apr 16 23:25:52.762809 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 16 23:25:52.762817 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Apr 16 23:25:52.762824 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Apr 16 23:25:52.762831 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Apr 16 23:25:52.762838 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 16 23:25:52.762845 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Apr 16 23:25:52.762853 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Apr 16 23:25:52.762860 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Apr 16 23:25:52.762867 kernel: ACPI: Added _OSI(Module Device)
Apr 16 23:25:52.762874 kernel: ACPI: Added _OSI(Processor Device)
Apr 16 23:25:52.762882 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 16 23:25:52.762889 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 16 23:25:52.762895 kernel: ACPI: Interpreter enabled
Apr 16 23:25:52.762902 kernel: ACPI: Using GIC for interrupt routing
Apr 16 23:25:52.762909 kernel: ACPI: MCFG table detected, 1 entries
Apr 16 23:25:52.762918 kernel: ACPI: CPU0 has been hot-added
Apr 16 23:25:52.762924 kernel: ACPI: CPU1 has been hot-added
Apr 16 23:25:52.762931 kernel: ACPI: CPU2 has been hot-added
Apr 16 23:25:52.762938 kernel: ACPI: CPU3 has been hot-added
Apr 16 23:25:52.762945 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Apr 16 23:25:52.762952 kernel: printk: legacy console [ttyAMA0] enabled
Apr 16 23:25:52.762959 kernel: ACPI: PCI: Interrupt link L000 configured for IRQ 35
Apr 16 23:25:52.762966 kernel: ACPI: PCI: Interrupt link L001 configured for IRQ 36
Apr 16 23:25:52.762973 kernel: ACPI: PCI: Interrupt link L002 configured for IRQ 37
Apr 16 23:25:52.762980 kernel: ACPI: PCI: Interrupt link L003 configured for IRQ 38
Apr 16 23:25:52.762988 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 16 23:25:52.763129 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Apr 16 23:25:52.763193 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Apr 16 23:25:52.763251 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Apr 16 23:25:52.763315 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Apr 16 23:25:52.763373 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Apr 16 23:25:52.763384 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Apr 16 23:25:52.763391 kernel: PCI host bridge to bus 0000:00
Apr 16 23:25:52.763455 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Apr 16 23:25:52.763509 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Apr 16 23:25:52.763561 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Apr 16 23:25:52.763613 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 16 23:25:52.763688 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Apr 16 23:25:52.763784 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.763848 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Apr 16 23:25:52.763908 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Apr 16 23:25:52.763966 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Apr 16 23:25:52.764023 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Apr 16 23:25:52.764081 kernel: pci 0000:00:01.0: enabling Extended Tags
Apr 16 23:25:52.764148 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.764210 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Apr 16 23:25:52.764267 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Apr 16 23:25:52.764324 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Apr 16 23:25:52.764383 kernel: pci 0000:00:01.1: enabling Extended Tags
Apr 16 23:25:52.764449 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.764507 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Apr 16 23:25:52.764566 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Apr 16 23:25:52.764647 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Apr 16 23:25:52.764706 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Apr 16 23:25:52.764807 kernel: pci 0000:00:01.2: enabling Extended Tags
Apr 16 23:25:52.764876 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.764934 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Apr 16 23:25:52.764992 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Apr 16 23:25:52.765049 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Apr 16 23:25:52.765109 kernel: pci 0000:00:01.3: enabling Extended Tags
Apr 16 23:25:52.765180 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.765241 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Apr 16 23:25:52.765300 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Apr 16 23:25:52.765356 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Apr 16 23:25:52.765413 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Apr 16 23:25:52.765470 kernel: pci 0000:00:01.4: enabling Extended Tags
Apr 16 23:25:52.765536 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.765594 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Apr 16 23:25:52.765651 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Apr 16 23:25:52.765707 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Apr 16 23:25:52.765791 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Apr 16 23:25:52.765851 kernel: pci 0000:00:01.5: enabling Extended Tags
Apr 16 23:25:52.765914 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.765976 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Apr 16 23:25:52.766033 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Apr 16 23:25:52.766089 kernel: pci 0000:00:01.6: enabling Extended Tags
Apr 16 23:25:52.766152 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.766209 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Apr 16 23:25:52.766265 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Apr 16 23:25:52.766324 kernel: pci 0000:00:01.7: enabling Extended Tags
Apr 16 23:25:52.766389 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.766446 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Apr 16 23:25:52.766503 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Apr 16 23:25:52.766559 kernel: pci 0000:00:02.0: enabling Extended Tags
Apr 16 23:25:52.766622 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.766680 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Apr 16 23:25:52.766749 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Apr 16 23:25:52.766807 kernel: pci 0000:00:02.1: enabling Extended Tags
Apr 16 23:25:52.766870 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.766927 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Apr 16 23:25:52.766984 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Apr 16 23:25:52.767041 kernel: pci 0000:00:02.2: enabling Extended Tags
Apr 16 23:25:52.767105 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.767165 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Apr 16 23:25:52.767222 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Apr 16 23:25:52.767279 kernel: pci 0000:00:02.3: enabling Extended Tags
Apr 16 23:25:52.767342 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.767400 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Apr 16 23:25:52.767457 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Apr 16 23:25:52.767514 kernel: pci 0000:00:02.4: enabling Extended Tags
Apr 16 23:25:52.767583 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.767640 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Apr 16 23:25:52.767698 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Apr 16 23:25:52.767766 kernel: pci 0000:00:02.5: enabling Extended Tags
Apr 16 23:25:52.767831 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.767889 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Apr 16 23:25:52.767946 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Apr 16 23:25:52.768005 kernel: pci 0000:00:02.6: enabling Extended Tags
Apr 16 23:25:52.768068 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.768126 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Apr 16 23:25:52.768182 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Apr 16 23:25:52.768239 kernel: pci 0000:00:02.7: enabling Extended Tags
Apr 16 23:25:52.768301 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.768359 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Apr 16 23:25:52.768418 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Apr 16 23:25:52.768474 kernel: pci 0000:00:03.0: enabling Extended Tags
Apr 16 23:25:52.768537 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.768616 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Apr 16 23:25:52.768676 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Apr 16 23:25:52.768751 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Apr 16 23:25:52.768815 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Apr 16 23:25:52.768872 kernel: pci 0000:00:03.1: enabling Extended Tags
Apr 16 23:25:52.768941 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.768999 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Apr 16 23:25:52.769056 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Apr 16 23:25:52.769112 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Apr 16 23:25:52.769169 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Apr 16 23:25:52.769226 kernel: pci 0000:00:03.2: enabling Extended Tags
Apr 16 23:25:52.769288 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.769348 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Apr 16 23:25:52.769404 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Apr 16 23:25:52.769463 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Apr 16 23:25:52.769521 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Apr 16 23:25:52.769578 kernel: pci 0000:00:03.3: enabling Extended Tags
Apr 16 23:25:52.769643 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.769703 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Apr 16 23:25:52.769789 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Apr 16 23:25:52.769851 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Apr 16 23:25:52.769908 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Apr 16 23:25:52.769965 kernel: pci 0000:00:03.4: enabling Extended Tags
Apr 16 23:25:52.770033 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.770091 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Apr 16 23:25:52.770151 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Apr 16 23:25:52.770207 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Apr 16 23:25:52.770264 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Apr 16 23:25:52.770320 kernel: pci 0000:00:03.5: enabling Extended Tags
Apr 16 23:25:52.770384 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.770442 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Apr 16 23:25:52.770501 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Apr 16 23:25:52.770561 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Apr 16 23:25:52.770620 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Apr 16 23:25:52.770686 kernel: pci 0000:00:03.6: enabling Extended Tags
Apr 16 23:25:52.770775 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.770846 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Apr 16 23:25:52.770912 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Apr 16 23:25:52.770980 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Apr 16 23:25:52.771041 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Apr 16 23:25:52.771099 kernel: pci 0000:00:03.7: enabling Extended Tags
Apr 16 23:25:52.771187 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.771249 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Apr 16 23:25:52.771309 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Apr 16 23:25:52.771366 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Apr 16 23:25:52.771423 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Apr 16 23:25:52.771479 kernel: pci 0000:00:04.0: enabling Extended Tags
Apr 16 23:25:52.771542 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.771600 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Apr 16 23:25:52.771656 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Apr 16 23:25:52.771714 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Apr 16 23:25:52.771784 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Apr 16 23:25:52.771843 kernel: pci 0000:00:04.1: enabling Extended Tags
Apr 16 23:25:52.771907 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.771964 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Apr 16 23:25:52.772020 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Apr 16 23:25:52.772077 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Apr 16 23:25:52.772138 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Apr 16 23:25:52.772195 kernel: pci 0000:00:04.2: enabling Extended Tags
Apr 16 23:25:52.772260 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.772318 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Apr 16 23:25:52.772375 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Apr 16 23:25:52.772431 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Apr 16 23:25:52.772488 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Apr 16 23:25:52.772553 kernel: pci 0000:00:04.3: enabling Extended Tags
Apr 16 23:25:52.772638 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.772698 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Apr 16 23:25:52.772774 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Apr 16 23:25:52.772836 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Apr 16 23:25:52.772894 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Apr 16 23:25:52.772966 kernel: pci 0000:00:04.4: enabling Extended Tags
Apr 16 23:25:52.773032 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.773091 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Apr 16 23:25:52.773150 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Apr 16 23:25:52.773207 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Apr 16 23:25:52.773263 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Apr 16 23:25:52.773320 kernel: pci 0000:00:04.5: enabling Extended Tags
Apr 16 23:25:52.773390 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.773447 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Apr 16 23:25:52.773504 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Apr 16 23:25:52.773561 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Apr 16 23:25:52.773618 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Apr 16 23:25:52.773675 kernel: pci 0000:00:04.6: enabling Extended Tags
Apr 16 23:25:52.773754 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.773821 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Apr 16 23:25:52.773879 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Apr 16 23:25:52.773944 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Apr 16 23:25:52.774016 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Apr 16 23:25:52.774099 kernel: pci 0000:00:04.7: enabling Extended Tags
Apr 16 23:25:52.774167 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Apr 16 23:25:52.774229 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Apr 16 23:25:52.774286 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Apr 16 23:25:52.774343 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Apr 16 23:25:52.774399 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Apr 16 23:25:52.774456 kernel: pci 0000:00:05.0: enabling Extended Tags
Apr 16 23:25:52.774526 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Apr 16 23:25:52.774593 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Apr 16 23:25:52.774655 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 16 23:25:52.774714 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Apr 16 23:25:52.774803 kernel: pci 0000:01:00.0: enabling Extended Tags
Apr 16 23:25:52.774875 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Apr 16 23:25:52.774936 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Apr 16 23:25:52.774995 kernel: pci 0000:02:00.0: enabling Extended Tags
Apr 16 23:25:52.775062 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Apr 16 23:25:52.775125 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Apr 16 23:25:52.775185 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Apr 16 23:25:52.775245 kernel: pci 0000:03:00.0: enabling Extended Tags
Apr 16 23:25:52.775311 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Apr 16 23:25:52.775371 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Apr 16 23:25:52.775431 kernel: pci 0000:04:00.0: enabling Extended Tags
Apr 16 23:25:52.775498 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Apr 16 23:25:52.775560 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Apr 16 23:25:52.775620 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Apr 16 23:25:52.775679 kernel: pci 0000:05:00.0: enabling Extended Tags
Apr 16 23:25:52.775770 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Apr 16 23:25:52.775848 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Apr 16 23:25:52.775911 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 16 23:25:52.775970 kernel: pci 0000:06:00.0: 
enabling Extended Tags Apr 16 23:25:52.776035 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Apr 16 23:25:52.776095 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Apr 16 23:25:52.776154 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Apr 16 23:25:52.776214 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Apr 16 23:25:52.776273 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Apr 16 23:25:52.776332 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Apr 16 23:25:52.776391 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Apr 16 23:25:52.776452 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Apr 16 23:25:52.776510 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Apr 16 23:25:52.776583 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Apr 16 23:25:52.776649 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Apr 16 23:25:52.776708 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Apr 16 23:25:52.776790 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Apr 16 23:25:52.776853 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Apr 16 23:25:52.776914 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Apr 16 
23:25:52.776994 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Apr 16 23:25:52.777054 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Apr 16 23:25:52.777112 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Apr 16 23:25:52.777173 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Apr 16 23:25:52.777230 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Apr 16 23:25:52.777288 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Apr 16 23:25:52.777352 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Apr 16 23:25:52.777410 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Apr 16 23:25:52.777469 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Apr 16 23:25:52.777531 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Apr 16 23:25:52.777590 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Apr 16 23:25:52.777647 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Apr 16 23:25:52.777708 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Apr 16 23:25:52.777790 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Apr 16 23:25:52.777855 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Apr 16 23:25:52.777918 kernel: pci 0000:00:02.2: 
bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Apr 16 23:25:52.777977 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Apr 16 23:25:52.778040 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Apr 16 23:25:52.778101 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Apr 16 23:25:52.778161 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Apr 16 23:25:52.778234 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Apr 16 23:25:52.778297 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Apr 16 23:25:52.778358 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Apr 16 23:25:52.778417 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Apr 16 23:25:52.778479 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Apr 16 23:25:52.778538 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Apr 16 23:25:52.778597 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Apr 16 23:25:52.778659 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Apr 16 23:25:52.778717 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Apr 16 23:25:52.778785 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Apr 16 23:25:52.778848 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] 
add_size 1000 Apr 16 23:25:52.778906 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Apr 16 23:25:52.778963 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Apr 16 23:25:52.779028 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Apr 16 23:25:52.779087 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Apr 16 23:25:52.779146 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Apr 16 23:25:52.779223 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Apr 16 23:25:52.779282 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Apr 16 23:25:52.779340 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Apr 16 23:25:52.779402 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Apr 16 23:25:52.779462 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Apr 16 23:25:52.779519 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Apr 16 23:25:52.779581 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Apr 16 23:25:52.779639 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Apr 16 23:25:52.779696 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Apr 16 23:25:52.779773 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Apr 16 23:25:52.779833 kernel: 
pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Apr 16 23:25:52.779891 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Apr 16 23:25:52.779958 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Apr 16 23:25:52.780019 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Apr 16 23:25:52.780091 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Apr 16 23:25:52.780157 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Apr 16 23:25:52.780232 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Apr 16 23:25:52.780291 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Apr 16 23:25:52.780356 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Apr 16 23:25:52.780418 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Apr 16 23:25:52.780475 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Apr 16 23:25:52.780537 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Apr 16 23:25:52.780613 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Apr 16 23:25:52.780673 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Apr 16 23:25:52.780751 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Apr 16 23:25:52.780827 kernel: pci 0000:00:04.1: bridge window [mem 
0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Apr 16 23:25:52.780887 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Apr 16 23:25:52.780953 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Apr 16 23:25:52.781011 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Apr 16 23:25:52.781069 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Apr 16 23:25:52.781131 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Apr 16 23:25:52.781189 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Apr 16 23:25:52.781247 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Apr 16 23:25:52.781315 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Apr 16 23:25:52.781376 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Apr 16 23:25:52.781435 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Apr 16 23:25:52.781496 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Apr 16 23:25:52.781555 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Apr 16 23:25:52.781613 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Apr 16 23:25:52.781676 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Apr 16 23:25:52.781749 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 
1f] add_size 200000 add_align 100000 Apr 16 23:25:52.781815 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Apr 16 23:25:52.781881 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Apr 16 23:25:52.781941 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Apr 16 23:25:52.782001 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Apr 16 23:25:52.782067 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Apr 16 23:25:52.782127 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Apr 16 23:25:52.782186 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Apr 16 23:25:52.782264 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Apr 16 23:25:52.782326 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Apr 16 23:25:52.782390 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Apr 16 23:25:52.782485 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Apr 16 23:25:52.782553 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Apr 16 23:25:52.782612 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Apr 16 23:25:52.782674 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Apr 16 23:25:52.782748 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Apr 16 23:25:52.782834 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Apr 16 23:25:52.782898 kernel: pci 0000:00:01.4: bridge window [mem 
0x8000800000-0x80009fffff 64bit pref]: assigned Apr 16 23:25:52.782958 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Apr 16 23:25:52.783016 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Apr 16 23:25:52.783075 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Apr 16 23:25:52.783138 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Apr 16 23:25:52.783202 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Apr 16 23:25:52.783263 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Apr 16 23:25:52.783325 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Apr 16 23:25:52.783386 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Apr 16 23:25:52.783448 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Apr 16 23:25:52.783510 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Apr 16 23:25:52.783570 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Apr 16 23:25:52.783629 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Apr 16 23:25:52.783690 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Apr 16 23:25:52.783763 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Apr 16 23:25:52.783844 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Apr 16 23:25:52.783903 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Apr 16 23:25:52.783965 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Apr 16 23:25:52.784025 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: 
assigned Apr 16 23:25:52.784086 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Apr 16 23:25:52.784144 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Apr 16 23:25:52.784206 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Apr 16 23:25:52.784265 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Apr 16 23:25:52.784328 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Apr 16 23:25:52.784386 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Apr 16 23:25:52.784447 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Apr 16 23:25:52.784504 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Apr 16 23:25:52.784564 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Apr 16 23:25:52.784646 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Apr 16 23:25:52.784709 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Apr 16 23:25:52.784799 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Apr 16 23:25:52.784870 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Apr 16 23:25:52.784946 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Apr 16 23:25:52.785007 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Apr 16 23:25:52.785068 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Apr 16 23:25:52.785129 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Apr 16 23:25:52.785187 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Apr 16 23:25:52.785246 kernel: pci 
0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Apr 16 23:25:52.785303 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Apr 16 23:25:52.785364 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Apr 16 23:25:52.785422 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Apr 16 23:25:52.785484 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Apr 16 23:25:52.785543 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Apr 16 23:25:52.785606 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Apr 16 23:25:52.785666 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Apr 16 23:25:52.785749 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Apr 16 23:25:52.785832 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Apr 16 23:25:52.785900 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Apr 16 23:25:52.785959 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Apr 16 23:25:52.786018 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Apr 16 23:25:52.786076 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Apr 16 23:25:52.786137 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Apr 16 23:25:52.786194 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Apr 16 23:25:52.786254 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Apr 16 23:25:52.786312 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Apr 16 23:25:52.786386 kernel: pci 0000:00:05.0: bridge window [mem 
0x14000000-0x141fffff]: assigned Apr 16 23:25:52.786444 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Apr 16 23:25:52.786503 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Apr 16 23:25:52.786561 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Apr 16 23:25:52.786623 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Apr 16 23:25:52.786680 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Apr 16 23:25:52.786759 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Apr 16 23:25:52.786819 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Apr 16 23:25:52.786878 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Apr 16 23:25:52.786936 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Apr 16 23:25:52.786996 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Apr 16 23:25:52.787053 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Apr 16 23:25:52.787115 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Apr 16 23:25:52.787174 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Apr 16 23:25:52.787240 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Apr 16 23:25:52.787298 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Apr 16 23:25:52.787357 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Apr 16 23:25:52.787415 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Apr 16 23:25:52.787474 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Apr 16 23:25:52.787531 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Apr 16 23:25:52.787593 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Apr 16 23:25:52.787650 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned 
Apr 16 23:25:52.787709 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Apr 16 23:25:52.787790 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Apr 16 23:25:52.787852 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Apr 16 23:25:52.787910 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Apr 16 23:25:52.787968 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Apr 16 23:25:52.788026 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Apr 16 23:25:52.788089 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Apr 16 23:25:52.788147 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Apr 16 23:25:52.788206 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Apr 16 23:25:52.788263 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Apr 16 23:25:52.788323 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Apr 16 23:25:52.788380 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.788437 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.788496 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Apr 16 23:25:52.788557 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.788633 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.788696 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Apr 16 23:25:52.788779 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.788841 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.788901 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Apr 16 23:25:52.788958 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; 
no space Apr 16 23:25:52.789020 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.789078 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Apr 16 23:25:52.789135 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.789192 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.789252 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Apr 16 23:25:52.789309 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.789369 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.789428 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Apr 16 23:25:52.789485 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.789542 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.789601 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Apr 16 23:25:52.789658 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.789717 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.789791 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Apr 16 23:25:52.789851 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.789909 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.789968 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Apr 16 23:25:52.790026 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.790086 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.790145 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Apr 16 23:25:52.790203 
kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.790260 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.790319 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Apr 16 23:25:52.790396 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.790459 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.790518 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Apr 16 23:25:52.790576 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.790634 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.790693 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Apr 16 23:25:52.790766 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.790829 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.790889 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Apr 16 23:25:52.790946 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.791003 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.791062 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Apr 16 23:25:52.791120 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.791179 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.791238 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Apr 16 23:25:52.791295 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.791352 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.791413 kernel: pci 0000:00:05.0: 
BAR 0 [mem 0x14220000-0x14220fff]: assigned Apr 16 23:25:52.791470 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.791529 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.791592 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Apr 16 23:25:52.791649 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Apr 16 23:25:52.791708 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Apr 16 23:25:52.791791 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Apr 16 23:25:52.791853 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Apr 16 23:25:52.791914 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Apr 16 23:25:52.791985 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Apr 16 23:25:52.792048 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Apr 16 23:25:52.792107 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Apr 16 23:25:52.792168 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Apr 16 23:25:52.792229 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Apr 16 23:25:52.792289 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Apr 16 23:25:52.792352 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Apr 16 23:25:52.792410 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Apr 16 23:25:52.792469 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Apr 16 23:25:52.792530 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.792602 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.792664 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.792722 kernel: pci 0000:00:03.0: bridge window [io 
size 0x1000]: failed to assign Apr 16 23:25:52.792806 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.792866 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.792927 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.792989 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793050 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793108 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793169 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793230 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793292 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793351 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793428 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793512 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793575 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793635 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793694 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793782 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793848 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.793907 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.793970 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794028 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794087 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794146 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794205 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794263 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794326 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794383 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794451 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794512 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794575 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794635 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794698 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Apr 16 23:25:52.794774 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Apr 16 23:25:52.794843 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Apr 16 23:25:52.794905 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Apr 16 23:25:52.794965 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Apr 16 23:25:52.795023 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Apr 16 23:25:52.795097 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Apr 16 23:25:52.795155 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Apr 16 23:25:52.795221 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Apr 16 23:25:52.795279 kernel: pci 0000:00:01.1: PCI bridge 
to [bus 02] Apr 16 23:25:52.795339 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Apr 16 23:25:52.795398 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Apr 16 23:25:52.795463 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Apr 16 23:25:52.795526 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Apr 16 23:25:52.795589 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Apr 16 23:25:52.795647 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Apr 16 23:25:52.795705 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Apr 16 23:25:52.795793 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Apr 16 23:25:52.795854 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Apr 16 23:25:52.795914 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Apr 16 23:25:52.795973 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Apr 16 23:25:52.796038 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Apr 16 23:25:52.796102 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Apr 16 23:25:52.796159 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Apr 16 23:25:52.796218 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Apr 16 23:25:52.796276 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Apr 16 23:25:52.796340 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Apr 16 23:25:52.796401 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Apr 16 23:25:52.796466 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Apr 16 23:25:52.796529 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Apr 16 23:25:52.796616 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Apr 
16 23:25:52.796680 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Apr 16 23:25:52.796761 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Apr 16 23:25:52.796831 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Apr 16 23:25:52.796890 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Apr 16 23:25:52.796949 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Apr 16 23:25:52.797020 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Apr 16 23:25:52.797089 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Apr 16 23:25:52.797152 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Apr 16 23:25:52.797212 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Apr 16 23:25:52.797272 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Apr 16 23:25:52.797330 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Apr 16 23:25:52.797390 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Apr 16 23:25:52.797454 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Apr 16 23:25:52.797514 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Apr 16 23:25:52.797576 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Apr 16 23:25:52.797638 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Apr 16 23:25:52.797699 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Apr 16 23:25:52.797771 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Apr 16 23:25:52.797851 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Apr 16 23:25:52.797911 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Apr 16 23:25:52.797970 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Apr 16 23:25:52.798032 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Apr 16 23:25:52.798090 kernel: pci 0000:00:02.5: 
bridge window [mem 0x11a00000-0x11bfffff] Apr 16 23:25:52.798147 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Apr 16 23:25:52.798207 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Apr 16 23:25:52.798268 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Apr 16 23:25:52.798327 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Apr 16 23:25:52.798389 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Apr 16 23:25:52.798449 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Apr 16 23:25:52.798508 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Apr 16 23:25:52.798584 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Apr 16 23:25:52.798644 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Apr 16 23:25:52.798706 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Apr 16 23:25:52.798796 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Apr 16 23:25:52.798856 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff] Apr 16 23:25:52.798913 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Apr 16 23:25:52.798973 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Apr 16 23:25:52.799031 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Apr 16 23:25:52.799089 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Apr 16 23:25:52.799147 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Apr 16 23:25:52.799211 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Apr 16 23:25:52.799270 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Apr 16 23:25:52.799333 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Apr 16 23:25:52.799390 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Apr 16 23:25:52.799450 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] 
Apr 16 23:25:52.799509 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Apr 16 23:25:52.799567 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Apr 16 23:25:52.799626 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Apr 16 23:25:52.799687 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Apr 16 23:25:52.799765 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Apr 16 23:25:52.799826 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Apr 16 23:25:52.799886 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Apr 16 23:25:52.799947 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Apr 16 23:25:52.800005 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Apr 16 23:25:52.800064 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Apr 16 23:25:52.800122 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Apr 16 23:25:52.800183 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Apr 16 23:25:52.800245 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Apr 16 23:25:52.800303 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Apr 16 23:25:52.800364 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Apr 16 23:25:52.800431 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Apr 16 23:25:52.800491 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Apr 16 23:25:52.800551 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Apr 16 23:25:52.800623 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Apr 16 23:25:52.800687 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Apr 16 23:25:52.800762 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Apr 16 23:25:52.800822 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Apr 16 23:25:52.800882 kernel: pci 0000:00:04.1: bridge window [mem 
0x8003200000-0x80033fffff 64bit pref] Apr 16 23:25:52.800944 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Apr 16 23:25:52.801002 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Apr 16 23:25:52.801060 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Apr 16 23:25:52.801118 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Apr 16 23:25:52.801179 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Apr 16 23:25:52.801237 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Apr 16 23:25:52.801298 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Apr 16 23:25:52.801356 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Apr 16 23:25:52.801416 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Apr 16 23:25:52.801474 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Apr 16 23:25:52.801531 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Apr 16 23:25:52.801588 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Apr 16 23:25:52.801647 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Apr 16 23:25:52.801705 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Apr 16 23:25:52.801774 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Apr 16 23:25:52.801833 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Apr 16 23:25:52.801894 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Apr 16 23:25:52.801953 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Apr 16 23:25:52.802010 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Apr 16 23:25:52.802068 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Apr 16 23:25:52.802128 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Apr 16 23:25:52.802186 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Apr 16 23:25:52.802246 kernel: pci 0000:00:04.7: 
bridge window [mem 0x13e00000-0x13ffffff] Apr 16 23:25:52.802304 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Apr 16 23:25:52.802364 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Apr 16 23:25:52.802423 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Apr 16 23:25:52.802480 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Apr 16 23:25:52.802539 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Apr 16 23:25:52.802602 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Apr 16 23:25:52.802654 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Apr 16 23:25:52.802708 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Apr 16 23:25:52.802798 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Apr 16 23:25:52.802855 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Apr 16 23:25:52.802917 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Apr 16 23:25:52.802972 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Apr 16 23:25:52.803033 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Apr 16 23:25:52.803090 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Apr 16 23:25:52.803151 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Apr 16 23:25:52.803205 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Apr 16 23:25:52.803264 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Apr 16 23:25:52.803318 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Apr 16 23:25:52.803380 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Apr 16 23:25:52.803437 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Apr 16 23:25:52.803499 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] 
Apr 16 23:25:52.803553 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Apr 16 23:25:52.803612 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Apr 16 23:25:52.803671 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Apr 16 23:25:52.803754 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Apr 16 23:25:52.803818 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Apr 16 23:25:52.803878 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Apr 16 23:25:52.803932 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Apr 16 23:25:52.803992 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Apr 16 23:25:52.804046 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Apr 16 23:25:52.804106 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Apr 16 23:25:52.804159 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Apr 16 23:25:52.804222 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Apr 16 23:25:52.804275 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Apr 16 23:25:52.804335 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Apr 16 23:25:52.804390 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Apr 16 23:25:52.804450 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Apr 16 23:25:52.804505 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Apr 16 23:25:52.804578 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Apr 16 23:25:52.804638 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Apr 16 23:25:52.804699 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Apr 16 23:25:52.804775 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Apr 16 
23:25:52.804847 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Apr 16 23:25:52.804909 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Apr 16 23:25:52.804971 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Apr 16 23:25:52.805028 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Apr 16 23:25:52.805083 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Apr 16 23:25:52.805142 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Apr 16 23:25:52.805196 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Apr 16 23:25:52.805251 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Apr 16 23:25:52.805313 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Apr 16 23:25:52.805368 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Apr 16 23:25:52.805422 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Apr 16 23:25:52.805482 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Apr 16 23:25:52.805535 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Apr 16 23:25:52.805588 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Apr 16 23:25:52.805649 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Apr 16 23:25:52.805704 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Apr 16 23:25:52.805774 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Apr 16 23:25:52.805839 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Apr 16 23:25:52.805895 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Apr 16 23:25:52.805950 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Apr 16 23:25:52.806009 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Apr 16 23:25:52.806068 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Apr 16 23:25:52.806122 kernel: pci_bus 0000:19: 
resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Apr 16 23:25:52.806184 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Apr 16 23:25:52.806238 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Apr 16 23:25:52.806291 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Apr 16 23:25:52.806355 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Apr 16 23:25:52.806409 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Apr 16 23:25:52.806465 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Apr 16 23:25:52.806524 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Apr 16 23:25:52.806578 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Apr 16 23:25:52.806633 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Apr 16 23:25:52.806693 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Apr 16 23:25:52.806764 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Apr 16 23:25:52.806821 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Apr 16 23:25:52.806883 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Apr 16 23:25:52.806939 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Apr 16 23:25:52.806993 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Apr 16 23:25:52.807053 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Apr 16 23:25:52.807108 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Apr 16 23:25:52.807161 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Apr 16 23:25:52.807223 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Apr 16 23:25:52.807279 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Apr 16 23:25:52.807333 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Apr 16 23:25:52.807392 kernel: pci_bus 0000:21: resource 0 [io 
0x1000-0x1fff] Apr 16 23:25:52.807445 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Apr 16 23:25:52.807498 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Apr 16 23:25:52.807508 kernel: iommu: Default domain type: Translated Apr 16 23:25:52.807515 kernel: iommu: DMA domain TLB invalidation policy: strict mode Apr 16 23:25:52.807525 kernel: efivars: Registered efivars operations Apr 16 23:25:52.807533 kernel: vgaarb: loaded Apr 16 23:25:52.807540 kernel: clocksource: Switched to clocksource arch_sys_counter Apr 16 23:25:52.807548 kernel: VFS: Disk quotas dquot_6.6.0 Apr 16 23:25:52.807555 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Apr 16 23:25:52.807563 kernel: pnp: PnP ACPI init Apr 16 23:25:52.807633 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Apr 16 23:25:52.807643 kernel: pnp: PnP ACPI: found 1 devices Apr 16 23:25:52.807653 kernel: NET: Registered PF_INET protocol family Apr 16 23:25:52.807660 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Apr 16 23:25:52.807668 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Apr 16 23:25:52.807676 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Apr 16 23:25:52.807683 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Apr 16 23:25:52.807691 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Apr 16 23:25:52.807698 kernel: TCP: Hash tables configured (established 131072 bind 65536) Apr 16 23:25:52.807706 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Apr 16 23:25:52.807714 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Apr 16 23:25:52.807723 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Apr 16 23:25:52.807807 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Apr 16 23:25:52.807819 
kernel: PCI: CLS 0 bytes, default 64 Apr 16 23:25:52.807826 kernel: kvm [1]: HYP mode not available Apr 16 23:25:52.807834 kernel: Initialise system trusted keyrings Apr 16 23:25:52.807842 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Apr 16 23:25:52.807849 kernel: Key type asymmetric registered Apr 16 23:25:52.807856 kernel: Asymmetric key parser 'x509' registered Apr 16 23:25:52.807864 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Apr 16 23:25:52.807874 kernel: io scheduler mq-deadline registered Apr 16 23:25:52.807881 kernel: io scheduler kyber registered Apr 16 23:25:52.807889 kernel: io scheduler bfq registered Apr 16 23:25:52.807896 kernel: ACPI: \_SB_.L001: Enabled at IRQ 36 Apr 16 23:25:52.807958 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Apr 16 23:25:52.808018 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Apr 16 23:25:52.808076 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.808137 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Apr 16 23:25:52.808197 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Apr 16 23:25:52.808256 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.808318 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Apr 16 23:25:52.808377 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Apr 16 23:25:52.808436 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.808499 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Apr 16 23:25:52.808557 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Apr 16 23:25:52.808658 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.808746 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Apr 16 23:25:52.808817 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Apr 16 23:25:52.808877 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.808937 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Apr 16 23:25:52.808996 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Apr 16 23:25:52.809056 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.809116 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Apr 16 23:25:52.809173 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Apr 16 23:25:52.809235 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.809294 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Apr 16 23:25:52.809353 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Apr 16 23:25:52.809411 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.809422 kernel: ACPI: \_SB_.L002: Enabled at IRQ 37 Apr 16 23:25:52.809481 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Apr 16 23:25:52.809540 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Apr 16 23:25:52.809597 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.809659 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Apr 16 23:25:52.809717 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Apr 16 23:25:52.809789 kernel: pcieport 0000:00:02.1: 
pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.809853 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Apr 16 23:25:52.809912 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Apr 16 23:25:52.809969 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.810029 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Apr 16 23:25:52.810090 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Apr 16 23:25:52.810148 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.810208 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Apr 16 23:25:52.810266 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Apr 16 23:25:52.810322 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.810382 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Apr 16 23:25:52.810440 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Apr 16 23:25:52.810498 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.810560 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Apr 16 23:25:52.810618 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Apr 16 23:25:52.810675 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.810753 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Apr 16 23:25:52.810815 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Apr 16 23:25:52.810872 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 
AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.810882 kernel: ACPI: \_SB_.L003: Enabled at IRQ 38 Apr 16 23:25:52.810940 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Apr 16 23:25:52.811001 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Apr 16 23:25:52.811057 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.811117 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Apr 16 23:25:52.811175 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Apr 16 23:25:52.811234 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.811294 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Apr 16 23:25:52.811351 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Apr 16 23:25:52.811408 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.811469 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Apr 16 23:25:52.811527 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Apr 16 23:25:52.811584 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.811644 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Apr 16 23:25:52.811702 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Apr 16 23:25:52.811774 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.811837 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Apr 16 23:25:52.811898 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Apr 16 23:25:52.811955 
kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.812017 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Apr 16 23:25:52.812076 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Apr 16 23:25:52.812137 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.812197 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Apr 16 23:25:52.812257 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Apr 16 23:25:52.812314 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.812326 kernel: ACPI: \_SB_.L000: Enabled at IRQ 35 Apr 16 23:25:52.812385 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Apr 16 23:25:52.812443 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Apr 16 23:25:52.812501 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.812561 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Apr 16 23:25:52.812640 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Apr 16 23:25:52.812699 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.812782 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Apr 16 23:25:52.812850 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Apr 16 23:25:52.812908 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.812969 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Apr 16 23:25:52.813026 kernel: pcieport 0000:00:04.3: 
AER: enabled with IRQ 77 Apr 16 23:25:52.813084 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.813145 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Apr 16 23:25:52.813203 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Apr 16 23:25:52.813260 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.813322 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Apr 16 23:25:52.813381 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Apr 16 23:25:52.813442 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.813505 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Apr 16 23:25:52.813565 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Apr 16 23:25:52.813637 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.813696 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Apr 16 23:25:52.813772 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Apr 16 23:25:52.813836 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.813897 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Apr 16 23:25:52.813955 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Apr 16 23:25:52.814013 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 16 23:25:52.814022 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Apr 16 23:25:52.814030 kernel: ACPI: 
button: Power Button [PWRB] Apr 16 23:25:52.814093 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Apr 16 23:25:52.814162 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Apr 16 23:25:52.814172 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Apr 16 23:25:52.814179 kernel: thunder_xcv, ver 1.0 Apr 16 23:25:52.814187 kernel: thunder_bgx, ver 1.0 Apr 16 23:25:52.814194 kernel: nicpf, ver 1.0 Apr 16 23:25:52.814201 kernel: nicvf, ver 1.0 Apr 16 23:25:52.814270 kernel: rtc-efi rtc-efi.0: registered as rtc0 Apr 16 23:25:52.814326 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-04-16T23:25:52 UTC (1776381952) Apr 16 23:25:52.814348 kernel: hid: raw HID events driver (C) Jiri Kosina Apr 16 23:25:52.814358 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Apr 16 23:25:52.814365 kernel: watchdog: NMI not fully supported Apr 16 23:25:52.814373 kernel: watchdog: Hard watchdog permanently disabled Apr 16 23:25:52.814381 kernel: NET: Registered PF_INET6 protocol family Apr 16 23:25:52.814389 kernel: Segment Routing with IPv6 Apr 16 23:25:52.814396 kernel: In-situ OAM (IOAM) with IPv6 Apr 16 23:25:52.814404 kernel: NET: Registered PF_PACKET protocol family Apr 16 23:25:52.814412 kernel: Key type dns_resolver registered Apr 16 23:25:52.814419 kernel: registered taskstats version 1 Apr 16 23:25:52.814428 kernel: Loading compiled-in X.509 certificates Apr 16 23:25:52.814436 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.81-flatcar: 4acad53138393591155ecb80320b4c1550e344f8' Apr 16 23:25:52.814443 kernel: Demotion targets for Node 0: null Apr 16 23:25:52.814451 kernel: Key type .fscrypt registered Apr 16 23:25:52.814460 kernel: Key type fscrypt-provisioning registered Apr 16 23:25:52.814467 kernel: ima: No TPM chip found, activating TPM-bypass! 
Apr 16 23:25:52.814475 kernel: ima: Allocated hash algorithm: sha1 Apr 16 23:25:52.814483 kernel: ima: No architecture policies found Apr 16 23:25:52.814492 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Apr 16 23:25:52.814499 kernel: clk: Disabling unused clocks Apr 16 23:25:52.814507 kernel: PM: genpd: Disabling unused power domains Apr 16 23:25:52.814515 kernel: Warning: unable to open an initial console. Apr 16 23:25:52.814523 kernel: Freeing unused kernel memory: 39552K Apr 16 23:25:52.814530 kernel: Run /init as init process Apr 16 23:25:52.814538 kernel: with arguments: Apr 16 23:25:52.814546 kernel: /init Apr 16 23:25:52.814553 kernel: with environment: Apr 16 23:25:52.814562 kernel: HOME=/ Apr 16 23:25:52.814569 kernel: TERM=linux Apr 16 23:25:52.814578 systemd[1]: Successfully made /usr/ read-only. Apr 16 23:25:52.814589 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Apr 16 23:25:52.814597 systemd[1]: Detected virtualization kvm. Apr 16 23:25:52.814605 systemd[1]: Detected architecture arm64. Apr 16 23:25:52.814613 systemd[1]: Running in initrd. Apr 16 23:25:52.814622 systemd[1]: No hostname configured, using default hostname. Apr 16 23:25:52.814631 systemd[1]: Hostname set to . Apr 16 23:25:52.814638 systemd[1]: Initializing machine ID from VM UUID. Apr 16 23:25:52.814652 systemd[1]: Queued start job for default target initrd.target. Apr 16 23:25:52.814660 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 16 23:25:52.814668 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Apr 16 23:25:52.814677 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Apr 16 23:25:52.814686 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 16 23:25:52.814695 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Apr 16 23:25:52.814705 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Apr 16 23:25:52.814714 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Apr 16 23:25:52.814722 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Apr 16 23:25:52.814740 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 16 23:25:52.814749 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 16 23:25:52.814757 systemd[1]: Reached target paths.target - Path Units. Apr 16 23:25:52.814768 systemd[1]: Reached target slices.target - Slice Units. Apr 16 23:25:52.814776 systemd[1]: Reached target swap.target - Swaps. Apr 16 23:25:52.814784 systemd[1]: Reached target timers.target - Timer Units. Apr 16 23:25:52.814793 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Apr 16 23:25:52.814801 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 16 23:25:52.814810 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Apr 16 23:25:52.814818 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Apr 16 23:25:52.814827 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 16 23:25:52.814835 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 16 23:25:52.814845 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Apr 16 23:25:52.814853 systemd[1]: Reached target sockets.target - Socket Units. Apr 16 23:25:52.814861 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Apr 16 23:25:52.814870 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 16 23:25:52.814878 systemd[1]: Finished network-cleanup.service - Network Cleanup. Apr 16 23:25:52.814887 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Apr 16 23:25:52.814895 systemd[1]: Starting systemd-fsck-usr.service... Apr 16 23:25:52.814903 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 16 23:25:52.814912 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 16 23:25:52.814920 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 16 23:25:52.814928 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Apr 16 23:25:52.814937 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Apr 16 23:25:52.814945 systemd[1]: Finished systemd-fsck-usr.service. Apr 16 23:25:52.814955 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 16 23:25:52.814988 systemd-journald[312]: Collecting audit messages is disabled. Apr 16 23:25:52.815008 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Apr 16 23:25:52.815018 kernel: Bridge firewalling registered Apr 16 23:25:52.815027 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 16 23:25:52.815035 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Apr 16 23:25:52.815043 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Apr 16 23:25:52.815052 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Apr 16 23:25:52.815060 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 16 23:25:52.815069 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 16 23:25:52.815077 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 16 23:25:52.815086 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 16 23:25:52.815094 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 16 23:25:52.815103 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Apr 16 23:25:52.815112 systemd-journald[312]: Journal started Apr 16 23:25:52.815130 systemd-journald[312]: Runtime Journal (/run/log/journal/4062eef59cbd4826a6af10f9e76e98d8) is 8M, max 319.5M, 311.5M free. Apr 16 23:25:52.760096 systemd-modules-load[313]: Inserted module 'overlay' Apr 16 23:25:52.817426 systemd[1]: Started systemd-journald.service - Journal Service. Apr 16 23:25:52.774521 systemd-modules-load[313]: Inserted module 'br_netfilter' Apr 16 23:25:52.820642 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Apr 16 23:25:52.829111 systemd-tmpfiles[346]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Apr 16 23:25:52.832882 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Apr 16 23:25:52.834817 dracut-cmdline[344]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=c4961845f9869114226296d88644496bf9e4629823927a5e8ae22de79f1c7b59 Apr 16 23:25:52.835336 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Apr 16 23:25:52.871237 systemd-resolved[372]: Positive Trust Anchors: Apr 16 23:25:52.871258 systemd-resolved[372]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Apr 16 23:25:52.871289 systemd-resolved[372]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Apr 16 23:25:52.876883 systemd-resolved[372]: Defaulting to hostname 'linux'. Apr 16 23:25:52.878557 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Apr 16 23:25:52.879657 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Apr 16 23:25:52.908768 kernel: SCSI subsystem initialized Apr 16 23:25:52.912755 kernel: Loading iSCSI transport class v2.0-870. 
Apr 16 23:25:52.920758 kernel: iscsi: registered transport (tcp) Apr 16 23:25:52.933780 kernel: iscsi: registered transport (qla4xxx) Apr 16 23:25:52.933801 kernel: QLogic iSCSI HBA Driver Apr 16 23:25:52.950991 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Apr 16 23:25:52.975796 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Apr 16 23:25:52.979222 systemd[1]: Reached target network-pre.target - Preparation for Network. Apr 16 23:25:53.027164 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Apr 16 23:25:53.029368 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Apr 16 23:25:53.090786 kernel: raid6: neonx8 gen() 15773 MB/s Apr 16 23:25:53.107754 kernel: raid6: neonx4 gen() 15826 MB/s Apr 16 23:25:53.124775 kernel: raid6: neonx2 gen() 13202 MB/s Apr 16 23:25:53.141743 kernel: raid6: neonx1 gen() 10560 MB/s Apr 16 23:25:53.158746 kernel: raid6: int64x8 gen() 6912 MB/s Apr 16 23:25:53.175780 kernel: raid6: int64x4 gen() 7365 MB/s Apr 16 23:25:53.192748 kernel: raid6: int64x2 gen() 6111 MB/s Apr 16 23:25:53.209745 kernel: raid6: int64x1 gen() 5065 MB/s Apr 16 23:25:53.209771 kernel: raid6: using algorithm neonx4 gen() 15826 MB/s Apr 16 23:25:53.226795 kernel: raid6: .... xor() 12342 MB/s, rmw enabled Apr 16 23:25:53.226857 kernel: raid6: using neon recovery algorithm Apr 16 23:25:53.232203 kernel: xor: measuring software checksum speed Apr 16 23:25:53.232264 kernel: 8regs : 21618 MB/sec Apr 16 23:25:53.232797 kernel: 32regs : 21687 MB/sec Apr 16 23:25:53.233819 kernel: arm64_neon : 27946 MB/sec Apr 16 23:25:53.233835 kernel: xor: using function: arm64_neon (27946 MB/sec) Apr 16 23:25:53.286754 kernel: Btrfs loaded, zoned=no, fsverity=no Apr 16 23:25:53.293638 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Apr 16 23:25:53.296439 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 16 23:25:53.325016 systemd-udevd[566]: Using default interface naming scheme 'v255'. Apr 16 23:25:53.329108 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 16 23:25:53.331325 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Apr 16 23:25:53.355359 dracut-pre-trigger[574]: rd.md=0: removing MD RAID activation Apr 16 23:25:53.379138 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 16 23:25:53.381406 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 16 23:25:53.458777 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 16 23:25:53.460646 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Apr 16 23:25:53.501763 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Apr 16 23:25:53.504084 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Apr 16 23:25:53.510014 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Apr 16 23:25:53.510103 kernel: GPT:17805311 != 104857599 Apr 16 23:25:53.510154 kernel: GPT:Alternate GPT header not at the end of the disk. Apr 16 23:25:53.510902 kernel: GPT:17805311 != 104857599 Apr 16 23:25:53.510919 kernel: GPT: Use GNU Parted to correct GPT errors. 
Apr 16 23:25:53.511872 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Apr 16 23:25:53.518745 kernel: ACPI: bus type USB registered Apr 16 23:25:53.521810 kernel: usbcore: registered new interface driver usbfs Apr 16 23:25:53.521842 kernel: usbcore: registered new interface driver hub Apr 16 23:25:53.521853 kernel: usbcore: registered new device driver usb Apr 16 23:25:53.544261 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Apr 16 23:25:53.544443 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Apr 16 23:25:53.544526 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Apr 16 23:25:53.547901 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Apr 16 23:25:53.548053 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Apr 16 23:25:53.548135 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Apr 16 23:25:53.549757 kernel: hub 1-0:1.0: USB hub found Apr 16 23:25:53.552358 kernel: hub 1-0:1.0: 4 ports detected Apr 16 23:25:53.552867 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Apr 16 23:25:53.554765 kernel: hub 2-0:1.0: USB hub found Apr 16 23:25:53.556778 kernel: hub 2-0:1.0: 4 ports detected Apr 16 23:25:53.556915 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 16 23:25:53.557022 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 16 23:25:53.559149 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 16 23:25:53.561795 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 16 23:25:53.597301 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Apr 16 23:25:53.598579 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 16 23:25:53.612186 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Apr 16 23:25:53.613509 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 16 23:25:53.622106 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Apr 16 23:25:53.633985 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Apr 16 23:25:53.634995 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Apr 16 23:25:53.637699 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 16 23:25:53.639661 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 16 23:25:53.641508 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 16 23:25:53.643877 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 16 23:25:53.645538 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 16 23:25:53.665067 disk-uuid[664]: Primary Header is updated. Apr 16 23:25:53.665067 disk-uuid[664]: Secondary Entries is updated. Apr 16 23:25:53.665067 disk-uuid[664]: Secondary Header is updated. Apr 16 23:25:53.668784 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Apr 16 23:25:53.672752 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Apr 16 23:25:53.788782 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Apr 16 23:25:53.919632 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Apr 16 23:25:53.919685 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Apr 16 23:25:53.919965 kernel: usbcore: registered new interface driver usbhid Apr 16 23:25:53.920311 kernel: usbhid: USB HID core driver Apr 16 23:25:54.025782 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Apr 16 23:25:54.150768 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Apr 16 23:25:54.202775 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Apr 16 23:25:54.683764 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Apr 16 23:25:54.683828 disk-uuid[670]: The operation has completed successfully. Apr 16 23:25:54.723380 systemd[1]: disk-uuid.service: Deactivated successfully. Apr 16 23:25:54.723480 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 16 23:25:54.753841 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Apr 16 23:25:54.768125 sh[686]: Success Apr 16 23:25:54.781770 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Apr 16 23:25:54.781862 kernel: device-mapper: uevent: version 1.0.3 Apr 16 23:25:54.782852 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Apr 16 23:25:54.789824 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Apr 16 23:25:54.843122 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 16 23:25:54.846631 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 16 23:25:54.859703 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 16 23:25:54.883776 kernel: BTRFS: device fsid 10cedb9e-43f1-4d98-9b55-3b84c3a61868 devid 1 transid 33 /dev/mapper/usr (253:0) scanned by mount (698) Apr 16 23:25:54.888298 kernel: BTRFS info (device dm-0): first mount of filesystem 10cedb9e-43f1-4d98-9b55-3b84c3a61868 Apr 16 23:25:54.888340 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Apr 16 23:25:55.109765 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Apr 16 23:25:55.109803 kernel: BTRFS info (device dm-0 state E): enabling free space tree Apr 16 23:25:55.162057 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 16 23:25:55.162948 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Apr 16 23:25:55.164260 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Apr 16 23:25:55.165063 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Apr 16 23:25:55.167886 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Apr 16 23:25:55.199966 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (729) Apr 16 23:25:55.201748 kernel: BTRFS info (device vda6): first mount of filesystem 29b48a10-1a8e-4627-ab21-f0862573351d Apr 16 23:25:55.201778 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Apr 16 23:25:55.218760 kernel: BTRFS info (device vda6): turning on async discard Apr 16 23:25:55.218802 kernel: BTRFS info (device vda6): enabling free space tree Apr 16 23:25:55.222786 kernel: BTRFS info (device vda6): last unmount of filesystem 29b48a10-1a8e-4627-ab21-f0862573351d Apr 16 23:25:55.223472 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 16 23:25:55.226133 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 16 23:25:55.258416 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 16 23:25:55.261340 systemd[1]: Starting systemd-networkd.service - Network Configuration... Apr 16 23:25:55.297171 systemd-networkd[867]: lo: Link UP Apr 16 23:25:55.297186 systemd-networkd[867]: lo: Gained carrier Apr 16 23:25:55.298135 systemd-networkd[867]: Enumeration completed Apr 16 23:25:55.298262 systemd[1]: Started systemd-networkd.service - Network Configuration. Apr 16 23:25:55.298951 systemd-networkd[867]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 16 23:25:55.298955 systemd-networkd[867]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Apr 16 23:25:55.299750 systemd[1]: Reached target network.target - Network. Apr 16 23:25:55.299796 systemd-networkd[867]: eth0: Link UP Apr 16 23:25:55.299895 systemd-networkd[867]: eth0: Gained carrier Apr 16 23:25:55.299906 systemd-networkd[867]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Apr 16 23:25:55.316830 systemd-networkd[867]: eth0: DHCPv4 address 10.0.3.226/25, gateway 10.0.3.129 acquired from 10.0.3.129 Apr 16 23:25:55.577759 ignition[820]: Ignition 2.22.0 Apr 16 23:25:55.577774 ignition[820]: Stage: fetch-offline Apr 16 23:25:55.577805 ignition[820]: no configs at "/usr/lib/ignition/base.d" Apr 16 23:25:55.577812 ignition[820]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Apr 16 23:25:55.577892 ignition[820]: parsed url from cmdline: "" Apr 16 23:25:55.577895 ignition[820]: no config URL provided Apr 16 23:25:55.577899 ignition[820]: reading system config file "/usr/lib/ignition/user.ign" Apr 16 23:25:55.582218 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Apr 16 23:25:55.577909 ignition[820]: no config at "/usr/lib/ignition/user.ign" Apr 16 23:25:55.584373 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Apr 16 23:25:55.577914 ignition[820]: failed to fetch config: resource requires networking Apr 16 23:25:55.578149 ignition[820]: Ignition finished successfully Apr 16 23:25:55.613984 ignition[887]: Ignition 2.22.0 Apr 16 23:25:55.614002 ignition[887]: Stage: fetch Apr 16 23:25:55.614136 ignition[887]: no configs at "/usr/lib/ignition/base.d" Apr 16 23:25:55.614145 ignition[887]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Apr 16 23:25:55.614217 ignition[887]: parsed url from cmdline: "" Apr 16 23:25:55.614220 ignition[887]: no config URL provided Apr 16 23:25:55.614224 ignition[887]: reading system config file "/usr/lib/ignition/user.ign" Apr 16 23:25:55.614230 ignition[887]: no config at "/usr/lib/ignition/user.ign" Apr 16 23:25:55.614467 ignition[887]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Apr 16 23:25:55.614805 ignition[887]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Apr 16 23:25:55.614815 ignition[887]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Apr 16 23:25:56.615342 ignition[887]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Apr 16 23:25:56.615400 ignition[887]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Apr 16 23:25:56.754023 systemd-networkd[867]: eth0: Gained IPv6LL Apr 16 23:25:57.615566 ignition[887]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Apr 16 23:25:57.615593 ignition[887]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Apr 16 23:25:58.159421 ignition[887]: GET result: OK Apr 16 23:25:58.159625 ignition[887]: parsing config with SHA512: 8c1566ec2b55c0bf6e4a4bfd04fded7a42656ef3167edbaf93ba5ce8f860ebf9cd463c01e4b5a786c243eef9535f937b6e2bb665324814660826f4ccdefc6cc4 Apr 16 23:25:58.164289 unknown[887]: fetched base config from "system" Apr 16 23:25:58.164300 unknown[887]: fetched base config from "system" Apr 16 23:25:58.164642 ignition[887]: fetch: fetch complete Apr 16 23:25:58.164305 unknown[887]: fetched user config from "openstack" Apr 16 23:25:58.164647 ignition[887]: fetch: fetch passed Apr 16 23:25:58.166874 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Apr 16 23:25:58.164688 ignition[887]: Ignition finished successfully Apr 16 23:25:58.169438 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Apr 16 23:25:58.199020 ignition[895]: Ignition 2.22.0 Apr 16 23:25:58.199040 ignition[895]: Stage: kargs Apr 16 23:25:58.199187 ignition[895]: no configs at "/usr/lib/ignition/base.d" Apr 16 23:25:58.199197 ignition[895]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Apr 16 23:25:58.199961 ignition[895]: kargs: kargs passed Apr 16 23:25:58.202139 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Apr 16 23:25:58.200010 ignition[895]: Ignition finished successfully Apr 16 23:25:58.204431 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Apr 16 23:25:58.235355 ignition[903]: Ignition 2.22.0 Apr 16 23:25:58.235373 ignition[903]: Stage: disks Apr 16 23:25:58.235500 ignition[903]: no configs at "/usr/lib/ignition/base.d" Apr 16 23:25:58.235508 ignition[903]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Apr 16 23:25:58.236199 ignition[903]: disks: disks passed Apr 16 23:25:58.238517 systemd[1]: Finished ignition-disks.service - Ignition (disks). Apr 16 23:25:58.236238 ignition[903]: Ignition finished successfully Apr 16 23:25:58.239851 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Apr 16 23:25:58.241400 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Apr 16 23:25:58.242815 systemd[1]: Reached target local-fs.target - Local File Systems. Apr 16 23:25:58.244365 systemd[1]: Reached target sysinit.target - System Initialization. Apr 16 23:25:58.246033 systemd[1]: Reached target basic.target - Basic System. Apr 16 23:25:58.248492 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Apr 16 23:25:58.281697 systemd-fsck[914]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Apr 16 23:25:58.289990 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Apr 16 23:25:58.292332 systemd[1]: Mounting sysroot.mount - /sysroot... Apr 16 23:25:58.392758 kernel: EXT4-fs (vda9): mounted filesystem 717eabe0-7ee2-4bf7-a9aa-0d27bb05c125 r/w with ordered data mode. Quota mode: none. Apr 16 23:25:58.393282 systemd[1]: Mounted sysroot.mount - /sysroot. Apr 16 23:25:58.394403 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Apr 16 23:25:58.397270 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Apr 16 23:25:58.399163 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Apr 16 23:25:58.400018 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Apr 16 23:25:58.400645 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Apr 16 23:25:58.404698 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Apr 16 23:25:58.404743 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Apr 16 23:25:58.413499 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Apr 16 23:25:58.415445 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Apr 16 23:25:58.426751 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (923) Apr 16 23:25:58.428917 kernel: BTRFS info (device vda6): first mount of filesystem 29b48a10-1a8e-4627-ab21-f0862573351d Apr 16 23:25:58.428953 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Apr 16 23:25:58.442198 kernel: BTRFS info (device vda6): turning on async discard Apr 16 23:25:58.442250 kernel: BTRFS info (device vda6): enabling free space tree Apr 16 23:25:58.446049 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Apr 16 23:25:58.467757 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:25:58.472439 initrd-setup-root[951]: cut: /sysroot/etc/passwd: No such file or directory Apr 16 23:25:58.477949 initrd-setup-root[958]: cut: /sysroot/etc/group: No such file or directory Apr 16 23:25:58.482954 initrd-setup-root[965]: cut: /sysroot/etc/shadow: No such file or directory Apr 16 23:25:58.487145 initrd-setup-root[972]: cut: /sysroot/etc/gshadow: No such file or directory Apr 16 23:25:58.574596 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Apr 16 23:25:58.576591 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Apr 16 23:25:58.578087 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Apr 16 23:25:58.591714 systemd[1]: sysroot-oem.mount: Deactivated successfully. Apr 16 23:25:58.593798 kernel: BTRFS info (device vda6): last unmount of filesystem 29b48a10-1a8e-4627-ab21-f0862573351d Apr 16 23:25:58.612203 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Apr 16 23:25:58.617985 ignition[1040]: INFO : Ignition 2.22.0 Apr 16 23:25:58.617985 ignition[1040]: INFO : Stage: mount Apr 16 23:25:58.619328 ignition[1040]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 16 23:25:58.619328 ignition[1040]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Apr 16 23:25:58.619328 ignition[1040]: INFO : mount: mount passed Apr 16 23:25:58.619328 ignition[1040]: INFO : Ignition finished successfully Apr 16 23:25:58.621617 systemd[1]: Finished ignition-mount.service - Ignition (mount). Apr 16 23:25:59.502759 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:01.510774 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:05.515751 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:05.520104 coreos-metadata[925]: Apr 16 23:26:05.519 WARN failed to locate config-drive, using the metadata service API instead Apr 16 23:26:05.536861 coreos-metadata[925]: Apr 16 23:26:05.536 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Apr 16 23:26:06.468683 coreos-metadata[925]: Apr 16 23:26:06.468 INFO Fetch successful Apr 16 23:26:06.468683 coreos-metadata[925]: Apr 16 23:26:06.468 INFO wrote hostname ci-4459-2-4-n-b2725589f5 to /sysroot/etc/hostname Apr 16 23:26:06.471955 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Apr 16 23:26:06.472810 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. 
Apr 16 23:26:06.474955 systemd[1]: Starting ignition-files.service - Ignition (files)...
Apr 16 23:26:06.492761 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 16 23:26:06.520773 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1058)
Apr 16 23:26:06.522763 kernel: BTRFS info (device vda6): first mount of filesystem 29b48a10-1a8e-4627-ab21-f0862573351d
Apr 16 23:26:06.522796 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Apr 16 23:26:06.528843 kernel: BTRFS info (device vda6): turning on async discard
Apr 16 23:26:06.528922 kernel: BTRFS info (device vda6): enabling free space tree
Apr 16 23:26:06.530358 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 16 23:26:06.564131 ignition[1076]: INFO : Ignition 2.22.0
Apr 16 23:26:06.564131 ignition[1076]: INFO : Stage: files
Apr 16 23:26:06.565612 ignition[1076]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 23:26:06.565612 ignition[1076]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Apr 16 23:26:06.565612 ignition[1076]: DEBUG : files: compiled without relabeling support, skipping
Apr 16 23:26:06.568863 ignition[1076]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Apr 16 23:26:06.568863 ignition[1076]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Apr 16 23:26:06.568863 ignition[1076]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Apr 16 23:26:06.572184 ignition[1076]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Apr 16 23:26:06.572184 ignition[1076]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Apr 16 23:26:06.569336 unknown[1076]: wrote ssh authorized keys file for user: core
Apr 16 23:26:06.575312 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 16 23:26:06.575312 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Apr 16 23:26:06.630355 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Apr 16 23:26:07.076196 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 16 23:26:07.076196 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Apr 16 23:26:07.079534 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-arm64.raw: attempt #1
Apr 16 23:26:07.256503 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Apr 16 23:26:08.294220 ignition[1076]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-arm64.raw"
Apr 16 23:26:08.294220 ignition[1076]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Apr 16 23:26:08.298831 ignition[1076]: INFO : files: files passed
Apr 16 23:26:08.298831 ignition[1076]: INFO : Ignition finished successfully
Apr 16 23:26:08.299951 systemd[1]: Finished ignition-files.service - Ignition (files).
Apr 16 23:26:08.301896 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Apr 16 23:26:08.303931 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 16 23:26:08.316778 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 16 23:26:08.316894 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 16 23:26:08.321555 initrd-setup-root-after-ignition[1107]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 23:26:08.321555 initrd-setup-root-after-ignition[1107]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 23:26:08.324172 initrd-setup-root-after-ignition[1111]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 16 23:26:08.325483 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 16 23:26:08.327757 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 16 23:26:08.330351 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 16 23:26:08.377042 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 16 23:26:08.377178 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 16 23:26:08.378909 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 16 23:26:08.380681 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 16 23:26:08.382254 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 16 23:26:08.382988 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 16 23:26:08.404425 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 16 23:26:08.406622 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 16 23:26:08.426653 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 16 23:26:08.427806 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 23:26:08.429930 systemd[1]: Stopped target timers.target - Timer Units.
Apr 16 23:26:08.431403 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 16 23:26:08.431517 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 16 23:26:08.433720 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 16 23:26:08.435472 systemd[1]: Stopped target basic.target - Basic System.
Apr 16 23:26:08.436847 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 16 23:26:08.438344 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 16 23:26:08.439900 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 16 23:26:08.441566 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Apr 16 23:26:08.443187 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 16 23:26:08.444645 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 16 23:26:08.446331 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 16 23:26:08.447920 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 16 23:26:08.449376 systemd[1]: Stopped target swap.target - Swaps.
Apr 16 23:26:08.450593 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 16 23:26:08.450709 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 16 23:26:08.452621 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 16 23:26:08.454281 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 23:26:08.455826 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 16 23:26:08.455929 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 23:26:08.457601 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 16 23:26:08.457703 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 16 23:26:08.461709 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 16 23:26:08.461846 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 16 23:26:08.463759 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 16 23:26:08.463862 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 16 23:26:08.466137 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 16 23:26:08.468519 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 16 23:26:08.469329 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 16 23:26:08.469443 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 16 23:26:08.470962 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 16 23:26:08.471055 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 16 23:26:08.475488 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 16 23:26:08.475891 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 16 23:26:08.485793 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 16 23:26:08.490465 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 16 23:26:08.490599 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 16 23:26:08.492697 ignition[1133]: INFO : Ignition 2.22.0
Apr 16 23:26:08.492697 ignition[1133]: INFO : Stage: umount
Apr 16 23:26:08.492697 ignition[1133]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 16 23:26:08.492697 ignition[1133]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Apr 16 23:26:08.492697 ignition[1133]: INFO : umount: umount passed
Apr 16 23:26:08.492697 ignition[1133]: INFO : Ignition finished successfully
Apr 16 23:26:08.493419 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 16 23:26:08.493528 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 16 23:26:08.494881 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 16 23:26:08.494918 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 16 23:26:08.496315 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 16 23:26:08.496350 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 16 23:26:08.498008 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 16 23:26:08.498045 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 16 23:26:08.499299 systemd[1]: Stopped target network.target - Network.
Apr 16 23:26:08.500492 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 16 23:26:08.500549 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 16 23:26:08.502161 systemd[1]: Stopped target paths.target - Path Units.
Apr 16 23:26:08.503389 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 16 23:26:08.506776 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 23:26:08.508222 systemd[1]: Stopped target slices.target - Slice Units.
Apr 16 23:26:08.509521 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 16 23:26:08.510800 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 16 23:26:08.510834 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 16 23:26:08.512206 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 16 23:26:08.512232 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 16 23:26:08.513694 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 16 23:26:08.513749 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 16 23:26:08.515069 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 16 23:26:08.515103 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 16 23:26:08.516625 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 16 23:26:08.516666 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 16 23:26:08.518712 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 16 23:26:08.520109 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 16 23:26:08.527356 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 16 23:26:08.527459 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 16 23:26:08.530952 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Apr 16 23:26:08.531159 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 16 23:26:08.531193 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 23:26:08.534195 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Apr 16 23:26:08.536186 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 16 23:26:08.536308 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 16 23:26:08.538796 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Apr 16 23:26:08.538915 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Apr 16 23:26:08.540099 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 16 23:26:08.540127 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 23:26:08.542405 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 16 23:26:08.543262 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 16 23:26:08.543310 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 16 23:26:08.545165 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 16 23:26:08.545208 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 16 23:26:08.547353 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 16 23:26:08.547392 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 16 23:26:08.549173 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 23:26:08.551455 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Apr 16 23:26:08.564784 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 16 23:26:08.565871 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 23:26:08.567316 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 16 23:26:08.567348 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 16 23:26:08.568920 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 16 23:26:08.568947 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 23:26:08.570509 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 16 23:26:08.570547 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 16 23:26:08.572753 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 16 23:26:08.572794 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 16 23:26:08.574976 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 16 23:26:08.575019 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 16 23:26:08.578139 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 16 23:26:08.578999 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Apr 16 23:26:08.579046 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Apr 16 23:26:08.581801 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 16 23:26:08.581841 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 16 23:26:08.584612 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Apr 16 23:26:08.584649 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 16 23:26:08.588574 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 16 23:26:08.588638 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 23:26:08.590676 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 16 23:26:08.590712 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 23:26:08.593872 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 16 23:26:08.594774 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 16 23:26:08.596249 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 16 23:26:08.596362 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 16 23:26:08.599965 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 16 23:26:08.601588 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 16 23:26:08.620341 systemd[1]: Switching root.
Apr 16 23:26:08.645491 systemd-journald[312]: Journal stopped
Apr 16 23:26:09.601428 systemd-journald[312]: Received SIGTERM from PID 1 (systemd).
Apr 16 23:26:09.601511 kernel: SELinux: policy capability network_peer_controls=1
Apr 16 23:26:09.601531 kernel: SELinux: policy capability open_perms=1
Apr 16 23:26:09.601540 kernel: SELinux: policy capability extended_socket_class=1
Apr 16 23:26:09.601552 kernel: SELinux: policy capability always_check_network=0
Apr 16 23:26:09.601564 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 16 23:26:09.601576 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 16 23:26:09.601585 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 16 23:26:09.601595 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 16 23:26:09.601604 kernel: SELinux: policy capability userspace_initial_context=0
Apr 16 23:26:09.601613 kernel: audit: type=1403 audit(1776381968.780:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 16 23:26:09.601624 systemd[1]: Successfully loaded SELinux policy in 49.963ms.
Apr 16 23:26:09.601643 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.524ms.
Apr 16 23:26:09.601654 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Apr 16 23:26:09.601666 systemd[1]: Detected virtualization kvm.
Apr 16 23:26:09.601679 systemd[1]: Detected architecture arm64.
Apr 16 23:26:09.601688 systemd[1]: Detected first boot.
Apr 16 23:26:09.601699 systemd[1]: Hostname set to .
Apr 16 23:26:09.601709 systemd[1]: Initializing machine ID from VM UUID.
Apr 16 23:26:09.601718 zram_generator::config[1178]: No configuration found.
Apr 16 23:26:09.601739 kernel: NET: Registered PF_VSOCK protocol family
Apr 16 23:26:09.601754 systemd[1]: Populated /etc with preset unit settings.
Apr 16 23:26:09.601765 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Apr 16 23:26:09.601775 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Apr 16 23:26:09.601785 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Apr 16 23:26:09.601797 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Apr 16 23:26:09.601806 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 16 23:26:09.601816 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 16 23:26:09.601825 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 16 23:26:09.601841 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 16 23:26:09.601851 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 16 23:26:09.601861 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 16 23:26:09.601870 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 16 23:26:09.601881 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 16 23:26:09.601891 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 16 23:26:09.601901 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 16 23:26:09.601910 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 16 23:26:09.601920 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 16 23:26:09.601930 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 16 23:26:09.601940 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 16 23:26:09.601950 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Apr 16 23:26:09.601961 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 16 23:26:09.601971 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 16 23:26:09.601980 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Apr 16 23:26:09.601991 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Apr 16 23:26:09.602000 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Apr 16 23:26:09.602010 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 16 23:26:09.602020 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 16 23:26:09.602030 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 16 23:26:09.602040 systemd[1]: Reached target slices.target - Slice Units.
Apr 16 23:26:09.602050 systemd[1]: Reached target swap.target - Swaps.
Apr 16 23:26:09.602061 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 16 23:26:09.602071 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 16 23:26:09.602081 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Apr 16 23:26:09.602090 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 16 23:26:09.602100 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 16 23:26:09.602110 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 16 23:26:09.602119 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 16 23:26:09.602131 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 16 23:26:09.602140 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 16 23:26:09.602150 systemd[1]: Mounting media.mount - External Media Directory...
Apr 16 23:26:09.602159 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 16 23:26:09.602169 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 16 23:26:09.602178 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 16 23:26:09.602189 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 16 23:26:09.602199 systemd[1]: Reached target machines.target - Containers.
Apr 16 23:26:09.602210 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 16 23:26:09.602220 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 23:26:09.602229 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 16 23:26:09.602239 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 16 23:26:09.602249 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 23:26:09.602258 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 16 23:26:09.602268 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 23:26:09.602277 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 16 23:26:09.602287 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 23:26:09.602298 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 16 23:26:09.602308 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Apr 16 23:26:09.602317 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Apr 16 23:26:09.602327 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Apr 16 23:26:09.602338 systemd[1]: Stopped systemd-fsck-usr.service.
Apr 16 23:26:09.602348 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Apr 16 23:26:09.602361 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 16 23:26:09.602370 kernel: fuse: init (API version 7.41)
Apr 16 23:26:09.602400 systemd-journald[1239]: Collecting audit messages is disabled.
Apr 16 23:26:09.602425 systemd-journald[1239]: Journal started
Apr 16 23:26:09.602446 systemd-journald[1239]: Runtime Journal (/run/log/journal/4062eef59cbd4826a6af10f9e76e98d8) is 8M, max 319.5M, 311.5M free.
Apr 16 23:26:09.391689 systemd[1]: Queued start job for default target multi-user.target.
Apr 16 23:26:09.402974 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Apr 16 23:26:09.403367 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 16 23:26:09.603925 kernel: loop: module loaded
Apr 16 23:26:09.603961 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 16 23:26:09.607788 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 16 23:26:09.609778 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 16 23:26:09.619940 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Apr 16 23:26:09.623661 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 16 23:26:09.628123 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 16 23:26:09.628166 systemd[1]: Stopped verity-setup.service.
Apr 16 23:26:09.632417 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 16 23:26:09.632956 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 16 23:26:09.633933 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 16 23:26:09.634994 systemd[1]: Mounted media.mount - External Media Directory.
Apr 16 23:26:09.636150 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 16 23:26:09.637465 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 16 23:26:09.638587 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 16 23:26:09.640057 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 16 23:26:09.643424 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 16 23:26:09.643593 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 16 23:26:09.647139 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 23:26:09.647307 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 23:26:09.648622 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 23:26:09.648890 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 23:26:09.650332 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 16 23:26:09.651641 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 16 23:26:09.651823 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 16 23:26:09.653110 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 23:26:09.653289 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 23:26:09.654533 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 16 23:26:09.655794 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 16 23:26:09.657043 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 16 23:26:09.666744 kernel: ACPI: bus type drm_connector registered
Apr 16 23:26:09.666655 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 16 23:26:09.668798 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 16 23:26:09.669997 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Apr 16 23:26:09.675574 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 16 23:26:09.677823 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 16 23:26:09.679601 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 16 23:26:09.680795 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 16 23:26:09.680834 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 16 23:26:09.682493 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Apr 16 23:26:09.694861 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 16 23:26:09.695826 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 23:26:09.697381 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 16 23:26:09.699376 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Apr 16 23:26:09.700443 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Apr 16 23:26:09.701407 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Apr 16 23:26:09.702400 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Apr 16 23:26:09.705150 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Apr 16 23:26:09.709893 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Apr 16 23:26:09.713017 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 16 23:26:09.717110 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Apr 16 23:26:09.718240 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Apr 16 23:26:09.719297 systemd-journald[1239]: Time spent on flushing to /var/log/journal/4062eef59cbd4826a6af10f9e76e98d8 is 39.769ms for 1728 entries. Apr 16 23:26:09.719297 systemd-journald[1239]: System Journal (/var/log/journal/4062eef59cbd4826a6af10f9e76e98d8) is 8M, max 584.8M, 576.8M free. Apr 16 23:26:09.778654 systemd-journald[1239]: Received client request to flush runtime journal. Apr 16 23:26:09.778715 kernel: loop0: detected capacity change from 0 to 100632 Apr 16 23:26:09.722034 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Apr 16 23:26:09.728910 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 16 23:26:09.731247 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Apr 16 23:26:09.734001 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Apr 16 23:26:09.750358 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 16 23:26:09.755752 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Apr 16 23:26:09.755762 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Apr 16 23:26:09.758953 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 16 23:26:09.763990 systemd[1]: Starting systemd-sysusers.service - Create System Users... Apr 16 23:26:09.780042 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Apr 16 23:26:09.789820 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Apr 16 23:26:09.808783 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Apr 16 23:26:09.822912 systemd[1]: Finished systemd-sysusers.service - Create System Users. Apr 16 23:26:09.825133 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 16 23:26:09.832778 kernel: loop1: detected capacity change from 0 to 119840 Apr 16 23:26:09.845844 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Apr 16 23:26:09.845950 systemd-tmpfiles[1320]: ACLs are not supported, ignoring. Apr 16 23:26:09.849397 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 16 23:26:09.887773 kernel: loop2: detected capacity change from 0 to 1632 Apr 16 23:26:09.918769 kernel: loop3: detected capacity change from 0 to 200864 Apr 16 23:26:09.960770 kernel: loop4: detected capacity change from 0 to 100632 Apr 16 23:26:09.973759 kernel: loop5: detected capacity change from 0 to 119840 Apr 16 23:26:09.988757 kernel: loop6: detected capacity change from 0 to 1632 Apr 16 23:26:09.996757 kernel: loop7: detected capacity change from 0 to 200864 Apr 16 23:26:10.017772 (sd-merge)[1328]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. 
Apr 16 23:26:10.018340 (sd-merge)[1328]: Merged extensions into '/usr'.
Apr 16 23:26:10.022156 systemd[1]: Reload requested from client PID 1297 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 16 23:26:10.022175 systemd[1]: Reloading...
Apr 16 23:26:10.075767 zram_generator::config[1351]: No configuration found.
Apr 16 23:26:10.226709 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 16 23:26:10.226869 systemd[1]: Reloading finished in 204 ms.
Apr 16 23:26:10.244445 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 16 23:26:10.247272 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 16 23:26:10.265012 systemd[1]: Starting ensure-sysext.service...
Apr 16 23:26:10.266836 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 16 23:26:10.270893 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 16 23:26:10.278118 systemd[1]: Reload requested from client PID 1391 ('systemctl') (unit ensure-sysext.service)...
Apr 16 23:26:10.278139 systemd[1]: Reloading...
Apr 16 23:26:10.280185 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Apr 16 23:26:10.280228 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Apr 16 23:26:10.280446 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 16 23:26:10.280660 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 16 23:26:10.281325 systemd-tmpfiles[1393]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 16 23:26:10.281558 systemd-tmpfiles[1393]: ACLs are not supported, ignoring.
Apr 16 23:26:10.281609 systemd-tmpfiles[1393]: ACLs are not supported, ignoring.
Apr 16 23:26:10.285665 systemd-tmpfiles[1393]: Detected autofs mount point /boot during canonicalization of boot.
Apr 16 23:26:10.285682 systemd-tmpfiles[1393]: Skipping /boot
Apr 16 23:26:10.292323 systemd-tmpfiles[1393]: Detected autofs mount point /boot during canonicalization of boot.
Apr 16 23:26:10.292340 systemd-tmpfiles[1393]: Skipping /boot
Apr 16 23:26:10.303629 systemd-udevd[1394]: Using default interface naming scheme 'v255'.
Apr 16 23:26:10.332903 zram_generator::config[1422]: No configuration found.
Apr 16 23:26:10.455918 kernel: mousedev: PS/2 mouse device common for all mice
Apr 16 23:26:10.521162 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Apr 16 23:26:10.521256 systemd[1]: Reloading finished in 242 ms.
Apr 16 23:26:10.535377 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 16 23:26:10.543640 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 16 23:26:10.552384 ldconfig[1292]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 16 23:26:10.561774 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Apr 16 23:26:10.561846 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Apr 16 23:26:10.561861 kernel: [drm] features: -context_init
Apr 16 23:26:10.562139 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 16 23:26:10.567132 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Apr 16 23:26:10.568971 kernel: [drm] number of scanouts: 1
Apr 16 23:26:10.569027 kernel: [drm] number of cap sets: 0
Apr 16 23:26:10.570085 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Apr 16 23:26:10.572641 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Apr 16 23:26:10.575696 kernel: Console: switching to colour frame buffer device 160x50
Apr 16 23:26:10.584575 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Apr 16 23:26:10.593921 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 16 23:26:10.596748 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 16 23:26:10.602458 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 16 23:26:10.612095 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 16 23:26:10.617506 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 16 23:26:10.622774 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 16 23:26:10.627636 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 16 23:26:10.631350 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 23:26:10.645973 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 23:26:10.650077 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 23:26:10.653524 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 23:26:10.656104 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 23:26:10.656248 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Apr 16 23:26:10.659506 augenrules[1545]: No rules
Apr 16 23:26:10.664785 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 16 23:26:10.668252 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 16 23:26:10.668492 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Apr 16 23:26:10.670084 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 16 23:26:10.672502 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 23:26:10.673121 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 23:26:10.675272 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 23:26:10.675431 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 23:26:10.679909 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 23:26:10.680645 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 23:26:10.699334 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 16 23:26:10.712768 systemd[1]: Finished ensure-sysext.service.
Apr 16 23:26:10.720562 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Apr 16 23:26:10.721590 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 16 23:26:10.722595 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 16 23:26:10.724639 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 16 23:26:10.732399 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 16 23:26:10.735123 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 16 23:26:10.737123 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Apr 16 23:26:10.738480 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 16 23:26:10.738516 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Apr 16 23:26:10.738567 systemd[1]: Reached target time-set.target - System Time Set.
Apr 16 23:26:10.743722 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 16 23:26:10.749015 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 16 23:26:10.752190 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 16 23:26:10.754378 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 16 23:26:10.758902 kernel: pps_core: LinuxPPS API ver. 1 registered
Apr 16 23:26:10.758950 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Apr 16 23:26:10.757875 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 16 23:26:10.758029 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 16 23:26:10.759672 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 16 23:26:10.759863 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 16 23:26:10.760508 augenrules[1562]: /sbin/augenrules: No change
Apr 16 23:26:10.761888 kernel: PTP clock support registered
Apr 16 23:26:10.761528 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 16 23:26:10.761668 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 16 23:26:10.764363 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 16 23:26:10.764565 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 16 23:26:10.766228 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 16 23:26:10.770911 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Apr 16 23:26:10.771296 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Apr 16 23:26:10.774432 augenrules[1600]: No rules
Apr 16 23:26:10.776339 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 16 23:26:10.777770 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Apr 16 23:26:10.779387 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 16 23:26:10.779460 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 16 23:26:10.779494 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 16 23:26:10.804287 systemd-networkd[1517]: lo: Link UP
Apr 16 23:26:10.804297 systemd-networkd[1517]: lo: Gained carrier
Apr 16 23:26:10.805368 systemd-networkd[1517]: Enumeration completed
Apr 16 23:26:10.805574 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 16 23:26:10.805882 systemd-networkd[1517]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 23:26:10.805886 systemd-networkd[1517]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 16 23:26:10.806315 systemd-networkd[1517]: eth0: Link UP
Apr 16 23:26:10.806470 systemd-networkd[1517]: eth0: Gained carrier
Apr 16 23:26:10.806485 systemd-networkd[1517]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 16 23:26:10.808154 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Apr 16 23:26:10.809556 systemd-resolved[1527]: Positive Trust Anchors:
Apr 16 23:26:10.809575 systemd-resolved[1527]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 16 23:26:10.809607 systemd-resolved[1527]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 16 23:26:10.812863 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 16 23:26:10.814680 systemd-resolved[1527]: Using system hostname 'ci-4459-2-4-n-b2725589f5'.
Apr 16 23:26:10.816033 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 16 23:26:10.817174 systemd[1]: Reached target network.target - Network.
Apr 16 23:26:10.818011 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 16 23:26:10.823805 systemd-networkd[1517]: eth0: DHCPv4 address 10.0.3.226/25, gateway 10.0.3.129 acquired from 10.0.3.129
Apr 16 23:26:10.830827 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Apr 16 23:26:10.844409 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 16 23:26:10.845655 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 16 23:26:10.846827 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 16 23:26:10.847806 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 16 23:26:10.848937 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 16 23:26:10.849861 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 16 23:26:10.850945 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 16 23:26:10.851925 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 16 23:26:10.851958 systemd[1]: Reached target paths.target - Path Units.
Apr 16 23:26:10.852669 systemd[1]: Reached target timers.target - Timer Units.
Apr 16 23:26:10.854370 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 16 23:26:10.856328 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 16 23:26:10.858903 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Apr 16 23:26:10.860044 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Apr 16 23:26:10.861100 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Apr 16 23:26:10.863834 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 16 23:26:10.865472 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Apr 16 23:26:10.866972 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 16 23:26:10.867912 systemd[1]: Reached target sockets.target - Socket Units.
Apr 16 23:26:10.868672 systemd[1]: Reached target basic.target - Basic System.
Apr 16 23:26:10.869518 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 16 23:26:10.869547 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 16 23:26:10.871700 systemd[1]: Starting chronyd.service - NTP client/server...
Apr 16 23:26:10.873331 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 16 23:26:10.875350 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 16 23:26:10.884886 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 16 23:26:10.885760 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Apr 16 23:26:10.886778 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 16 23:26:10.889033 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 16 23:26:10.892325 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 16 23:26:10.893298 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 16 23:26:10.894294 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 16 23:26:10.903105 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 16 23:26:10.907873 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 16 23:26:10.909670 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 16 23:26:10.911341 jq[1624]: false
Apr 16 23:26:10.913940 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 16 23:26:10.915645 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 16 23:26:10.916071 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 16 23:26:10.916658 systemd[1]: Starting update-engine.service - Update Engine...
Apr 16 23:26:10.918748 extend-filesystems[1625]: Found /dev/vda6
Apr 16 23:26:10.918793 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 16 23:26:10.923805 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 16 23:26:10.925171 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 16 23:26:10.925288 jq[1642]: true
Apr 16 23:26:10.925369 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 16 23:26:10.926463 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 16 23:26:10.928139 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 16 23:26:10.931106 systemd[1]: motdgen.service: Deactivated successfully.
Apr 16 23:26:10.931282 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 16 23:26:10.937975 extend-filesystems[1625]: Found /dev/vda9
Apr 16 23:26:10.942938 extend-filesystems[1625]: Checking size of /dev/vda9
Apr 16 23:26:10.944079 jq[1647]: true
Apr 16 23:26:10.952763 tar[1646]: linux-arm64/LICENSE
Apr 16 23:26:10.952763 tar[1646]: linux-arm64/helm
Apr 16 23:26:10.956349 chronyd[1617]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Apr 16 23:26:10.957367 chronyd[1617]: Loaded seccomp filter (level 2)
Apr 16 23:26:10.957481 systemd[1]: Started chronyd.service - NTP client/server.
Apr 16 23:26:10.958631 (ntainerd)[1656]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 16 23:26:10.963250 extend-filesystems[1625]: Resized partition /dev/vda9
Apr 16 23:26:10.964390 update_engine[1637]: I20260416 23:26:10.962246 1637 main.cc:92] Flatcar Update Engine starting
Apr 16 23:26:10.967003 extend-filesystems[1666]: resize2fs 1.47.3 (8-Jul-2025)
Apr 16 23:26:10.976950 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks
Apr 16 23:26:11.003851 dbus-daemon[1620]: [system] SELinux support is enabled
Apr 16 23:26:11.004044 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 16 23:26:11.008128 update_engine[1637]: I20260416 23:26:11.007813 1637 update_check_scheduler.cc:74] Next update check in 5m46s
Apr 16 23:26:11.010344 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 16 23:26:11.010376 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 16 23:26:11.012175 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 16 23:26:11.012191 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 16 23:26:11.017581 systemd[1]: Started update-engine.service - Update Engine.
Apr 16 23:26:11.019609 systemd-logind[1635]: New seat seat0.
Apr 16 23:26:11.021365 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Apr 16 23:26:11.039647 systemd-logind[1635]: Watching system buttons on /dev/input/event0 (Power Button)
Apr 16 23:26:11.039662 systemd-logind[1635]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Apr 16 23:26:11.039902 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 16 23:26:11.067990 locksmithd[1683]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Apr 16 23:26:11.140768 bash[1681]: Updated "/home/core/.ssh/authorized_keys"
Apr 16 23:26:11.141637 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Apr 16 23:26:11.145515 systemd[1]: Starting sshkeys.service...
Apr 16 23:26:11.167983 containerd[1656]: time="2026-04-16T23:26:11Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Apr 16 23:26:11.171105 containerd[1656]: time="2026-04-16T23:26:11.170968160Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Apr 16 23:26:11.177694 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Apr 16 23:26:11.181657 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Apr 16 23:26:11.189448 containerd[1656]: time="2026-04-16T23:26:11.189403600Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.8µs"
Apr 16 23:26:11.189570 containerd[1656]: time="2026-04-16T23:26:11.189552680Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Apr 16 23:26:11.189625 containerd[1656]: time="2026-04-16T23:26:11.189612560Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Apr 16 23:26:11.189852 containerd[1656]: time="2026-04-16T23:26:11.189829040Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Apr 16 23:26:11.189929 containerd[1656]: time="2026-04-16T23:26:11.189915000Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Apr 16 23:26:11.189997 containerd[1656]: time="2026-04-16T23:26:11.189983440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190108 containerd[1656]: time="2026-04-16T23:26:11.190089120Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190176 containerd[1656]: time="2026-04-16T23:26:11.190161760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190465 containerd[1656]: time="2026-04-16T23:26:11.190437680Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190534 containerd[1656]: time="2026-04-16T23:26:11.190519600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190585 containerd[1656]: time="2026-04-16T23:26:11.190572000Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190629 containerd[1656]: time="2026-04-16T23:26:11.190617280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Apr 16 23:26:11.190813 containerd[1656]: time="2026-04-16T23:26:11.190791520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Apr 16 23:26:11.191094 containerd[1656]: time="2026-04-16T23:26:11.191068040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Apr 16 23:26:11.191186 containerd[1656]: time="2026-04-16T23:26:11.191170080Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Apr 16 23:26:11.191233 containerd[1656]: time="2026-04-16T23:26:11.191220520Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Apr 16 23:26:11.191331 containerd[1656]: time="2026-04-16T23:26:11.191314760Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Apr 16 23:26:11.191766 containerd[1656]: time="2026-04-16T23:26:11.191723080Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Apr 16 23:26:11.191993 containerd[1656]: time="2026-04-16T23:26:11.191971280Z" level=info msg="metadata content store policy set" policy=shared
Apr 16 23:26:11.195749 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Apr 16 23:26:11.222144 containerd[1656]: time="2026-04-16T23:26:11.222092600Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Apr 16 23:26:11.222218 containerd[1656]: time="2026-04-16T23:26:11.222175720Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Apr 16 23:26:11.222218 containerd[1656]: time="2026-04-16T23:26:11.222193560Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Apr 16 23:26:11.222218 containerd[1656]: time="2026-04-16T23:26:11.222206120Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Apr 16 23:26:11.222292 containerd[1656]: time="2026-04-16T23:26:11.222219040Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Apr 16 23:26:11.222292 containerd[1656]: time="2026-04-16T23:26:11.222232360Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Apr 16 23:26:11.222292 containerd[1656]: time="2026-04-16T23:26:11.222244760Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Apr 16 23:26:11.222292 containerd[1656]: time="2026-04-16T23:26:11.222256600Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Apr 16 23:26:11.222292 containerd[1656]: time="2026-04-16T23:26:11.222268520Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Apr 16 23:26:11.222292 containerd[1656]: time="2026-04-16T23:26:11.222286240Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Apr 16 23:26:11.222386 containerd[1656]: time="2026-04-16T23:26:11.222295760Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Apr 16 23:26:11.222386 containerd[1656]: time="2026-04-16T23:26:11.222309560Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Apr 16 23:26:11.222816 containerd[1656]: time="2026-04-16T23:26:11.222687120Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Apr 16 23:26:11.222850 containerd[1656]: time="2026-04-16T23:26:11.222829320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Apr 16 23:26:11.222868 containerd[1656]: time="2026-04-16T23:26:11.222852200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Apr 16 23:26:11.222868 containerd[1656]: time="2026-04-16T23:26:11.222864400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Apr 16 23:26:11.222899 containerd[1656]: time="2026-04-16T23:26:11.222875120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Apr 16 23:26:11.222899 containerd[1656]: time="2026-04-16T23:26:11.222885720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Apr 16 23:26:11.222935 containerd[1656]: time="2026-04-16T23:26:11.222899880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Apr 16 23:26:11.222974 containerd[1656]: time="2026-04-16T23:26:11.222911440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Apr 16 23:26:11.223024 containerd[1656]: time="2026-04-16T23:26:11.223006040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Apr 16 23:26:11.223053 containerd[1656]: time="2026-04-16T23:26:11.223027840Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Apr 16 23:26:11.223053 containerd[1656]: time="2026-04-16T23:26:11.223040680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Apr 16 23:26:11.223396 containerd[1656]: time="2026-04-16T23:26:11.223372080Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Apr 16 23:26:11.223429 containerd[1656]: time="2026-04-16T23:26:11.223400760Z" level=info msg="Start snapshots syncer"
Apr 16 23:26:11.223497 containerd[1656]: time="2026-04-16T23:26:11.223476040Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Apr 16 23:26:11.224285 containerd[1656]: time="2026-04-16T23:26:11.224188000Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\"
:true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Apr 16 23:26:11.224383 containerd[1656]: time="2026-04-16T23:26:11.224306680Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Apr 16 23:26:11.224417 containerd[1656]: time="2026-04-16T23:26:11.224404640Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Apr 16 23:26:11.224661 containerd[1656]: time="2026-04-16T23:26:11.224590880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Apr 16 23:26:11.224688 containerd[1656]: time="2026-04-16T23:26:11.224671080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Apr 16 23:26:11.224688 containerd[1656]: time="2026-04-16T23:26:11.224685360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Apr 16 23:26:11.224725 containerd[1656]: time="2026-04-16T23:26:11.224695800Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Apr 16 23:26:11.224725 containerd[1656]: time="2026-04-16T23:26:11.224707920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Apr 16 23:26:11.224725 containerd[1656]: time="2026-04-16T23:26:11.224718360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Apr 16 23:26:11.224803 containerd[1656]: time="2026-04-16T23:26:11.224746200Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local 
type=io.containerd.transfer.v1 Apr 16 23:26:11.224803 containerd[1656]: time="2026-04-16T23:26:11.224774880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Apr 16 23:26:11.224876 containerd[1656]: time="2026-04-16T23:26:11.224852640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Apr 16 23:26:11.224903 containerd[1656]: time="2026-04-16T23:26:11.224878920Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Apr 16 23:26:11.224981 containerd[1656]: time="2026-04-16T23:26:11.224923920Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.224991120Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225002080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225013560Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225021320Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225038520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225049760Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Apr 16 23:26:11.226759 containerd[1656]: 
time="2026-04-16T23:26:11.225142080Z" level=info msg="runtime interface created" Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225219440Z" level=info msg="created NRI interface" Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225230600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225243720Z" level=info msg="Connect containerd service" Apr 16 23:26:11.226759 containerd[1656]: time="2026-04-16T23:26:11.225267600Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Apr 16 23:26:11.231197 containerd[1656]: time="2026-04-16T23:26:11.231163400Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 16 23:26:11.323448 containerd[1656]: time="2026-04-16T23:26:11.323350760Z" level=info msg="Start subscribing containerd event" Apr 16 23:26:11.323530 containerd[1656]: time="2026-04-16T23:26:11.323405760Z" level=info msg="Start recovering state" Apr 16 23:26:11.323739 containerd[1656]: time="2026-04-16T23:26:11.323704200Z" level=info msg="Start event monitor" Apr 16 23:26:11.323855 containerd[1656]: time="2026-04-16T23:26:11.323831720Z" level=info msg="Start cni network conf syncer for default" Apr 16 23:26:11.323881 containerd[1656]: time="2026-04-16T23:26:11.323853320Z" level=info msg="Start streaming server" Apr 16 23:26:11.323881 containerd[1656]: time="2026-04-16T23:26:11.323871480Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Apr 16 23:26:11.323881 containerd[1656]: time="2026-04-16T23:26:11.323879920Z" level=info msg="runtime interface starting up..." Apr 16 23:26:11.323936 containerd[1656]: time="2026-04-16T23:26:11.323886560Z" level=info msg="starting plugins..." 
Apr 16 23:26:11.323936 containerd[1656]: time="2026-04-16T23:26:11.323916680Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Apr 16 23:26:11.324253 containerd[1656]: time="2026-04-16T23:26:11.324228040Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Apr 16 23:26:11.324423 containerd[1656]: time="2026-04-16T23:26:11.324403320Z" level=info msg=serving... address=/run/containerd/containerd.sock Apr 16 23:26:11.324650 systemd[1]: Started containerd.service - containerd container runtime. Apr 16 23:26:11.326039 containerd[1656]: time="2026-04-16T23:26:11.326012320Z" level=info msg="containerd successfully booted in 0.158562s" Apr 16 23:26:11.365744 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Apr 16 23:26:11.384358 extend-filesystems[1666]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Apr 16 23:26:11.384358 extend-filesystems[1666]: old_desc_blocks = 1, new_desc_blocks = 6 Apr 16 23:26:11.384358 extend-filesystems[1666]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Apr 16 23:26:11.389635 extend-filesystems[1625]: Resized filesystem in /dev/vda9 Apr 16 23:26:11.385821 systemd[1]: extend-filesystems.service: Deactivated successfully. Apr 16 23:26:11.386862 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Apr 16 23:26:11.427513 tar[1646]: linux-arm64/README.md Apr 16 23:26:11.446452 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Apr 16 23:26:11.663534 sshd_keygen[1648]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Apr 16 23:26:11.682582 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Apr 16 23:26:11.687121 systemd[1]: Starting issuegen.service - Generate /run/issue... Apr 16 23:26:11.703252 systemd[1]: issuegen.service: Deactivated successfully. Apr 16 23:26:11.703446 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Apr 16 23:26:11.707990 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Apr 16 23:26:11.725781 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Apr 16 23:26:11.728351 systemd[1]: Started getty@tty1.service - Getty on tty1. Apr 16 23:26:11.730550 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Apr 16 23:26:11.731898 systemd[1]: Reached target getty.target - Login Prompts. Apr 16 23:26:11.898761 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:12.203804 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:12.498023 systemd-networkd[1517]: eth0: Gained IPv6LL Apr 16 23:26:12.500615 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Apr 16 23:26:12.502361 systemd[1]: Reached target network-online.target - Network is Online. Apr 16 23:26:12.506126 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 23:26:12.508186 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Apr 16 23:26:12.537614 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Apr 16 23:26:13.331841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Apr 16 23:26:13.335452 (kubelet)[1756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 23:26:13.869462 kubelet[1756]: E0416 23:26:13.869402 1756 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 23:26:13.871641 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 23:26:13.871800 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 23:26:13.872356 systemd[1]: kubelet.service: Consumed 710ms CPU time, 248.4M memory peak. Apr 16 23:26:13.910765 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:14.217770 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:17.920793 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:17.928056 coreos-metadata[1619]: Apr 16 23:26:17.927 WARN failed to locate config-drive, using the metadata service API instead Apr 16 23:26:17.944942 coreos-metadata[1619]: Apr 16 23:26:17.944 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Apr 16 23:26:18.225785 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Apr 16 23:26:18.231829 coreos-metadata[1697]: Apr 16 23:26:18.231 WARN failed to locate config-drive, using the metadata service API instead Apr 16 23:26:18.244685 coreos-metadata[1697]: Apr 16 23:26:18.244 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Apr 16 23:26:24.122345 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Apr 16 23:26:24.123875 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Apr 16 23:26:24.256559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 23:26:24.260819 (kubelet)[1782]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 23:26:24.305371 kubelet[1782]: E0416 23:26:24.305289 1782 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 23:26:24.308146 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 23:26:24.308281 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 23:26:24.309832 systemd[1]: kubelet.service: Consumed 145ms CPU time, 105.7M memory peak. Apr 16 23:26:25.639338 coreos-metadata[1697]: Apr 16 23:26:25.639 INFO Fetch successful Apr 16 23:26:25.639338 coreos-metadata[1697]: Apr 16 23:26:25.639 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Apr 16 23:26:25.641458 coreos-metadata[1619]: Apr 16 23:26:25.641 INFO Fetch successful Apr 16 23:26:25.641691 coreos-metadata[1619]: Apr 16 23:26:25.641 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Apr 16 23:26:27.455470 coreos-metadata[1619]: Apr 16 23:26:27.455 INFO Fetch successful Apr 16 23:26:27.455470 coreos-metadata[1619]: Apr 16 23:26:27.455 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Apr 16 23:26:27.457566 coreos-metadata[1697]: Apr 16 23:26:27.457 INFO Fetch successful Apr 16 23:26:27.459933 unknown[1697]: wrote ssh authorized keys file for user: core Apr 16 23:26:27.493280 update-ssh-keys[1791]: Updated "/home/core/.ssh/authorized_keys" Apr 16 23:26:27.494249 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar 
Metadata Agent (SSH Keys). Apr 16 23:26:27.495672 systemd[1]: Finished sshkeys.service. Apr 16 23:26:33.010292 coreos-metadata[1619]: Apr 16 23:26:33.010 INFO Fetch successful Apr 16 23:26:33.010292 coreos-metadata[1619]: Apr 16 23:26:33.010 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Apr 16 23:26:34.397688 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Apr 16 23:26:34.399212 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 23:26:34.531826 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 23:26:34.535857 (kubelet)[1802]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 23:26:34.575245 kubelet[1802]: E0416 23:26:34.575198 1802 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 23:26:34.577709 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 23:26:34.577857 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 23:26:34.578151 systemd[1]: kubelet.service: Consumed 135ms CPU time, 107.9M memory peak. Apr 16 23:26:34.740968 chronyd[1617]: Selected source PHC0 Apr 16 23:26:44.647724 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Apr 16 23:26:44.649165 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 23:26:44.774845 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Apr 16 23:26:44.778726 (kubelet)[1818]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 23:26:44.784434 coreos-metadata[1619]: Apr 16 23:26:44.784 INFO Fetch successful Apr 16 23:26:44.784434 coreos-metadata[1619]: Apr 16 23:26:44.784 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Apr 16 23:26:44.815413 kubelet[1818]: E0416 23:26:44.815348 1818 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 23:26:44.817904 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 23:26:44.818035 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 23:26:44.818598 systemd[1]: kubelet.service: Consumed 142ms CPU time, 107.6M memory peak. Apr 16 23:26:47.604598 coreos-metadata[1619]: Apr 16 23:26:47.604 INFO Fetch successful Apr 16 23:26:47.604598 coreos-metadata[1619]: Apr 16 23:26:47.604 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Apr 16 23:26:49.303596 coreos-metadata[1619]: Apr 16 23:26:49.303 INFO Fetch successful Apr 16 23:26:49.343783 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Apr 16 23:26:49.344451 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Apr 16 23:26:49.344600 systemd[1]: Reached target multi-user.target - Multi-User System. Apr 16 23:26:49.347948 systemd[1]: Startup finished in 3.254s (kernel) + 16.177s (initrd) + 40.617s (userspace) = 1min 49ms. Apr 16 23:26:54.897644 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Apr 16 23:26:54.899275 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 23:26:55.049397 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 23:26:55.053015 (kubelet)[1839]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 16 23:26:55.086181 kubelet[1839]: E0416 23:26:55.086120 1839 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 16 23:26:55.088645 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 16 23:26:55.088796 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 16 23:26:55.090096 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.1M memory peak. Apr 16 23:26:56.344106 update_engine[1637]: I20260416 23:26:56.344002 1637 update_attempter.cc:509] Updating boot flags... Apr 16 23:26:57.058463 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Apr 16 23:26:57.059959 systemd[1]: Started sshd@0-10.0.3.226:22-50.85.169.122:46012.service - OpenSSH per-connection server daemon (50.85.169.122:46012). Apr 16 23:26:57.210993 sshd[1864]: Accepted publickey for core from 50.85.169.122 port 46012 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:26:57.213319 sshd-session[1864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:26:57.219572 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Apr 16 23:26:57.220536 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Apr 16 23:26:57.226622 systemd-logind[1635]: New session 1 of user core. 
Apr 16 23:26:57.242915 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Apr 16 23:26:57.245346 systemd[1]: Starting user@500.service - User Manager for UID 500... Apr 16 23:26:57.259981 (systemd)[1869]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Apr 16 23:26:57.262507 systemd-logind[1635]: New session c1 of user core. Apr 16 23:26:57.394613 systemd[1869]: Queued start job for default target default.target. Apr 16 23:26:57.405083 systemd[1869]: Created slice app.slice - User Application Slice. Apr 16 23:26:57.405264 systemd[1869]: Reached target paths.target - Paths. Apr 16 23:26:57.405376 systemd[1869]: Reached target timers.target - Timers. Apr 16 23:26:57.406652 systemd[1869]: Starting dbus.socket - D-Bus User Message Bus Socket... Apr 16 23:26:57.416997 systemd[1869]: Listening on dbus.socket - D-Bus User Message Bus Socket. Apr 16 23:26:57.417132 systemd[1869]: Reached target sockets.target - Sockets. Apr 16 23:26:57.417183 systemd[1869]: Reached target basic.target - Basic System. Apr 16 23:26:57.417219 systemd[1869]: Reached target default.target - Main User Target. Apr 16 23:26:57.417250 systemd[1869]: Startup finished in 148ms. Apr 16 23:26:57.417337 systemd[1]: Started user@500.service - User Manager for UID 500. Apr 16 23:26:57.418576 systemd[1]: Started session-1.scope - Session 1 of User core. Apr 16 23:26:57.472681 systemd[1]: Started sshd@1-10.0.3.226:22-50.85.169.122:46028.service - OpenSSH per-connection server daemon (50.85.169.122:46028). Apr 16 23:26:57.586823 sshd[1880]: Accepted publickey for core from 50.85.169.122 port 46028 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:26:57.588084 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:26:57.592775 systemd-logind[1635]: New session 2 of user core. Apr 16 23:26:57.608080 systemd[1]: Started session-2.scope - Session 2 of User core. 
Apr 16 23:26:57.643423 sshd[1883]: Connection closed by 50.85.169.122 port 46028 Apr 16 23:26:57.643995 sshd-session[1880]: pam_unix(sshd:session): session closed for user core Apr 16 23:26:57.647014 systemd[1]: sshd@1-10.0.3.226:22-50.85.169.122:46028.service: Deactivated successfully. Apr 16 23:26:57.648514 systemd[1]: session-2.scope: Deactivated successfully. Apr 16 23:26:57.652516 systemd-logind[1635]: Session 2 logged out. Waiting for processes to exit. Apr 16 23:26:57.653453 systemd-logind[1635]: Removed session 2. Apr 16 23:26:57.673464 systemd[1]: Started sshd@2-10.0.3.226:22-50.85.169.122:46040.service - OpenSSH per-connection server daemon (50.85.169.122:46040). Apr 16 23:26:57.784281 sshd[1889]: Accepted publickey for core from 50.85.169.122 port 46040 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:26:57.785555 sshd-session[1889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:26:57.790420 systemd-logind[1635]: New session 3 of user core. Apr 16 23:26:57.800175 systemd[1]: Started session-3.scope - Session 3 of User core. Apr 16 23:26:57.832060 sshd[1892]: Connection closed by 50.85.169.122 port 46040 Apr 16 23:26:57.832586 sshd-session[1889]: pam_unix(sshd:session): session closed for user core Apr 16 23:26:57.835602 systemd[1]: sshd@2-10.0.3.226:22-50.85.169.122:46040.service: Deactivated successfully. Apr 16 23:26:57.837211 systemd[1]: session-3.scope: Deactivated successfully. Apr 16 23:26:57.840023 systemd-logind[1635]: Session 3 logged out. Waiting for processes to exit. Apr 16 23:26:57.841585 systemd-logind[1635]: Removed session 3. Apr 16 23:26:57.864638 systemd[1]: Started sshd@3-10.0.3.226:22-50.85.169.122:46046.service - OpenSSH per-connection server daemon (50.85.169.122:46046). 
Apr 16 23:26:57.965902 sshd[1898]: Accepted publickey for core from 50.85.169.122 port 46046 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:26:57.967212 sshd-session[1898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:26:57.971073 systemd-logind[1635]: New session 4 of user core. Apr 16 23:26:57.983042 systemd[1]: Started session-4.scope - Session 4 of User core. Apr 16 23:26:58.018823 sshd[1901]: Connection closed by 50.85.169.122 port 46046 Apr 16 23:26:58.019427 sshd-session[1898]: pam_unix(sshd:session): session closed for user core Apr 16 23:26:58.023477 systemd[1]: sshd@3-10.0.3.226:22-50.85.169.122:46046.service: Deactivated successfully. Apr 16 23:26:58.024992 systemd[1]: session-4.scope: Deactivated successfully. Apr 16 23:26:58.025679 systemd-logind[1635]: Session 4 logged out. Waiting for processes to exit. Apr 16 23:26:58.026697 systemd-logind[1635]: Removed session 4. Apr 16 23:26:58.043890 systemd[1]: Started sshd@4-10.0.3.226:22-50.85.169.122:46060.service - OpenSSH per-connection server daemon (50.85.169.122:46060). Apr 16 23:26:58.146755 sshd[1907]: Accepted publickey for core from 50.85.169.122 port 46060 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:26:58.148167 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:26:58.152847 systemd-logind[1635]: New session 5 of user core. Apr 16 23:26:58.160909 systemd[1]: Started session-5.scope - Session 5 of User core. 
Apr 16 23:26:58.197393 sudo[1911]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Apr 16 23:26:58.197661 sudo[1911]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 16 23:26:58.211098 sudo[1911]: pam_unix(sudo:session): session closed for user root Apr 16 23:26:58.225600 sshd[1910]: Connection closed by 50.85.169.122 port 46060 Apr 16 23:26:58.226106 sshd-session[1907]: pam_unix(sshd:session): session closed for user core Apr 16 23:26:58.230618 systemd[1]: sshd@4-10.0.3.226:22-50.85.169.122:46060.service: Deactivated successfully. Apr 16 23:26:58.232275 systemd[1]: session-5.scope: Deactivated successfully. Apr 16 23:26:58.233059 systemd-logind[1635]: Session 5 logged out. Waiting for processes to exit. Apr 16 23:26:58.234109 systemd-logind[1635]: Removed session 5. Apr 16 23:26:58.251348 systemd[1]: Started sshd@5-10.0.3.226:22-50.85.169.122:46062.service - OpenSSH per-connection server daemon (50.85.169.122:46062). Apr 16 23:26:58.352015 sshd[1917]: Accepted publickey for core from 50.85.169.122 port 46062 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:26:58.353367 sshd-session[1917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:26:58.358138 systemd-logind[1635]: New session 6 of user core. Apr 16 23:26:58.368066 systemd[1]: Started session-6.scope - Session 6 of User core. 
Apr 16 23:26:58.390829 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Apr 16 23:26:58.391091 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 23:26:58.394474 sudo[1922]: pam_unix(sudo:session): session closed for user root
Apr 16 23:26:58.398993 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Apr 16 23:26:58.399503 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 23:26:58.409103 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Apr 16 23:26:58.447024 augenrules[1944]: No rules
Apr 16 23:26:58.448195 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 16 23:26:58.448406 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Apr 16 23:26:58.449306 sudo[1921]: pam_unix(sudo:session): session closed for user root
Apr 16 23:26:58.464560 sshd[1920]: Connection closed by 50.85.169.122 port 46062
Apr 16 23:26:58.463628 sshd-session[1917]: pam_unix(sshd:session): session closed for user core
Apr 16 23:26:58.466340 systemd[1]: sshd@5-10.0.3.226:22-50.85.169.122:46062.service: Deactivated successfully.
Apr 16 23:26:58.467823 systemd[1]: session-6.scope: Deactivated successfully.
Apr 16 23:26:58.468960 systemd-logind[1635]: Session 6 logged out. Waiting for processes to exit.
Apr 16 23:26:58.470030 systemd-logind[1635]: Removed session 6.
Apr 16 23:26:58.488621 systemd[1]: Started sshd@6-10.0.3.226:22-50.85.169.122:46066.service - OpenSSH per-connection server daemon (50.85.169.122:46066).
Apr 16 23:26:58.587838 sshd[1954]: Accepted publickey for core from 50.85.169.122 port 46066 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg
Apr 16 23:26:58.589010 sshd-session[1954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 16 23:26:58.593346 systemd-logind[1635]: New session 7 of user core.
Apr 16 23:26:58.602903 systemd[1]: Started session-7.scope - Session 7 of User core.
Apr 16 23:26:58.625136 sudo[1958]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Apr 16 23:26:58.625401 sudo[1958]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 16 23:26:58.945476 systemd[1]: Starting docker.service - Docker Application Container Engine...
Apr 16 23:26:58.955383 (dockerd)[1979]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Apr 16 23:26:59.181440 dockerd[1979]: time="2026-04-16T23:26:59.181375759Z" level=info msg="Starting up"
Apr 16 23:26:59.182255 dockerd[1979]: time="2026-04-16T23:26:59.182234801Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Apr 16 23:26:59.192204 dockerd[1979]: time="2026-04-16T23:26:59.192166703Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Apr 16 23:26:59.261476 dockerd[1979]: time="2026-04-16T23:26:59.261263660Z" level=info msg="Loading containers: start."
Apr 16 23:26:59.270755 kernel: Initializing XFRM netlink socket
Apr 16 23:26:59.487585 systemd-networkd[1517]: docker0: Link UP
Apr 16 23:26:59.492724 dockerd[1979]: time="2026-04-16T23:26:59.492234626Z" level=info msg="Loading containers: done."
Apr 16 23:26:59.507382 dockerd[1979]: time="2026-04-16T23:26:59.507342500Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Apr 16 23:26:59.507576 dockerd[1979]: time="2026-04-16T23:26:59.507559141Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Apr 16 23:26:59.507700 dockerd[1979]: time="2026-04-16T23:26:59.507685621Z" level=info msg="Initializing buildkit"
Apr 16 23:26:59.533815 dockerd[1979]: time="2026-04-16T23:26:59.533693200Z" level=info msg="Completed buildkit initialization"
Apr 16 23:26:59.538382 dockerd[1979]: time="2026-04-16T23:26:59.538349731Z" level=info msg="Daemon has completed initialization"
Apr 16 23:26:59.538535 dockerd[1979]: time="2026-04-16T23:26:59.538507971Z" level=info msg="API listen on /run/docker.sock"
Apr 16 23:26:59.538634 systemd[1]: Started docker.service - Docker Application Container Engine.
Apr 16 23:27:00.755563 containerd[1656]: time="2026-04-16T23:27:00.755519179Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.7\""
Apr 16 23:27:01.268310 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2099636944.mount: Deactivated successfully.
Apr 16 23:27:01.988449 containerd[1656]: time="2026-04-16T23:27:01.988389583Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:01.989606 containerd[1656]: time="2026-04-16T23:27:01.989562466Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.7: active requests=0, bytes read=24193866"
Apr 16 23:27:01.990424 containerd[1656]: time="2026-04-16T23:27:01.990398148Z" level=info msg="ImageCreate event name:\"sha256:bf3fdee5548e267fd53c67a79d712e896d47f48203512415518d59da7f985228\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:01.993722 containerd[1656]: time="2026-04-16T23:27:01.993694995Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:b96b8464d152a24c81d7f0435fd2198f8486970cd26a9e0e9c20826c73d1441c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:01.995107 containerd[1656]: time="2026-04-16T23:27:01.995047758Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.7\" with image id \"sha256:bf3fdee5548e267fd53c67a79d712e896d47f48203512415518d59da7f985228\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:b96b8464d152a24c81d7f0435fd2198f8486970cd26a9e0e9c20826c73d1441c\", size \"24190367\" in 1.239484979s"
Apr 16 23:27:01.995107 containerd[1656]: time="2026-04-16T23:27:01.995095078Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.7\" returns image reference \"sha256:bf3fdee5548e267fd53c67a79d712e896d47f48203512415518d59da7f985228\""
Apr 16 23:27:01.996182 containerd[1656]: time="2026-04-16T23:27:01.995938200Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.7\""
Apr 16 23:27:02.806998 containerd[1656]: time="2026-04-16T23:27:02.805898082Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:02.806998 containerd[1656]: time="2026-04-16T23:27:02.806847325Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.7: active requests=0, bytes read=18901464"
Apr 16 23:27:02.808345 containerd[1656]: time="2026-04-16T23:27:02.808320888Z" level=info msg="ImageCreate event name:\"sha256:161b12aee2701d72b2e8a7d114f5f83122603d8c5d1d3cd7f72aa6fac5d9524c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:02.812580 containerd[1656]: time="2026-04-16T23:27:02.812536898Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7d759bdc4fef10a3fc1ad60ce9439d58e1a4df7ebb22751f7cc0201ce55f280b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:02.813758 containerd[1656]: time="2026-04-16T23:27:02.813544460Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.7\" with image id \"sha256:161b12aee2701d72b2e8a7d114f5f83122603d8c5d1d3cd7f72aa6fac5d9524c\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7d759bdc4fef10a3fc1ad60ce9439d58e1a4df7ebb22751f7cc0201ce55f280b\", size \"20408083\" in 817.57654ms"
Apr 16 23:27:02.813758 containerd[1656]: time="2026-04-16T23:27:02.813576220Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.7\" returns image reference \"sha256:161b12aee2701d72b2e8a7d114f5f83122603d8c5d1d3cd7f72aa6fac5d9524c\""
Apr 16 23:27:02.814149 containerd[1656]: time="2026-04-16T23:27:02.814052501Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.7\""
Apr 16 23:27:03.548763 containerd[1656]: time="2026-04-16T23:27:03.548291171Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:03.549617 containerd[1656]: time="2026-04-16T23:27:03.549575574Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.7: active requests=0, bytes read=14047965"
Apr 16 23:27:03.550967 containerd[1656]: time="2026-04-16T23:27:03.550936217Z" level=info msg="ImageCreate event name:\"sha256:85bc0b83d6779f309f0f2d8724ee225e2a061dc60b1b127f8a9b8843bad36e14\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:03.554483 containerd[1656]: time="2026-04-16T23:27:03.554445265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:4ab32f707ff84beaac431797999707757b885196b0b9a52d29cb67f95efce7c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:03.556352 containerd[1656]: time="2026-04-16T23:27:03.556268749Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.7\" with image id \"sha256:85bc0b83d6779f309f0f2d8724ee225e2a061dc60b1b127f8a9b8843bad36e14\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:4ab32f707ff84beaac431797999707757b885196b0b9a52d29cb67f95efce7c1\", size \"15554602\" in 742.160648ms"
Apr 16 23:27:03.556352 containerd[1656]: time="2026-04-16T23:27:03.556320589Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.7\" returns image reference \"sha256:85bc0b83d6779f309f0f2d8724ee225e2a061dc60b1b127f8a9b8843bad36e14\""
Apr 16 23:27:03.556786 containerd[1656]: time="2026-04-16T23:27:03.556762950Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.7\""
Apr 16 23:27:04.290295 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1876621501.mount: Deactivated successfully.
Apr 16 23:27:04.453250 containerd[1656]: time="2026-04-16T23:27:04.453175829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:04.454529 containerd[1656]: time="2026-04-16T23:27:04.454498432Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.7: active requests=0, bytes read=22606312"
Apr 16 23:27:04.455539 containerd[1656]: time="2026-04-16T23:27:04.455480234Z" level=info msg="ImageCreate event name:\"sha256:c63683691df94ddfb3e7b1449f68fd9df087b1bda7cdecd1e9292214f6adc745\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:04.458002 containerd[1656]: time="2026-04-16T23:27:04.457966600Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:062519bc0a14769e2f98c6bdff7816a17e6252de3f3c9cb102e6be33fe38d9e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:04.458553 containerd[1656]: time="2026-04-16T23:27:04.458434001Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.7\" with image id \"sha256:c63683691df94ddfb3e7b1449f68fd9df087b1bda7cdecd1e9292214f6adc745\", repo tag \"registry.k8s.io/kube-proxy:v1.34.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:062519bc0a14769e2f98c6bdff7816a17e6252de3f3c9cb102e6be33fe38d9e2\", size \"22605305\" in 901.637411ms"
Apr 16 23:27:04.458553 containerd[1656]: time="2026-04-16T23:27:04.458466241Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.7\" returns image reference \"sha256:c63683691df94ddfb3e7b1449f68fd9df087b1bda7cdecd1e9292214f6adc745\""
Apr 16 23:27:04.458859 containerd[1656]: time="2026-04-16T23:27:04.458839522Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Apr 16 23:27:04.868946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1201451370.mount: Deactivated successfully.
Apr 16 23:27:05.147379 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Apr 16 23:27:05.148801 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 23:27:05.316571 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 23:27:05.330387 (kubelet)[2327]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 16 23:27:05.433329 kubelet[2327]: E0416 23:27:05.433207 2327 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 16 23:27:05.436433 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 16 23:27:05.436613 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 16 23:27:05.438818 systemd[1]: kubelet.service: Consumed 142ms CPU time, 106.9M memory peak.
Apr 16 23:27:05.550463 containerd[1656]: time="2026-04-16T23:27:05.550415045Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:05.552222 containerd[1656]: time="2026-04-16T23:27:05.552191049Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498"
Apr 16 23:27:05.553608 containerd[1656]: time="2026-04-16T23:27:05.553553492Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:05.559226 containerd[1656]: time="2026-04-16T23:27:05.559117104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:05.560433 containerd[1656]: time="2026-04-16T23:27:05.560359587Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.101422185s"
Apr 16 23:27:05.560433 containerd[1656]: time="2026-04-16T23:27:05.560401707Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\""
Apr 16 23:27:05.561404 containerd[1656]: time="2026-04-16T23:27:05.561381510Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Apr 16 23:27:05.951983 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2249718689.mount: Deactivated successfully.
Apr 16 23:27:05.958724 containerd[1656]: time="2026-04-16T23:27:05.958674173Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:05.959769 containerd[1656]: time="2026-04-16T23:27:05.959741216Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729"
Apr 16 23:27:05.961185 containerd[1656]: time="2026-04-16T23:27:05.961146179Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:05.963655 containerd[1656]: time="2026-04-16T23:27:05.963611744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:05.964678 containerd[1656]: time="2026-04-16T23:27:05.964269826Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 402.756916ms"
Apr 16 23:27:05.964678 containerd[1656]: time="2026-04-16T23:27:05.964322026Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Apr 16 23:27:05.964888 containerd[1656]: time="2026-04-16T23:27:05.964857707Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Apr 16 23:27:06.377246 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount745974388.mount: Deactivated successfully.
Apr 16 23:27:06.949540 containerd[1656]: time="2026-04-16T23:27:06.949484267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:06.951388 containerd[1656]: time="2026-04-16T23:27:06.951350391Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=21139756"
Apr 16 23:27:06.954787 containerd[1656]: time="2026-04-16T23:27:06.954755079Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:06.958767 containerd[1656]: time="2026-04-16T23:27:06.958711928Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:06.960428 containerd[1656]: time="2026-04-16T23:27:06.960384611Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"21136588\" in 995.495064ms"
Apr 16 23:27:06.960428 containerd[1656]: time="2026-04-16T23:27:06.960422771Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\""
Apr 16 23:27:11.915185 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 23:27:11.915331 systemd[1]: kubelet.service: Consumed 142ms CPU time, 106.9M memory peak.
Apr 16 23:27:11.917171 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 23:27:11.944178 systemd[1]: Reload requested from client PID 2431 ('systemctl') (unit session-7.scope)...
Apr 16 23:27:11.944195 systemd[1]: Reloading...
Apr 16 23:27:12.016772 zram_generator::config[2474]: No configuration found.
Apr 16 23:27:12.183271 systemd[1]: Reloading finished in 238 ms.
Apr 16 23:27:12.230513 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Apr 16 23:27:12.230588 systemd[1]: kubelet.service: Failed with result 'signal'.
Apr 16 23:27:12.231024 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 23:27:12.231082 systemd[1]: kubelet.service: Consumed 89ms CPU time, 95M memory peak.
Apr 16 23:27:12.232465 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 16 23:27:12.360682 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 16 23:27:12.364208 (kubelet)[2522]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 16 23:27:12.399743 kubelet[2522]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:27:12.399743 kubelet[2522]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:27:12.401555 kubelet[2522]: I0416 23:27:12.401491 2522 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 23:27:13.065939 kubelet[2522]: I0416 23:27:13.065892 2522 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Apr 16 23:27:13.065939 kubelet[2522]: I0416 23:27:13.065924 2522 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 23:27:13.065939 kubelet[2522]: I0416 23:27:13.065945 2522 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Apr 16 23:27:13.065939 kubelet[2522]: I0416 23:27:13.065950 2522 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 23:27:13.066212 kubelet[2522]: I0416 23:27:13.066179 2522 server.go:956] "Client rotation is on, will bootstrap in background"
Apr 16 23:27:13.073682 kubelet[2522]: E0416 23:27:13.073597 2522 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.3.226:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Apr 16 23:27:13.074531 kubelet[2522]: I0416 23:27:13.074490 2522 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 16 23:27:13.077790 kubelet[2522]: I0416 23:27:13.077767 2522 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 23:27:13.080428 kubelet[2522]: I0416 23:27:13.080409 2522 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Apr 16 23:27:13.080654 kubelet[2522]: I0416 23:27:13.080624 2522 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 23:27:13.080817 kubelet[2522]: I0416 23:27:13.080653 2522 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-b2725589f5","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 23:27:13.080918 kubelet[2522]: I0416 23:27:13.080818 2522 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 23:27:13.080918 kubelet[2522]: I0416 23:27:13.080827 2522 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 23:27:13.080960 kubelet[2522]: I0416 23:27:13.080921 2522 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Apr 16 23:27:13.082748 kubelet[2522]: I0416 23:27:13.082706 2522 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 23:27:13.085620 kubelet[2522]: I0416 23:27:13.085600 2522 kubelet.go:475] "Attempting to sync node with API server"
Apr 16 23:27:13.085655 kubelet[2522]: I0416 23:27:13.085625 2522 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 23:27:13.086035 kubelet[2522]: I0416 23:27:13.086020 2522 kubelet.go:387] "Adding apiserver pod source"
Apr 16 23:27:13.086083 kubelet[2522]: I0416 23:27:13.086045 2522 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 23:27:13.086345 kubelet[2522]: E0416 23:27:13.086302 2522 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.3.226:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-b2725589f5&limit=500&resourceVersion=0\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 23:27:13.087173 kubelet[2522]: E0416 23:27:13.087142 2522 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.3.226:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 23:27:13.087424 kubelet[2522]: I0416 23:27:13.087381 2522 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Apr 16 23:27:13.088056 kubelet[2522]: I0416 23:27:13.088032 2522 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 23:27:13.088112 kubelet[2522]: I0416 23:27:13.088068 2522 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Apr 16 23:27:13.088112 kubelet[2522]: W0416 23:27:13.088108 2522 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Apr 16 23:27:13.091482 kubelet[2522]: I0416 23:27:13.091460 2522 server.go:1262] "Started kubelet"
Apr 16 23:27:13.092590 kubelet[2522]: I0416 23:27:13.092017 2522 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 23:27:13.092590 kubelet[2522]: I0416 23:27:13.092070 2522 server_v1.go:49] "podresources" method="list" useActivePods=true
Apr 16 23:27:13.092590 kubelet[2522]: I0416 23:27:13.092344 2522 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 23:27:13.094091 kubelet[2522]: I0416 23:27:13.094056 2522 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 23:27:13.094818 kubelet[2522]: I0416 23:27:13.094784 2522 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 23:27:13.096165 kubelet[2522]: I0416 23:27:13.096142 2522 server.go:310] "Adding debug handlers to kubelet server"
Apr 16 23:27:13.101491 kubelet[2522]: I0416 23:27:13.101472 2522 volume_manager.go:313] "Starting Kubelet Volume Manager"
Apr 16 23:27:13.101884 kubelet[2522]: I0416 23:27:13.101861 2522 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Apr 16 23:27:13.102217 kubelet[2522]: E0416 23:27:13.101879 2522 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-b2725589f5\" not found"
Apr 16 23:27:13.102379 kubelet[2522]: I0416 23:27:13.102366 2522 reconciler.go:29] "Reconciler: start to sync state"
Apr 16 23:27:13.103535 kubelet[2522]: E0416 23:27:13.103495 2522 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.3.226:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 23:27:13.103617 kubelet[2522]: E0416 23:27:13.103591 2522 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-b2725589f5?timeout=10s\": dial tcp 10.0.3.226:6443: connect: connection refused" interval="200ms"
Apr 16 23:27:13.104556 kubelet[2522]: I0416 23:27:13.104527 2522 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 16 23:27:13.104846 kubelet[2522]: I0416 23:27:13.104809 2522 factory.go:223] Registration of the systemd container factory successfully
Apr 16 23:27:13.105574 kubelet[2522]: I0416 23:27:13.105549 2522 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 16 23:27:13.108427 kubelet[2522]: I0416 23:27:13.108035 2522 factory.go:223] Registration of the containerd container factory successfully
Apr 16 23:27:13.109748 kubelet[2522]: E0416 23:27:13.107904 2522 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.3.226:6443/api/v1/namespaces/default/events\": dial tcp 10.0.3.226:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-b2725589f5.18a6fa0c23ccfc1b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-b2725589f5,UID:ci-4459-2-4-n-b2725589f5,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-b2725589f5,},FirstTimestamp:2026-04-16 23:27:13.091427355 +0000 UTC m=+0.724545449,LastTimestamp:2026-04-16 23:27:13.091427355 +0000 UTC m=+0.724545449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-b2725589f5,}"
Apr 16 23:27:13.111752 kubelet[2522]: E0416 23:27:13.111713 2522 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Apr 16 23:27:13.116396 kubelet[2522]: I0416 23:27:13.116364 2522 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Apr 16 23:27:13.117895 kubelet[2522]: I0416 23:27:13.117876 2522 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Apr 16 23:27:13.118017 kubelet[2522]: I0416 23:27:13.118006 2522 status_manager.go:244] "Starting to sync pod status with apiserver"
Apr 16 23:27:13.118434 kubelet[2522]: I0416 23:27:13.118202 2522 kubelet.go:2428] "Starting kubelet main sync loop"
Apr 16 23:27:13.118571 kubelet[2522]: E0416 23:27:13.118539 2522 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.3.226:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 16 23:27:13.118668 kubelet[2522]: E0416 23:27:13.118647 2522 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 16 23:27:13.119329 kubelet[2522]: I0416 23:27:13.119221 2522 cpu_manager.go:221] "Starting CPU manager" policy="none"
Apr 16 23:27:13.119329 kubelet[2522]: I0416 23:27:13.119235 2522 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Apr 16 23:27:13.119329 kubelet[2522]: I0416 23:27:13.119251 2522 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 23:27:13.122362 kubelet[2522]: I0416 23:27:13.122316 2522 policy_none.go:49] "None policy: Start"
Apr 16 23:27:13.122362 kubelet[2522]: I0416 23:27:13.122342 2522 memory_manager.go:187] "Starting memorymanager" policy="None"
Apr 16 23:27:13.122362 kubelet[2522]: I0416 23:27:13.122353 2522 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Apr 16 23:27:13.124514 kubelet[2522]: I0416 23:27:13.124483 2522 policy_none.go:47] "Start"
Apr 16 23:27:13.128267 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Apr 16 23:27:13.141543 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Apr 16 23:27:13.144322 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Apr 16 23:27:13.151973 kubelet[2522]: E0416 23:27:13.151827 2522 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 23:27:13.152031 kubelet[2522]: I0416 23:27:13.151999 2522 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 23:27:13.152031 kubelet[2522]: I0416 23:27:13.152010 2522 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 23:27:13.152248 kubelet[2522]: I0416 23:27:13.152211 2522 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 23:27:13.153839 kubelet[2522]: E0416 23:27:13.153793 2522 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Apr 16 23:27:13.153908 kubelet[2522]: E0416 23:27:13.153859 2522 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-b2725589f5\" not found"
Apr 16 23:27:13.228607 systemd[1]: Created slice kubepods-burstable-poda1e3414c0e0f231e941f0dc3f4920754.slice - libcontainer container kubepods-burstable-poda1e3414c0e0f231e941f0dc3f4920754.slice.
Apr 16 23:27:13.235529 kubelet[2522]: E0416 23:27:13.235500 2522 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.239962 systemd[1]: Created slice kubepods-burstable-pod0414c4d62ed2bbea83e972726c85bf68.slice - libcontainer container kubepods-burstable-pod0414c4d62ed2bbea83e972726c85bf68.slice.
Apr 16 23:27:13.241311 kubelet[2522]: E0416 23:27:13.241266 2522 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.249483 systemd[1]: Created slice kubepods-burstable-poda71c6ecaf836f244c28e0b3995360d6f.slice - libcontainer container kubepods-burstable-poda71c6ecaf836f244c28e0b3995360d6f.slice.
Apr 16 23:27:13.251173 kubelet[2522]: E0416 23:27:13.251125 2522 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.254662 kubelet[2522]: I0416 23:27:13.254638 2522 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.255118 kubelet[2522]: E0416 23:27:13.255076 2522 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.226:6443/api/v1/nodes\": dial tcp 10.0.3.226:6443: connect: connection refused" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304477 kubelet[2522]: I0416 23:27:13.304404 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304477 kubelet[2522]: I0416 23:27:13.304461 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a71c6ecaf836f244c28e0b3995360d6f-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-b2725589f5\" (UID: \"a71c6ecaf836f244c28e0b3995360d6f\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304698 kubelet[2522]: I0416 23:27:13.304487 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a1e3414c0e0f231e941f0dc3f4920754-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" (UID: \"a1e3414c0e0f231e941f0dc3f4920754\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304698 kubelet[2522]: I0416 23:27:13.304516 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304698 kubelet[2522]: I0416 23:27:13.304546 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304698 kubelet[2522]: I0416 23:27:13.304570 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.304698 kubelet[2522]: I0416 23:27:13.304599 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.305056 kubelet[2522]: I0416 23:27:13.304625 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a1e3414c0e0f231e941f0dc3f4920754-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" (UID: \"a1e3414c0e0f231e941f0dc3f4920754\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.305056 kubelet[2522]: E0416 23:27:13.304652 2522 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-b2725589f5?timeout=10s\": dial tcp 10.0.3.226:6443: connect: connection refused" interval="400ms"
Apr 16 23:27:13.305056 kubelet[2522]: I0416 23:27:13.304665 2522 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a1e3414c0e0f231e941f0dc3f4920754-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" (UID: \"a1e3414c0e0f231e941f0dc3f4920754\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.457497 kubelet[2522]: I0416 23:27:13.457127 2522 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.457866 kubelet[2522]: E0416 23:27:13.457534 2522 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.226:6443/api/v1/nodes\": dial tcp 10.0.3.226:6443: connect: connection refused" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:13.539395 containerd[1656]: time="2026-04-16T23:27:13.539348214Z" level=info msg="RunPodSandbox for
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-b2725589f5,Uid:a1e3414c0e0f231e941f0dc3f4920754,Namespace:kube-system,Attempt:0,}" Apr 16 23:27:13.545258 containerd[1656]: time="2026-04-16T23:27:13.545015627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-b2725589f5,Uid:0414c4d62ed2bbea83e972726c85bf68,Namespace:kube-system,Attempt:0,}" Apr 16 23:27:13.555312 containerd[1656]: time="2026-04-16T23:27:13.555282410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-b2725589f5,Uid:a71c6ecaf836f244c28e0b3995360d6f,Namespace:kube-system,Attempt:0,}" Apr 16 23:27:13.706304 kubelet[2522]: E0416 23:27:13.706187 2522 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-b2725589f5?timeout=10s\": dial tcp 10.0.3.226:6443: connect: connection refused" interval="800ms" Apr 16 23:27:13.859336 kubelet[2522]: I0416 23:27:13.859227 2522 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:13.859570 kubelet[2522]: E0416 23:27:13.859536 2522 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.226:6443/api/v1/nodes\": dial tcp 10.0.3.226:6443: connect: connection refused" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:13.914883 kubelet[2522]: E0416 23:27:13.914772 2522 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.3.226:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-b2725589f5&limit=500&resourceVersion=0\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 23:27:14.001378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2419043935.mount: Deactivated successfully. 
Apr 16 23:27:14.012137 containerd[1656]: time="2026-04-16T23:27:14.012076249Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 23:27:14.016224 containerd[1656]: time="2026-04-16T23:27:14.016149138Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Apr 16 23:27:14.020358 containerd[1656]: time="2026-04-16T23:27:14.020276787Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 23:27:14.021704 containerd[1656]: time="2026-04-16T23:27:14.021652391Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 23:27:14.024686 containerd[1656]: time="2026-04-16T23:27:14.024644237Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Apr 16 23:27:14.026019 containerd[1656]: time="2026-04-16T23:27:14.025970760Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 23:27:14.028076 containerd[1656]: time="2026-04-16T23:27:14.028006965Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Apr 16 23:27:14.029590 containerd[1656]: time="2026-04-16T23:27:14.029544648Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 16 
23:27:14.030532 containerd[1656]: time="2026-04-16T23:27:14.030485771Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 489.089553ms" Apr 16 23:27:14.034036 containerd[1656]: time="2026-04-16T23:27:14.034004459Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 476.868165ms" Apr 16 23:27:14.034657 containerd[1656]: time="2026-04-16T23:27:14.034613300Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 488.290471ms" Apr 16 23:27:14.053468 containerd[1656]: time="2026-04-16T23:27:14.053386983Z" level=info msg="connecting to shim abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5" address="unix:///run/containerd/s/6599331a5eb4207260a7785351372b0ef0e720b35df80ad3ce2a8ad4b5523069" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:14.064320 containerd[1656]: time="2026-04-16T23:27:14.064272847Z" level=info msg="connecting to shim 408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11" address="unix:///run/containerd/s/d5b4dcdf25a0938c705e92b16d6bfdefa0c82b9feaabeed974a850c7a2df19d0" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:14.075040 containerd[1656]: time="2026-04-16T23:27:14.074992792Z" level=info msg="connecting to shim 
018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391" address="unix:///run/containerd/s/9001d3af19fab49d13b013a1164cf6c44da1ea6a8d728d37b979197742b5bdbf" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:14.080995 systemd[1]: Started cri-containerd-abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5.scope - libcontainer container abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5. Apr 16 23:27:14.086727 systemd[1]: Started cri-containerd-408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11.scope - libcontainer container 408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11. Apr 16 23:27:14.108020 systemd[1]: Started cri-containerd-018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391.scope - libcontainer container 018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391. Apr 16 23:27:14.128433 containerd[1656]: time="2026-04-16T23:27:14.128225793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-b2725589f5,Uid:a1e3414c0e0f231e941f0dc3f4920754,Namespace:kube-system,Attempt:0,} returns sandbox id \"abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5\"" Apr 16 23:27:14.137929 containerd[1656]: time="2026-04-16T23:27:14.137894135Z" level=info msg="CreateContainer within sandbox \"abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Apr 16 23:27:14.152078 containerd[1656]: time="2026-04-16T23:27:14.151585766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-b2725589f5,Uid:a71c6ecaf836f244c28e0b3995360d6f,Namespace:kube-system,Attempt:0,} returns sandbox id \"408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11\"" Apr 16 23:27:14.156197 containerd[1656]: time="2026-04-16T23:27:14.156165456Z" level=info msg="Container 21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a: CDI devices from CRI 
Config.CDIDevices: []" Apr 16 23:27:14.156288 containerd[1656]: time="2026-04-16T23:27:14.156177096Z" level=info msg="CreateContainer within sandbox \"408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Apr 16 23:27:14.157452 containerd[1656]: time="2026-04-16T23:27:14.157420699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-b2725589f5,Uid:0414c4d62ed2bbea83e972726c85bf68,Namespace:kube-system,Attempt:0,} returns sandbox id \"018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391\"" Apr 16 23:27:14.163115 containerd[1656]: time="2026-04-16T23:27:14.163046712Z" level=info msg="CreateContainer within sandbox \"018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Apr 16 23:27:14.167614 containerd[1656]: time="2026-04-16T23:27:14.167563402Z" level=info msg="CreateContainer within sandbox \"abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a\"" Apr 16 23:27:14.168304 containerd[1656]: time="2026-04-16T23:27:14.168107404Z" level=info msg="StartContainer for \"21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a\"" Apr 16 23:27:14.169127 containerd[1656]: time="2026-04-16T23:27:14.169092366Z" level=info msg="connecting to shim 21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a" address="unix:///run/containerd/s/6599331a5eb4207260a7785351372b0ef0e720b35df80ad3ce2a8ad4b5523069" protocol=ttrpc version=3 Apr 16 23:27:14.170755 containerd[1656]: time="2026-04-16T23:27:14.170527929Z" level=info msg="Container c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:14.181413 containerd[1656]: 
time="2026-04-16T23:27:14.180698232Z" level=info msg="Container 9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:14.186969 systemd[1]: Started cri-containerd-21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a.scope - libcontainer container 21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a. Apr 16 23:27:14.187612 containerd[1656]: time="2026-04-16T23:27:14.186912046Z" level=info msg="CreateContainer within sandbox \"408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51\"" Apr 16 23:27:14.188403 containerd[1656]: time="2026-04-16T23:27:14.188373210Z" level=info msg="StartContainer for \"c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51\"" Apr 16 23:27:14.190763 containerd[1656]: time="2026-04-16T23:27:14.190337574Z" level=info msg="connecting to shim c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51" address="unix:///run/containerd/s/d5b4dcdf25a0938c705e92b16d6bfdefa0c82b9feaabeed974a850c7a2df19d0" protocol=ttrpc version=3 Apr 16 23:27:14.192727 containerd[1656]: time="2026-04-16T23:27:14.192598219Z" level=info msg="CreateContainer within sandbox \"018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2\"" Apr 16 23:27:14.193176 containerd[1656]: time="2026-04-16T23:27:14.193149781Z" level=info msg="StartContainer for \"9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2\"" Apr 16 23:27:14.194370 containerd[1656]: time="2026-04-16T23:27:14.194324903Z" level=info msg="connecting to shim 9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2" 
address="unix:///run/containerd/s/9001d3af19fab49d13b013a1164cf6c44da1ea6a8d728d37b979197742b5bdbf" protocol=ttrpc version=3 Apr 16 23:27:14.208940 systemd[1]: Started cri-containerd-c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51.scope - libcontainer container c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51. Apr 16 23:27:14.214023 systemd[1]: Started cri-containerd-9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2.scope - libcontainer container 9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2. Apr 16 23:27:14.237282 containerd[1656]: time="2026-04-16T23:27:14.237234641Z" level=info msg="StartContainer for \"21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a\" returns successfully" Apr 16 23:27:14.266813 kubelet[2522]: E0416 23:27:14.265398 2522 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.3.226:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.3.226:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 23:27:14.267940 containerd[1656]: time="2026-04-16T23:27:14.267829350Z" level=info msg="StartContainer for \"c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51\" returns successfully" Apr 16 23:27:14.272043 containerd[1656]: time="2026-04-16T23:27:14.271859600Z" level=info msg="StartContainer for \"9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2\" returns successfully" Apr 16 23:27:14.662260 kubelet[2522]: I0416 23:27:14.662219 2522 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.128758 kubelet[2522]: E0416 23:27:15.128622 2522 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.132202 
kubelet[2522]: E0416 23:27:15.132173 2522 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.133312 kubelet[2522]: E0416 23:27:15.133285 2522 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.471379 kubelet[2522]: E0416 23:27:15.471340 2522 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-b2725589f5\" not found" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.541679 kubelet[2522]: I0416 23:27:15.541447 2522 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.541679 kubelet[2522]: E0416 23:27:15.541489 2522 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-4-n-b2725589f5\": node \"ci-4459-2-4-n-b2725589f5\" not found" Apr 16 23:27:15.554663 kubelet[2522]: E0416 23:27:15.554634 2522 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-b2725589f5\" not found" Apr 16 23:27:15.656203 kubelet[2522]: E0416 23:27:15.655513 2522 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-b2725589f5\" not found" Apr 16 23:27:15.803867 kubelet[2522]: I0416 23:27:15.803268 2522 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.809946 kubelet[2522]: E0416 23:27:15.809908 2522 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.809946 
kubelet[2522]: I0416 23:27:15.809937 2522 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.811432 kubelet[2522]: E0416 23:27:15.811411 2522 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-b2725589f5\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.811472 kubelet[2522]: I0416 23:27:15.811435 2522 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:15.813005 kubelet[2522]: E0416 23:27:15.812985 2522 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:16.088073 kubelet[2522]: I0416 23:27:16.087869 2522 apiserver.go:52] "Watching apiserver" Apr 16 23:27:16.102649 kubelet[2522]: I0416 23:27:16.102592 2522 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 16 23:27:16.133520 kubelet[2522]: I0416 23:27:16.133312 2522 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:16.133520 kubelet[2522]: I0416 23:27:16.133402 2522 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:16.135417 kubelet[2522]: E0416 23:27:16.135373 2522 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-b2725589f5\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:16.135745 kubelet[2522]: E0416 23:27:16.135580 2522 kubelet.go:3222] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4459-2-4-n-b2725589f5\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5" Apr 16 23:27:17.823919 systemd[1]: Reload requested from client PID 2813 ('systemctl') (unit session-7.scope)... Apr 16 23:27:17.823942 systemd[1]: Reloading... Apr 16 23:27:17.895826 zram_generator::config[2862]: No configuration found. Apr 16 23:27:18.065026 systemd[1]: Reloading finished in 240 ms. Apr 16 23:27:18.083507 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 23:27:18.100353 systemd[1]: kubelet.service: Deactivated successfully. Apr 16 23:27:18.100615 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 23:27:18.100678 systemd[1]: kubelet.service: Consumed 1.101s CPU time, 124.3M memory peak. Apr 16 23:27:18.102930 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 16 23:27:18.245575 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 16 23:27:18.259079 (kubelet)[2901]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 16 23:27:18.297292 kubelet[2901]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 23:27:18.297292 kubelet[2901]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 16 23:27:18.297693 kubelet[2901]: I0416 23:27:18.297301 2901 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 23:27:18.303760 kubelet[2901]: I0416 23:27:18.303184 2901 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Apr 16 23:27:18.303760 kubelet[2901]: I0416 23:27:18.303215 2901 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 23:27:18.303760 kubelet[2901]: I0416 23:27:18.303239 2901 watchdog_linux.go:95] "Systemd watchdog is not enabled" Apr 16 23:27:18.303760 kubelet[2901]: I0416 23:27:18.303245 2901 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 23:27:18.303760 kubelet[2901]: I0416 23:27:18.303464 2901 server.go:956] "Client rotation is on, will bootstrap in background" Apr 16 23:27:18.305098 kubelet[2901]: I0416 23:27:18.305053 2901 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Apr 16 23:27:18.307224 kubelet[2901]: I0416 23:27:18.307118 2901 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 16 23:27:18.313337 kubelet[2901]: I0416 23:27:18.312910 2901 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 23:27:18.315935 kubelet[2901]: I0416 23:27:18.315908 2901 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Apr 16 23:27:18.316157 kubelet[2901]: I0416 23:27:18.316132 2901 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 23:27:18.316303 kubelet[2901]: I0416 23:27:18.316156 2901 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-b2725589f5","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 23:27:18.316379 kubelet[2901]: I0416 23:27:18.316304 2901 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 
23:27:18.316379 kubelet[2901]: I0416 23:27:18.316313 2901 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 23:27:18.316379 kubelet[2901]: I0416 23:27:18.316336 2901 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Apr 16 23:27:18.316532 kubelet[2901]: I0416 23:27:18.316518 2901 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:27:18.316674 kubelet[2901]: I0416 23:27:18.316663 2901 kubelet.go:475] "Attempting to sync node with API server" Apr 16 23:27:18.316697 kubelet[2901]: I0416 23:27:18.316679 2901 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 23:27:18.316718 kubelet[2901]: I0416 23:27:18.316703 2901 kubelet.go:387] "Adding apiserver pod source" Apr 16 23:27:18.316718 kubelet[2901]: I0416 23:27:18.316713 2901 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 23:27:18.317519 kubelet[2901]: I0416 23:27:18.317497 2901 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Apr 16 23:27:18.318073 kubelet[2901]: I0416 23:27:18.318050 2901 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 23:27:18.318125 kubelet[2901]: I0416 23:27:18.318084 2901 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Apr 16 23:27:18.322978 kubelet[2901]: I0416 23:27:18.322955 2901 server.go:1262] "Started kubelet" Apr 16 23:27:18.324171 kubelet[2901]: I0416 23:27:18.324122 2901 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 23:27:18.326090 kubelet[2901]: I0416 23:27:18.326058 2901 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 23:27:18.327080 kubelet[2901]: I0416 23:27:18.327051 2901 volume_manager.go:313] "Starting Kubelet 
Volume Manager" Apr 16 23:27:18.327145 kubelet[2901]: I0416 23:27:18.327129 2901 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 16 23:27:18.327254 kubelet[2901]: I0416 23:27:18.327235 2901 reconciler.go:29] "Reconciler: start to sync state" Apr 16 23:27:18.329602 kubelet[2901]: E0416 23:27:18.329259 2901 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-b2725589f5\" not found" Apr 16 23:27:18.329602 kubelet[2901]: I0416 23:27:18.329466 2901 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 23:27:18.329602 kubelet[2901]: I0416 23:27:18.329527 2901 server_v1.go:49] "podresources" method="list" useActivePods=true Apr 16 23:27:18.329715 kubelet[2901]: I0416 23:27:18.329616 2901 server.go:310] "Adding debug handlers to kubelet server" Apr 16 23:27:18.329924 kubelet[2901]: I0416 23:27:18.329898 2901 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 23:27:18.332772 kubelet[2901]: I0416 23:27:18.332273 2901 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 16 23:27:18.338870 kubelet[2901]: I0416 23:27:18.338752 2901 factory.go:223] Registration of the systemd container factory successfully Apr 16 23:27:18.340763 kubelet[2901]: I0416 23:27:18.339397 2901 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 16 23:27:18.345425 kubelet[2901]: I0416 23:27:18.345378 2901 factory.go:223] Registration of the containerd container factory successfully Apr 16 23:27:18.349878 kubelet[2901]: E0416 23:27:18.349840 2901 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Apr 16 23:27:18.351290 kubelet[2901]: I0416 23:27:18.351248 2901 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Apr 16 23:27:18.363451 kubelet[2901]: I0416 23:27:18.363416 2901 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Apr 16 23:27:18.363451 kubelet[2901]: I0416 23:27:18.363445 2901 status_manager.go:244] "Starting to sync pod status with apiserver"
Apr 16 23:27:18.363597 kubelet[2901]: I0416 23:27:18.363471 2901 kubelet.go:2428] "Starting kubelet main sync loop"
Apr 16 23:27:18.363597 kubelet[2901]: E0416 23:27:18.363519 2901 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 16 23:27:18.384281 kubelet[2901]: I0416 23:27:18.384214 2901 cpu_manager.go:221] "Starting CPU manager" policy="none"
Apr 16 23:27:18.384281 kubelet[2901]: I0416 23:27:18.384236 2901 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Apr 16 23:27:18.384426 kubelet[2901]: I0416 23:27:18.384322 2901 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 23:27:18.384828 kubelet[2901]: I0416 23:27:18.384798 2901 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Apr 16 23:27:18.384873 kubelet[2901]: I0416 23:27:18.384826 2901 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Apr 16 23:27:18.384873 kubelet[2901]: I0416 23:27:18.384850 2901 policy_none.go:49] "None policy: Start"
Apr 16 23:27:18.384873 kubelet[2901]: I0416 23:27:18.384858 2901 memory_manager.go:187] "Starting memorymanager" policy="None"
Apr 16 23:27:18.384873 kubelet[2901]: I0416 23:27:18.384871 2901 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Apr 16 23:27:18.384997 kubelet[2901]: I0416 23:27:18.384980 2901 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Apr 16 23:27:18.384997 kubelet[2901]: I0416 23:27:18.384995 2901 policy_none.go:47] "Start"
Apr 16 23:27:18.390491 kubelet[2901]: E0416 23:27:18.390201 2901 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 23:27:18.390491 kubelet[2901]: I0416 23:27:18.390376 2901 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 23:27:18.390491 kubelet[2901]: I0416 23:27:18.390387 2901 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 23:27:18.390825 kubelet[2901]: I0416 23:27:18.390808 2901 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 23:27:18.393181 kubelet[2901]: E0416 23:27:18.393154 2901 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Apr 16 23:27:18.464593 kubelet[2901]: I0416 23:27:18.464556 2901 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.465026 kubelet[2901]: I0416 23:27:18.465013 2901 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.465355 kubelet[2901]: I0416 23:27:18.465341 2901 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.492862 kubelet[2901]: I0416 23:27:18.492828 2901 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.502931 kubelet[2901]: I0416 23:27:18.502873 2901 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.503952 kubelet[2901]: I0416 23:27:18.503915 2901 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.628845 kubelet[2901]: I0416 23:27:18.628663 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a71c6ecaf836f244c28e0b3995360d6f-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-b2725589f5\" (UID: \"a71c6ecaf836f244c28e0b3995360d6f\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.628845 kubelet[2901]: I0416 23:27:18.628706 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a1e3414c0e0f231e941f0dc3f4920754-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" (UID: \"a1e3414c0e0f231e941f0dc3f4920754\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.628845 kubelet[2901]: I0416 23:27:18.628726 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.628845 kubelet[2901]: I0416 23:27:18.628766 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.628845 kubelet[2901]: I0416 23:27:18.628783 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.629102 kubelet[2901]: I0416 23:27:18.629018 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a1e3414c0e0f231e941f0dc3f4920754-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" (UID: \"a1e3414c0e0f231e941f0dc3f4920754\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.629102 kubelet[2901]: I0416 23:27:18.629052 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a1e3414c0e0f231e941f0dc3f4920754-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-b2725589f5\" (UID: \"a1e3414c0e0f231e941f0dc3f4920754\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.629102 kubelet[2901]: I0416 23:27:18.629069 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:18.629102 kubelet[2901]: I0416 23:27:18.629086 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0414c4d62ed2bbea83e972726c85bf68-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" (UID: \"0414c4d62ed2bbea83e972726c85bf68\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:19.318037 kubelet[2901]: I0416 23:27:19.317995 2901 apiserver.go:52] "Watching apiserver"
Apr 16 23:27:19.327249 kubelet[2901]: I0416 23:27:19.327202 2901 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Apr 16 23:27:19.369525 kubelet[2901]: I0416 23:27:19.369450 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5" podStartSLOduration=1.369432233 podStartE2EDuration="1.369432233s" podCreationTimestamp="2026-04-16 23:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:27:19.358761209 +0000 UTC m=+1.096495975" watchObservedRunningTime="2026-04-16 23:27:19.369432233 +0000 UTC m=+1.107166999"
Apr 16 23:27:19.380997 kubelet[2901]: I0416 23:27:19.380900 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-b2725589f5" podStartSLOduration=1.380883579 podStartE2EDuration="1.380883579s" podCreationTimestamp="2026-04-16 23:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:27:19.369961114 +0000 UTC m=+1.107695840" watchObservedRunningTime="2026-04-16 23:27:19.380883579 +0000 UTC m=+1.118618345"
Apr 16 23:27:19.381367 kubelet[2901]: I0416 23:27:19.381250 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5" podStartSLOduration=1.3812413399999999 podStartE2EDuration="1.38124134s" podCreationTimestamp="2026-04-16 23:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:27:19.380837139 +0000 UTC m=+1.118571905" watchObservedRunningTime="2026-04-16 23:27:19.38124134 +0000 UTC m=+1.118976106"
Apr 16 23:27:19.383140 kubelet[2901]: I0416 23:27:19.382706 2901 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:19.383394 kubelet[2901]: I0416 23:27:19.383346 2901 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:19.391594 kubelet[2901]: E0416 23:27:19.391549 2901 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-b2725589f5\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:19.392690 kubelet[2901]: E0416 23:27:19.392670 2901 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-b2725589f5\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-b2725589f5"
Apr 16 23:27:23.864620 kubelet[2901]: I0416 23:27:23.864586 2901 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Apr 16 23:27:23.865320 containerd[1656]: time="2026-04-16T23:27:23.865284098Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Apr 16 23:27:23.865533 kubelet[2901]: I0416 23:27:23.865484 2901 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Apr 16 23:27:24.554127 systemd[1]: Created slice kubepods-besteffort-podcdb2fddc_b9b1_403d_b6e9_c65a43b3c556.slice - libcontainer container kubepods-besteffort-podcdb2fddc_b9b1_403d_b6e9_c65a43b3c556.slice.
Apr 16 23:27:24.564014 kubelet[2901]: I0416 23:27:24.563913 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/cdb2fddc-b9b1-403d-b6e9-c65a43b3c556-kube-proxy\") pod \"kube-proxy-hrwzj\" (UID: \"cdb2fddc-b9b1-403d-b6e9-c65a43b3c556\") " pod="kube-system/kube-proxy-hrwzj"
Apr 16 23:27:24.564014 kubelet[2901]: I0416 23:27:24.563952 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cdb2fddc-b9b1-403d-b6e9-c65a43b3c556-lib-modules\") pod \"kube-proxy-hrwzj\" (UID: \"cdb2fddc-b9b1-403d-b6e9-c65a43b3c556\") " pod="kube-system/kube-proxy-hrwzj"
Apr 16 23:27:24.564014 kubelet[2901]: I0416 23:27:24.563970 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnd59\" (UniqueName: \"kubernetes.io/projected/cdb2fddc-b9b1-403d-b6e9-c65a43b3c556-kube-api-access-jnd59\") pod \"kube-proxy-hrwzj\" (UID: \"cdb2fddc-b9b1-403d-b6e9-c65a43b3c556\") " pod="kube-system/kube-proxy-hrwzj"
Apr 16 23:27:24.564014 kubelet[2901]: I0416 23:27:24.563988 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/cdb2fddc-b9b1-403d-b6e9-c65a43b3c556-xtables-lock\") pod \"kube-proxy-hrwzj\" (UID: \"cdb2fddc-b9b1-403d-b6e9-c65a43b3c556\") " pod="kube-system/kube-proxy-hrwzj"
Apr 16 23:27:24.869982 containerd[1656]: time="2026-04-16T23:27:24.869878462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hrwzj,Uid:cdb2fddc-b9b1-403d-b6e9-c65a43b3c556,Namespace:kube-system,Attempt:0,}"
Apr 16 23:27:24.896493 containerd[1656]: time="2026-04-16T23:27:24.896448843Z" level=info msg="connecting to shim 8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443" address="unix:///run/containerd/s/e4322dff167c59dce2997c4c63348d94afb2f9132dbae04e8f9275033cb8c1bb" namespace=k8s.io protocol=ttrpc version=3
Apr 16 23:27:24.918131 systemd[1]: Started cri-containerd-8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443.scope - libcontainer container 8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443.
Apr 16 23:27:24.938762 containerd[1656]: time="2026-04-16T23:27:24.938708899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hrwzj,Uid:cdb2fddc-b9b1-403d-b6e9-c65a43b3c556,Namespace:kube-system,Attempt:0,} returns sandbox id \"8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443\""
Apr 16 23:27:24.945529 containerd[1656]: time="2026-04-16T23:27:24.945490474Z" level=info msg="CreateContainer within sandbox \"8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Apr 16 23:27:24.957752 containerd[1656]: time="2026-04-16T23:27:24.957708142Z" level=info msg="Container 56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff: CDI devices from CRI Config.CDIDevices: []"
Apr 16 23:27:24.969445 containerd[1656]: time="2026-04-16T23:27:24.969393009Z" level=info msg="CreateContainer within sandbox \"8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff\""
Apr 16 23:27:24.970422 containerd[1656]: time="2026-04-16T23:27:24.970396011Z" level=info msg="StartContainer for \"56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff\""
Apr 16 23:27:24.973791 containerd[1656]: time="2026-04-16T23:27:24.973759899Z" level=info msg="connecting to shim 56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff" address="unix:///run/containerd/s/e4322dff167c59dce2997c4c63348d94afb2f9132dbae04e8f9275033cb8c1bb" protocol=ttrpc version=3
Apr 16 23:27:24.992935 systemd[1]: Started cri-containerd-56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff.scope - libcontainer container 56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff.
Apr 16 23:27:25.034630 systemd[1]: Created slice kubepods-besteffort-pod867ecad7_0cd8_46ca_b446_c47691f2f99d.slice - libcontainer container kubepods-besteffort-pod867ecad7_0cd8_46ca_b446_c47691f2f99d.slice.
Apr 16 23:27:25.065797 containerd[1656]: time="2026-04-16T23:27:25.062870261Z" level=info msg="StartContainer for \"56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff\" returns successfully"
Apr 16 23:27:25.069145 kubelet[2901]: I0416 23:27:25.069104 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65269\" (UniqueName: \"kubernetes.io/projected/867ecad7-0cd8-46ca-b446-c47691f2f99d-kube-api-access-65269\") pod \"tigera-operator-5588576f44-d5vm6\" (UID: \"867ecad7-0cd8-46ca-b446-c47691f2f99d\") " pod="tigera-operator/tigera-operator-5588576f44-d5vm6"
Apr 16 23:27:25.069478 kubelet[2901]: I0416 23:27:25.069182 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/867ecad7-0cd8-46ca-b446-c47691f2f99d-var-lib-calico\") pod \"tigera-operator-5588576f44-d5vm6\" (UID: \"867ecad7-0cd8-46ca-b446-c47691f2f99d\") " pod="tigera-operator/tigera-operator-5588576f44-d5vm6"
Apr 16 23:27:25.340456 containerd[1656]: time="2026-04-16T23:27:25.340411052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-d5vm6,Uid:867ecad7-0cd8-46ca-b446-c47691f2f99d,Namespace:tigera-operator,Attempt:0,}"
Apr 16 23:27:25.360006 containerd[1656]: time="2026-04-16T23:27:25.359925777Z" level=info msg="connecting to shim 47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32" address="unix:///run/containerd/s/dde29e35f05d509134bcad93245f417c05a27854bad4c991013c05cf5bb9f722" namespace=k8s.io protocol=ttrpc version=3
Apr 16 23:27:25.378882 systemd[1]: Started cri-containerd-47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32.scope - libcontainer container 47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32.
Apr 16 23:27:25.409026 kubelet[2901]: I0416 23:27:25.408925 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hrwzj" podStartSLOduration=1.408907008 podStartE2EDuration="1.408907008s" podCreationTimestamp="2026-04-16 23:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:27:25.407939286 +0000 UTC m=+7.145674092" watchObservedRunningTime="2026-04-16 23:27:25.408907008 +0000 UTC m=+7.146641774"
Apr 16 23:27:25.415798 containerd[1656]: time="2026-04-16T23:27:25.415664704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-d5vm6,Uid:867ecad7-0cd8-46ca-b446-c47691f2f99d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\""
Apr 16 23:27:25.419266 containerd[1656]: time="2026-04-16T23:27:25.419225392Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Apr 16 23:27:26.865149 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1119511894.mount: Deactivated successfully.
Apr 16 23:27:27.505268 containerd[1656]: time="2026-04-16T23:27:27.505218896Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:27.507263 containerd[1656]: time="2026-04-16T23:27:27.507228900Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565"
Apr 16 23:27:27.509317 containerd[1656]: time="2026-04-16T23:27:27.509262105Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:27.512757 containerd[1656]: time="2026-04-16T23:27:27.511798111Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 16 23:27:27.513142 containerd[1656]: time="2026-04-16T23:27:27.513117474Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.093851522s"
Apr 16 23:27:27.513254 containerd[1656]: time="2026-04-16T23:27:27.513239154Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\""
Apr 16 23:27:27.517488 containerd[1656]: time="2026-04-16T23:27:27.517410083Z" level=info msg="CreateContainer within sandbox \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Apr 16 23:27:27.531676 containerd[1656]: time="2026-04-16T23:27:27.531636116Z" level=info msg="Container 90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20: CDI devices from CRI Config.CDIDevices: []"
Apr 16 23:27:27.538080 containerd[1656]: time="2026-04-16T23:27:27.538035330Z" level=info msg="CreateContainer within sandbox \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\""
Apr 16 23:27:27.539847 containerd[1656]: time="2026-04-16T23:27:27.539800614Z" level=info msg="StartContainer for \"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\""
Apr 16 23:27:27.540832 containerd[1656]: time="2026-04-16T23:27:27.540803657Z" level=info msg="connecting to shim 90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20" address="unix:///run/containerd/s/dde29e35f05d509134bcad93245f417c05a27854bad4c991013c05cf5bb9f722" protocol=ttrpc version=3
Apr 16 23:27:27.560942 systemd[1]: Started cri-containerd-90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20.scope - libcontainer container 90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20.
Apr 16 23:27:27.587680 containerd[1656]: time="2026-04-16T23:27:27.587638843Z" level=info msg="StartContainer for \"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\" returns successfully"
Apr 16 23:27:28.416427 kubelet[2901]: I0416 23:27:28.416342 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-d5vm6" podStartSLOduration=2.321160963 podStartE2EDuration="4.416326088s" podCreationTimestamp="2026-04-16 23:27:24 +0000 UTC" firstStartedPulling="2026-04-16 23:27:25.418821711 +0000 UTC m=+7.156556477" lastFinishedPulling="2026-04-16 23:27:27.513986836 +0000 UTC m=+9.251721602" observedRunningTime="2026-04-16 23:27:28.415552806 +0000 UTC m=+10.153287652" watchObservedRunningTime="2026-04-16 23:27:28.416326088 +0000 UTC m=+10.154060894"
Apr 16 23:27:32.915568 sudo[1958]: pam_unix(sudo:session): session closed for user root
Apr 16 23:27:32.932309 sshd[1957]: Connection closed by 50.85.169.122 port 46066
Apr 16 23:27:32.932209 sshd-session[1954]: pam_unix(sshd:session): session closed for user core
Apr 16 23:27:32.937162 systemd[1]: sshd@6-10.0.3.226:22-50.85.169.122:46066.service: Deactivated successfully.
Apr 16 23:27:32.939766 systemd[1]: session-7.scope: Deactivated successfully.
Apr 16 23:27:32.940048 systemd[1]: session-7.scope: Consumed 7.095s CPU time, 224.5M memory peak.
Apr 16 23:27:32.941184 systemd-logind[1635]: Session 7 logged out. Waiting for processes to exit.
Apr 16 23:27:32.942786 systemd-logind[1635]: Removed session 7.
Apr 16 23:27:37.411203 systemd[1]: Created slice kubepods-besteffort-pod17df201f_b3a6_453d_a92e_4099048b9563.slice - libcontainer container kubepods-besteffort-pod17df201f_b3a6_453d_a92e_4099048b9563.slice.
Apr 16 23:27:37.445607 kubelet[2901]: I0416 23:27:37.445548 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9xd8\" (UniqueName: \"kubernetes.io/projected/17df201f-b3a6-453d-a92e-4099048b9563-kube-api-access-x9xd8\") pod \"calico-typha-5bb7bc4596-ntj55\" (UID: \"17df201f-b3a6-453d-a92e-4099048b9563\") " pod="calico-system/calico-typha-5bb7bc4596-ntj55"
Apr 16 23:27:37.445607 kubelet[2901]: I0416 23:27:37.445600 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/17df201f-b3a6-453d-a92e-4099048b9563-typha-certs\") pod \"calico-typha-5bb7bc4596-ntj55\" (UID: \"17df201f-b3a6-453d-a92e-4099048b9563\") " pod="calico-system/calico-typha-5bb7bc4596-ntj55"
Apr 16 23:27:37.445607 kubelet[2901]: I0416 23:27:37.445620 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17df201f-b3a6-453d-a92e-4099048b9563-tigera-ca-bundle\") pod \"calico-typha-5bb7bc4596-ntj55\" (UID: \"17df201f-b3a6-453d-a92e-4099048b9563\") " pod="calico-system/calico-typha-5bb7bc4596-ntj55"
Apr 16 23:27:37.469705 systemd[1]: Created slice kubepods-besteffort-pod88ae8a2f_47bc_4918_83cf_688f0318b7f5.slice - libcontainer container kubepods-besteffort-pod88ae8a2f_47bc_4918_83cf_688f0318b7f5.slice.
Apr 16 23:27:37.546752 kubelet[2901]: I0416 23:27:37.546697 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ae8a2f-47bc-4918-83cf-688f0318b7f5-tigera-ca-bundle\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.546952 kubelet[2901]: I0416 23:27:37.546936 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-cni-bin-dir\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547018 kubelet[2901]: I0416 23:27:37.547007 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-var-lib-calico\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547087 kubelet[2901]: I0416 23:27:37.547076 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-var-run-calico\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547640 kubelet[2901]: I0416 23:27:37.547145 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-cni-net-dir\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547640 kubelet[2901]: I0416 23:27:37.547165 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-flexvol-driver-host\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547640 kubelet[2901]: I0416 23:27:37.547182 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-nodeproc\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547640 kubelet[2901]: I0416 23:27:37.547199 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/88ae8a2f-47bc-4918-83cf-688f0318b7f5-node-certs\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547640 kubelet[2901]: I0416 23:27:37.547214 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-sys-fs\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547829 kubelet[2901]: I0416 23:27:37.547230 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-lib-modules\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547829 kubelet[2901]: I0416 23:27:37.547243 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-policysync\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547829 kubelet[2901]: I0416 23:27:37.547265 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-xtables-lock\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547829 kubelet[2901]: I0416 23:27:37.547280 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24k2p\" (UniqueName: \"kubernetes.io/projected/88ae8a2f-47bc-4918-83cf-688f0318b7f5-kube-api-access-24k2p\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547829 kubelet[2901]: I0416 23:27:37.547298 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-cni-log-dir\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.547947 kubelet[2901]: I0416 23:27:37.547324 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/88ae8a2f-47bc-4918-83cf-688f0318b7f5-bpffs\") pod \"calico-node-z86qk\" (UID: \"88ae8a2f-47bc-4918-83cf-688f0318b7f5\") " pod="calico-system/calico-node-z86qk"
Apr 16 23:27:37.578306 kubelet[2901]: E0416 23:27:37.578257 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zhpzf" podUID="8cac77ff-82cd-4c79-8746-89081cc748b0"
Apr 16 23:27:37.648054 kubelet[2901]: I0416 23:27:37.647876 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cac77ff-82cd-4c79-8746-89081cc748b0-kubelet-dir\") pod \"csi-node-driver-zhpzf\" (UID: \"8cac77ff-82cd-4c79-8746-89081cc748b0\") " pod="calico-system/csi-node-driver-zhpzf"
Apr 16 23:27:37.648054 kubelet[2901]: I0416 23:27:37.647982 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cac77ff-82cd-4c79-8746-89081cc748b0-registration-dir\") pod \"csi-node-driver-zhpzf\" (UID: \"8cac77ff-82cd-4c79-8746-89081cc748b0\") " pod="calico-system/csi-node-driver-zhpzf"
Apr 16 23:27:37.648267 kubelet[2901]: I0416 23:27:37.648071 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plf2c\" (UniqueName: \"kubernetes.io/projected/8cac77ff-82cd-4c79-8746-89081cc748b0-kube-api-access-plf2c\") pod \"csi-node-driver-zhpzf\" (UID: \"8cac77ff-82cd-4c79-8746-89081cc748b0\") " pod="calico-system/csi-node-driver-zhpzf"
Apr 16 23:27:37.648662 kubelet[2901]: E0416 23:27:37.648628 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.648662 kubelet[2901]: W0416 23:27:37.648652 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.648815 kubelet[2901]: E0416 23:27:37.648670 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 23:27:37.648884 kubelet[2901]: E0416 23:27:37.648871 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.648915 kubelet[2901]: W0416 23:27:37.648883 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.648915 kubelet[2901]: E0416 23:27:37.648893 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 23:27:37.649111 kubelet[2901]: E0416 23:27:37.649099 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.649111 kubelet[2901]: W0416 23:27:37.649110 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.649176 kubelet[2901]: E0416 23:27:37.649120 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 23:27:37.649274 kubelet[2901]: E0416 23:27:37.649263 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.649303 kubelet[2901]: W0416 23:27:37.649273 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.649303 kubelet[2901]: E0416 23:27:37.649283 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 23:27:37.649345 kubelet[2901]: I0416 23:27:37.649305 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cac77ff-82cd-4c79-8746-89081cc748b0-socket-dir\") pod \"csi-node-driver-zhpzf\" (UID: \"8cac77ff-82cd-4c79-8746-89081cc748b0\") " pod="calico-system/csi-node-driver-zhpzf"
Apr 16 23:27:37.649941 kubelet[2901]: E0416 23:27:37.649856 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.649941 kubelet[2901]: W0416 23:27:37.649879 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.650821 kubelet[2901]: E0416 23:27:37.650759 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 23:27:37.651171 kubelet[2901]: E0416 23:27:37.651145 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.651622 kubelet[2901]: W0416 23:27:37.651180 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.651622 kubelet[2901]: E0416 23:27:37.651198 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 16 23:27:37.651622 kubelet[2901]: E0416 23:27:37.651431 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 16 23:27:37.651622 kubelet[2901]: W0416 23:27:37.651445 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 16 23:27:37.651622 kubelet[2901]: E0416 23:27:37.651457 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.651793 kubelet[2901]: E0416 23:27:37.651772 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.651793 kubelet[2901]: W0416 23:27:37.651790 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.651880 kubelet[2901]: E0416 23:27:37.651802 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.652231 kubelet[2901]: E0416 23:27:37.652167 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.652231 kubelet[2901]: W0416 23:27:37.652181 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.652363 kubelet[2901]: E0416 23:27:37.652192 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.652781 kubelet[2901]: E0416 23:27:37.652723 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.652781 kubelet[2901]: W0416 23:27:37.652778 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.652855 kubelet[2901]: E0416 23:27:37.652792 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.654263 kubelet[2901]: E0416 23:27:37.654076 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.654263 kubelet[2901]: W0416 23:27:37.654097 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.654263 kubelet[2901]: E0416 23:27:37.654119 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.654397 kubelet[2901]: E0416 23:27:37.654351 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.654397 kubelet[2901]: W0416 23:27:37.654367 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.654397 kubelet[2901]: E0416 23:27:37.654379 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.654682 kubelet[2901]: E0416 23:27:37.654668 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.654682 kubelet[2901]: W0416 23:27:37.654681 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.654781 kubelet[2901]: E0416 23:27:37.654690 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.654935 kubelet[2901]: E0416 23:27:37.654917 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.654968 kubelet[2901]: W0416 23:27:37.654934 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.654968 kubelet[2901]: E0416 23:27:37.654948 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.655473 kubelet[2901]: E0416 23:27:37.655457 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.655473 kubelet[2901]: W0416 23:27:37.655472 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.655549 kubelet[2901]: E0416 23:27:37.655483 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.656019 kubelet[2901]: E0416 23:27:37.655998 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.656019 kubelet[2901]: W0416 23:27:37.656015 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.656101 kubelet[2901]: E0416 23:27:37.656031 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.656807 kubelet[2901]: E0416 23:27:37.656784 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.656807 kubelet[2901]: W0416 23:27:37.656806 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.656909 kubelet[2901]: E0416 23:27:37.656827 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.657044 kubelet[2901]: E0416 23:27:37.657032 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.657044 kubelet[2901]: W0416 23:27:37.657044 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.657107 kubelet[2901]: E0416 23:27:37.657055 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.657202 kubelet[2901]: I0416 23:27:37.657177 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8cac77ff-82cd-4c79-8746-89081cc748b0-varrun\") pod \"csi-node-driver-zhpzf\" (UID: \"8cac77ff-82cd-4c79-8746-89081cc748b0\") " pod="calico-system/csi-node-driver-zhpzf" Apr 16 23:27:37.657273 kubelet[2901]: E0416 23:27:37.657252 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.657273 kubelet[2901]: W0416 23:27:37.657267 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.657328 kubelet[2901]: E0416 23:27:37.657277 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.657475 kubelet[2901]: E0416 23:27:37.657461 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.657475 kubelet[2901]: W0416 23:27:37.657473 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.657533 kubelet[2901]: E0416 23:27:37.657486 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.657710 kubelet[2901]: E0416 23:27:37.657695 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.657817 kubelet[2901]: W0416 23:27:37.657710 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.657817 kubelet[2901]: E0416 23:27:37.657722 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.658202 kubelet[2901]: E0416 23:27:37.658134 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.658202 kubelet[2901]: W0416 23:27:37.658157 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.658202 kubelet[2901]: E0416 23:27:37.658174 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.658398 kubelet[2901]: E0416 23:27:37.658366 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.658398 kubelet[2901]: W0416 23:27:37.658385 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.658398 kubelet[2901]: E0416 23:27:37.658397 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.658686 kubelet[2901]: E0416 23:27:37.658656 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.658686 kubelet[2901]: W0416 23:27:37.658671 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.658686 kubelet[2901]: E0416 23:27:37.658683 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.658885 kubelet[2901]: E0416 23:27:37.658864 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.658885 kubelet[2901]: W0416 23:27:37.658877 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.658885 kubelet[2901]: E0416 23:27:37.658887 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.659094 kubelet[2901]: E0416 23:27:37.659070 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.659094 kubelet[2901]: W0416 23:27:37.659083 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.659094 kubelet[2901]: E0416 23:27:37.659095 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.659284 kubelet[2901]: E0416 23:27:37.659266 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.659284 kubelet[2901]: W0416 23:27:37.659279 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.659344 kubelet[2901]: E0416 23:27:37.659291 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.659564 kubelet[2901]: E0416 23:27:37.659548 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.659564 kubelet[2901]: W0416 23:27:37.659561 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.659621 kubelet[2901]: E0416 23:27:37.659569 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.659766 kubelet[2901]: E0416 23:27:37.659754 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.659802 kubelet[2901]: W0416 23:27:37.659766 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.659802 kubelet[2901]: E0416 23:27:37.659775 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.659988 kubelet[2901]: E0416 23:27:37.659972 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.659988 kubelet[2901]: W0416 23:27:37.659986 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.660034 kubelet[2901]: E0416 23:27:37.659995 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.660229 kubelet[2901]: E0416 23:27:37.660214 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.660229 kubelet[2901]: W0416 23:27:37.660227 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.660285 kubelet[2901]: E0416 23:27:37.660235 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.660420 kubelet[2901]: E0416 23:27:37.660406 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.660420 kubelet[2901]: W0416 23:27:37.660419 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.660472 kubelet[2901]: E0416 23:27:37.660428 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.660618 kubelet[2901]: E0416 23:27:37.660603 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.660618 kubelet[2901]: W0416 23:27:37.660616 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.660663 kubelet[2901]: E0416 23:27:37.660625 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.660834 kubelet[2901]: E0416 23:27:37.660819 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.660834 kubelet[2901]: W0416 23:27:37.660832 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.660924 kubelet[2901]: E0416 23:27:37.660841 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.661004 kubelet[2901]: E0416 23:27:37.660988 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.661004 kubelet[2901]: W0416 23:27:37.661000 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.661055 kubelet[2901]: E0416 23:27:37.661009 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.661190 kubelet[2901]: E0416 23:27:37.661176 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.661190 kubelet[2901]: W0416 23:27:37.661187 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.661291 kubelet[2901]: E0416 23:27:37.661196 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.661356 kubelet[2901]: E0416 23:27:37.661342 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.661356 kubelet[2901]: W0416 23:27:37.661354 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.661405 kubelet[2901]: E0416 23:27:37.661365 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.661575 kubelet[2901]: E0416 23:27:37.661560 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.661575 kubelet[2901]: W0416 23:27:37.661573 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.661626 kubelet[2901]: E0416 23:27:37.661582 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.661765 kubelet[2901]: E0416 23:27:37.661749 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.661765 kubelet[2901]: W0416 23:27:37.661764 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.661829 kubelet[2901]: E0416 23:27:37.661773 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.662004 kubelet[2901]: E0416 23:27:37.661989 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.662004 kubelet[2901]: W0416 23:27:37.662003 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.662078 kubelet[2901]: E0416 23:27:37.662012 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.662185 kubelet[2901]: E0416 23:27:37.662171 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.662185 kubelet[2901]: W0416 23:27:37.662184 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.662275 kubelet[2901]: E0416 23:27:37.662193 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.662380 kubelet[2901]: E0416 23:27:37.662368 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.662380 kubelet[2901]: W0416 23:27:37.662379 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.662430 kubelet[2901]: E0416 23:27:37.662387 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.662568 kubelet[2901]: E0416 23:27:37.662553 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.662568 kubelet[2901]: W0416 23:27:37.662567 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.662618 kubelet[2901]: E0416 23:27:37.662576 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.662802 kubelet[2901]: E0416 23:27:37.662787 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.662802 kubelet[2901]: W0416 23:27:37.662799 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.662867 kubelet[2901]: E0416 23:27:37.662809 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.662993 kubelet[2901]: E0416 23:27:37.662980 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.662993 kubelet[2901]: W0416 23:27:37.662991 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.663043 kubelet[2901]: E0416 23:27:37.663000 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.664155 kubelet[2901]: E0416 23:27:37.663750 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.664155 kubelet[2901]: W0416 23:27:37.663769 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.664155 kubelet[2901]: E0416 23:27:37.663781 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.664361 kubelet[2901]: E0416 23:27:37.664342 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.664420 kubelet[2901]: W0416 23:27:37.664409 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.664472 kubelet[2901]: E0416 23:27:37.664462 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.719384 containerd[1656]: time="2026-04-16T23:27:37.719341646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bb7bc4596-ntj55,Uid:17df201f-b3a6-453d-a92e-4099048b9563,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:37.743340 containerd[1656]: time="2026-04-16T23:27:37.743255780Z" level=info msg="connecting to shim 4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12" address="unix:///run/containerd/s/00269e93a2856f7934be2a1abdfc36432794d3a42762c3cb53bfd8dc5c730fb5" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:37.761479 kubelet[2901]: E0416 23:27:37.761453 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.761800 kubelet[2901]: W0416 23:27:37.761513 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.761800 kubelet[2901]: E0416 23:27:37.761534 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.761934 kubelet[2901]: E0416 23:27:37.761919 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.761989 kubelet[2901]: W0416 23:27:37.761978 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.762055 kubelet[2901]: E0416 23:27:37.762043 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.762343 kubelet[2901]: E0416 23:27:37.762299 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.762343 kubelet[2901]: W0416 23:27:37.762317 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.762343 kubelet[2901]: E0416 23:27:37.762333 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.762504 kubelet[2901]: E0416 23:27:37.762491 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.762504 kubelet[2901]: W0416 23:27:37.762500 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.762568 kubelet[2901]: E0416 23:27:37.762508 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.762650 kubelet[2901]: E0416 23:27:37.762638 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.762650 kubelet[2901]: W0416 23:27:37.762647 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.762700 kubelet[2901]: E0416 23:27:37.762655 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.762869 kubelet[2901]: E0416 23:27:37.762853 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.762901 kubelet[2901]: W0416 23:27:37.762871 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.762901 kubelet[2901]: E0416 23:27:37.762879 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.762917 systemd[1]: Started cri-containerd-4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12.scope - libcontainer container 4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12. 
Apr 16 23:27:37.763654 kubelet[2901]: E0416 23:27:37.763637 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.763654 kubelet[2901]: W0416 23:27:37.763651 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.763850 kubelet[2901]: E0416 23:27:37.763662 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.764202 kubelet[2901]: E0416 23:27:37.764074 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.764202 kubelet[2901]: W0416 23:27:37.764091 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.764202 kubelet[2901]: E0416 23:27:37.764104 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.764377 kubelet[2901]: E0416 23:27:37.764364 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.764434 kubelet[2901]: W0416 23:27:37.764423 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.764485 kubelet[2901]: E0416 23:27:37.764476 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.764910 kubelet[2901]: E0416 23:27:37.764800 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.764910 kubelet[2901]: W0416 23:27:37.764815 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.764910 kubelet[2901]: E0416 23:27:37.764827 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.765214 kubelet[2901]: E0416 23:27:37.765198 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.765417 kubelet[2901]: W0416 23:27:37.765278 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.765417 kubelet[2901]: E0416 23:27:37.765302 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.765655 kubelet[2901]: E0416 23:27:37.765641 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.765719 kubelet[2901]: W0416 23:27:37.765706 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.765930 kubelet[2901]: E0416 23:27:37.765769 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.766109 kubelet[2901]: E0416 23:27:37.766069 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.766177 kubelet[2901]: W0416 23:27:37.766164 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.766243 kubelet[2901]: E0416 23:27:37.766230 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.767915 kubelet[2901]: E0416 23:27:37.767879 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.767915 kubelet[2901]: W0416 23:27:37.767903 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.768047 kubelet[2901]: E0416 23:27:37.768022 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.768449 kubelet[2901]: E0416 23:27:37.768329 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.768449 kubelet[2901]: W0416 23:27:37.768346 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.768449 kubelet[2901]: E0416 23:27:37.768358 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.768711 kubelet[2901]: E0416 23:27:37.768695 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.769097 kubelet[2901]: W0416 23:27:37.768831 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.769097 kubelet[2901]: E0416 23:27:37.768852 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.769255 kubelet[2901]: E0416 23:27:37.769238 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.769333 kubelet[2901]: W0416 23:27:37.769312 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.769333 kubelet[2901]: E0416 23:27:37.769358 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.769681 kubelet[2901]: E0416 23:27:37.769665 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.769768 kubelet[2901]: W0416 23:27:37.769754 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.769821 kubelet[2901]: E0416 23:27:37.769811 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.770473 kubelet[2901]: E0416 23:27:37.770449 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.770693 kubelet[2901]: W0416 23:27:37.770553 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.770693 kubelet[2901]: E0416 23:27:37.770578 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.771156 kubelet[2901]: E0416 23:27:37.771073 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.771156 kubelet[2901]: W0416 23:27:37.771090 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.771156 kubelet[2901]: E0416 23:27:37.771102 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.771577 kubelet[2901]: E0416 23:27:37.771413 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.771577 kubelet[2901]: W0416 23:27:37.771426 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.771577 kubelet[2901]: E0416 23:27:37.771436 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.771780 kubelet[2901]: E0416 23:27:37.771764 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.771850 kubelet[2901]: W0416 23:27:37.771838 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.771915 kubelet[2901]: E0416 23:27:37.771904 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.772472 kubelet[2901]: E0416 23:27:37.772256 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.772659 kubelet[2901]: W0416 23:27:37.772567 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.772813 kubelet[2901]: E0416 23:27:37.772769 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.773565 kubelet[2901]: E0416 23:27:37.773279 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.773565 kubelet[2901]: W0416 23:27:37.773296 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.773565 kubelet[2901]: E0416 23:27:37.773309 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.774225 kubelet[2901]: E0416 23:27:37.774207 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.774257 kubelet[2901]: W0416 23:27:37.774225 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.774257 kubelet[2901]: E0416 23:27:37.774238 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:37.775219 containerd[1656]: time="2026-04-16T23:27:37.775058573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z86qk,Uid:88ae8a2f-47bc-4918-83cf-688f0318b7f5,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:37.779440 kubelet[2901]: E0416 23:27:37.779365 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:37.779440 kubelet[2901]: W0416 23:27:37.779383 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:37.779440 kubelet[2901]: E0416 23:27:37.779398 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:37.802617 containerd[1656]: time="2026-04-16T23:27:37.802572035Z" level=info msg="connecting to shim 01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5" address="unix:///run/containerd/s/3568c88b46408aaa674822b1a3a8490759c49fc014618ebdfaeaa674b0be4801" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:37.807163 containerd[1656]: time="2026-04-16T23:27:37.807103525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bb7bc4596-ntj55,Uid:17df201f-b3a6-453d-a92e-4099048b9563,Namespace:calico-system,Attempt:0,} returns sandbox id \"4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12\"" Apr 16 23:27:37.809220 containerd[1656]: time="2026-04-16T23:27:37.809140370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 16 23:27:37.824983 systemd[1]: Started cri-containerd-01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5.scope - libcontainer container 01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5. 
Apr 16 23:27:37.848858 containerd[1656]: time="2026-04-16T23:27:37.848809900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z86qk,Uid:88ae8a2f-47bc-4918-83cf-688f0318b7f5,Namespace:calico-system,Attempt:0,} returns sandbox id \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\"" Apr 16 23:27:39.104138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3390544869.mount: Deactivated successfully. Apr 16 23:27:39.364903 kubelet[2901]: E0416 23:27:39.364782 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zhpzf" podUID="8cac77ff-82cd-4c79-8746-89081cc748b0" Apr 16 23:27:39.501990 containerd[1656]: time="2026-04-16T23:27:39.501929700Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:39.503439 containerd[1656]: time="2026-04-16T23:27:39.503401063Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Apr 16 23:27:39.505626 containerd[1656]: time="2026-04-16T23:27:39.505586108Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:39.508896 containerd[1656]: time="2026-04-16T23:27:39.508854756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:39.509850 containerd[1656]: time="2026-04-16T23:27:39.509817318Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 1.700636908s" Apr 16 23:27:39.509850 containerd[1656]: time="2026-04-16T23:27:39.509847878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Apr 16 23:27:39.510937 containerd[1656]: time="2026-04-16T23:27:39.510804640Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Apr 16 23:27:39.520256 containerd[1656]: time="2026-04-16T23:27:39.520188582Z" level=info msg="CreateContainer within sandbox \"4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Apr 16 23:27:39.531287 containerd[1656]: time="2026-04-16T23:27:39.531239007Z" level=info msg="Container a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:39.538575 containerd[1656]: time="2026-04-16T23:27:39.538452943Z" level=info msg="CreateContainer within sandbox \"4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c\"" Apr 16 23:27:39.539078 containerd[1656]: time="2026-04-16T23:27:39.539040304Z" level=info msg="StartContainer for \"a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c\"" Apr 16 23:27:39.540558 containerd[1656]: time="2026-04-16T23:27:39.540524868Z" level=info msg="connecting to shim a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c" address="unix:///run/containerd/s/00269e93a2856f7934be2a1abdfc36432794d3a42762c3cb53bfd8dc5c730fb5" protocol=ttrpc version=3
Apr 16 23:27:39.561035 systemd[1]: Started cri-containerd-a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c.scope - libcontainer container a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c. Apr 16 23:27:39.597025 containerd[1656]: time="2026-04-16T23:27:39.596990996Z" level=info msg="StartContainer for \"a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c\" returns successfully" Apr 16 23:27:40.447280 kubelet[2901]: I0416 23:27:40.447209 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5bb7bc4596-ntj55" podStartSLOduration=1.7450772589999999 podStartE2EDuration="3.44719253s" podCreationTimestamp="2026-04-16 23:27:37 +0000 UTC" firstStartedPulling="2026-04-16 23:27:37.808527449 +0000 UTC m=+19.546262215" lastFinishedPulling="2026-04-16 23:27:39.51064272 +0000 UTC m=+21.248377486" observedRunningTime="2026-04-16 23:27:40.446811449 +0000 UTC m=+22.184546215" watchObservedRunningTime="2026-04-16 23:27:40.44719253 +0000 UTC m=+22.184927296" Apr 16 23:27:40.458096 kubelet[2901]: E0416 23:27:40.458018 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.458096 kubelet[2901]: W0416 23:27:40.458039 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.458096 kubelet[2901]: E0416 23:27:40.458057 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.458552 kubelet[2901]: E0416 23:27:40.458514 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.458666 kubelet[2901]: W0416 23:27:40.458528 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.458666 kubelet[2901]: E0416 23:27:40.458630 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.458950 kubelet[2901]: E0416 23:27:40.458891 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.458950 kubelet[2901]: W0416 23:27:40.458904 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.458950 kubelet[2901]: E0416 23:27:40.458913 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.459221 kubelet[2901]: E0416 23:27:40.459203 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.459297 kubelet[2901]: W0416 23:27:40.459275 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.459352 kubelet[2901]: E0416 23:27:40.459342 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.459656 kubelet[2901]: E0416 23:27:40.459605 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.459656 kubelet[2901]: W0416 23:27:40.459617 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.459656 kubelet[2901]: E0416 23:27:40.459626 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.460021 kubelet[2901]: E0416 23:27:40.459954 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.460021 kubelet[2901]: W0416 23:27:40.459976 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.460021 kubelet[2901]: E0416 23:27:40.459987 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.460336 kubelet[2901]: E0416 23:27:40.460264 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.460336 kubelet[2901]: W0416 23:27:40.460289 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.460336 kubelet[2901]: E0416 23:27:40.460299 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.460571 kubelet[2901]: E0416 23:27:40.460560 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.460695 kubelet[2901]: W0416 23:27:40.460632 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.460695 kubelet[2901]: E0416 23:27:40.460647 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.460955 kubelet[2901]: E0416 23:27:40.460938 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.461014 kubelet[2901]: W0416 23:27:40.461004 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.461063 kubelet[2901]: E0416 23:27:40.461055 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.461306 kubelet[2901]: E0416 23:27:40.461254 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.461306 kubelet[2901]: W0416 23:27:40.461265 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.461306 kubelet[2901]: E0416 23:27:40.461274 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.461547 kubelet[2901]: E0416 23:27:40.461529 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.461607 kubelet[2901]: W0416 23:27:40.461597 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.461661 kubelet[2901]: E0416 23:27:40.461651 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.461855 kubelet[2901]: E0416 23:27:40.461844 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.461914 kubelet[2901]: W0416 23:27:40.461903 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.461972 kubelet[2901]: E0416 23:27:40.461962 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.462224 kubelet[2901]: E0416 23:27:40.462213 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.462284 kubelet[2901]: W0416 23:27:40.462273 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.462330 kubelet[2901]: E0416 23:27:40.462321 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.462529 kubelet[2901]: E0416 23:27:40.462519 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.462632 kubelet[2901]: W0416 23:27:40.462582 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.462632 kubelet[2901]: E0416 23:27:40.462595 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.462862 kubelet[2901]: E0416 23:27:40.462851 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.462988 kubelet[2901]: W0416 23:27:40.462894 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.462988 kubelet[2901]: E0416 23:27:40.462906 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.487914 kubelet[2901]: E0416 23:27:40.487872 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.487914 kubelet[2901]: W0416 23:27:40.487898 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.487914 kubelet[2901]: E0416 23:27:40.487918 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.488221 kubelet[2901]: E0416 23:27:40.488189 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.488221 kubelet[2901]: W0416 23:27:40.488202 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.488221 kubelet[2901]: E0416 23:27:40.488212 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.488424 kubelet[2901]: E0416 23:27:40.488412 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.488424 kubelet[2901]: W0416 23:27:40.488423 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.488474 kubelet[2901]: E0416 23:27:40.488433 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.488646 kubelet[2901]: E0416 23:27:40.488632 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.488646 kubelet[2901]: W0416 23:27:40.488643 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.488691 kubelet[2901]: E0416 23:27:40.488653 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.488810 kubelet[2901]: E0416 23:27:40.488798 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.488810 kubelet[2901]: W0416 23:27:40.488808 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.488862 kubelet[2901]: E0416 23:27:40.488817 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.488958 kubelet[2901]: E0416 23:27:40.488946 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.488958 kubelet[2901]: W0416 23:27:40.488956 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.488996 kubelet[2901]: E0416 23:27:40.488964 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.489142 kubelet[2901]: E0416 23:27:40.489131 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.489168 kubelet[2901]: W0416 23:27:40.489143 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.489168 kubelet[2901]: E0416 23:27:40.489152 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.489454 kubelet[2901]: E0416 23:27:40.489418 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.489454 kubelet[2901]: W0416 23:27:40.489439 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.489515 kubelet[2901]: E0416 23:27:40.489453 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.489617 kubelet[2901]: E0416 23:27:40.489604 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.489644 kubelet[2901]: W0416 23:27:40.489616 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.489644 kubelet[2901]: E0416 23:27:40.489641 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.489853 kubelet[2901]: E0416 23:27:40.489840 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.489853 kubelet[2901]: W0416 23:27:40.489852 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.489929 kubelet[2901]: E0416 23:27:40.489863 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.490039 kubelet[2901]: E0416 23:27:40.490026 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.490039 kubelet[2901]: W0416 23:27:40.490037 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.490093 kubelet[2901]: E0416 23:27:40.490046 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.490279 kubelet[2901]: E0416 23:27:40.490264 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.490279 kubelet[2901]: W0416 23:27:40.490276 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.490344 kubelet[2901]: E0416 23:27:40.490286 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.490466 kubelet[2901]: E0416 23:27:40.490455 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.490466 kubelet[2901]: W0416 23:27:40.490465 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.490515 kubelet[2901]: E0416 23:27:40.490474 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.490901 kubelet[2901]: E0416 23:27:40.490782 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.490901 kubelet[2901]: W0416 23:27:40.490799 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.490901 kubelet[2901]: E0416 23:27:40.490812 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.491076 kubelet[2901]: E0416 23:27:40.491062 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.491126 kubelet[2901]: W0416 23:27:40.491116 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.491178 kubelet[2901]: E0416 23:27:40.491168 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.491595 kubelet[2901]: E0416 23:27:40.491413 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.491595 kubelet[2901]: W0416 23:27:40.491425 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.491595 kubelet[2901]: E0416 23:27:40.491439 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.491713 kubelet[2901]: E0416 23:27:40.491687 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.491713 kubelet[2901]: W0416 23:27:40.491705 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.491774 kubelet[2901]: E0416 23:27:40.491755 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 16 23:27:40.492065 kubelet[2901]: E0416 23:27:40.492048 2901 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 16 23:27:40.492065 kubelet[2901]: W0416 23:27:40.492064 2901 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 16 23:27:40.492129 kubelet[2901]: E0416 23:27:40.492077 2901 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 16 23:27:40.797269 containerd[1656]: time="2026-04-16T23:27:40.797155806Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:40.800064 containerd[1656]: time="2026-04-16T23:27:40.798808850Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Apr 16 23:27:40.805111 containerd[1656]: time="2026-04-16T23:27:40.804839143Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:40.808750 containerd[1656]: time="2026-04-16T23:27:40.808676192Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:40.809780 containerd[1656]: time="2026-04-16T23:27:40.809495234Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.298656594s" Apr 16 23:27:40.809780 containerd[1656]: time="2026-04-16T23:27:40.809581114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Apr 16 23:27:40.816210 containerd[1656]: time="2026-04-16T23:27:40.816170529Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 16 23:27:40.828762 containerd[1656]: time="2026-04-16T23:27:40.826231872Z" level=info msg="Container 49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:40.846061 containerd[1656]: time="2026-04-16T23:27:40.845999197Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651\"" Apr 16 23:27:40.846970 containerd[1656]: time="2026-04-16T23:27:40.846932159Z" level=info msg="StartContainer for \"49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651\"" Apr 16 23:27:40.848923 containerd[1656]: time="2026-04-16T23:27:40.848889724Z" level=info msg="connecting to shim 49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651" address="unix:///run/containerd/s/3568c88b46408aaa674822b1a3a8490759c49fc014618ebdfaeaa674b0be4801" protocol=ttrpc version=3 Apr 16 23:27:40.869010 systemd[1]: Started cri-containerd-49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651.scope - libcontainer container 49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651. Apr 16 23:27:40.934560 containerd[1656]: time="2026-04-16T23:27:40.934516158Z" level=info msg="StartContainer for \"49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651\" returns successfully" Apr 16 23:27:40.944713 systemd[1]: cri-containerd-49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651.scope: Deactivated successfully. 
Apr 16 23:27:40.947641 containerd[1656]: time="2026-04-16T23:27:40.947594468Z" level=info msg="received container exit event container_id:\"49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651\" id:\"49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651\" pid:3590 exited_at:{seconds:1776382060 nanos:947249787}" Apr 16 23:27:40.972351 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651-rootfs.mount: Deactivated successfully. Apr 16 23:27:41.364277 kubelet[2901]: E0416 23:27:41.364216 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zhpzf" podUID="8cac77ff-82cd-4c79-8746-89081cc748b0" Apr 16 23:27:41.438496 containerd[1656]: time="2026-04-16T23:27:41.438450184Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 16 23:27:43.363831 kubelet[2901]: E0416 23:27:43.363775 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zhpzf" podUID="8cac77ff-82cd-4c79-8746-89081cc748b0" Apr 16 23:27:44.972808 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount841933420.mount: Deactivated successfully. 
Apr 16 23:27:45.004597 containerd[1656]: time="2026-04-16T23:27:45.004532015Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:45.006482 containerd[1656]: time="2026-04-16T23:27:45.006384699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Apr 16 23:27:45.008569 containerd[1656]: time="2026-04-16T23:27:45.008507144Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:45.011581 containerd[1656]: time="2026-04-16T23:27:45.011520151Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:45.012581 containerd[1656]: time="2026-04-16T23:27:45.012107512Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 3.573583567s" Apr 16 23:27:45.012581 containerd[1656]: time="2026-04-16T23:27:45.012137912Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Apr 16 23:27:45.017964 containerd[1656]: time="2026-04-16T23:27:45.017930766Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Apr 16 23:27:45.031765 containerd[1656]: time="2026-04-16T23:27:45.031699397Z" level=info msg="Container 
bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:45.044376 containerd[1656]: time="2026-04-16T23:27:45.044317866Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63\"" Apr 16 23:27:45.046033 containerd[1656]: time="2026-04-16T23:27:45.045836469Z" level=info msg="StartContainer for \"bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63\"" Apr 16 23:27:45.048361 containerd[1656]: time="2026-04-16T23:27:45.048325075Z" level=info msg="connecting to shim bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63" address="unix:///run/containerd/s/3568c88b46408aaa674822b1a3a8490759c49fc014618ebdfaeaa674b0be4801" protocol=ttrpc version=3 Apr 16 23:27:45.069901 systemd[1]: Started cri-containerd-bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63.scope - libcontainer container bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63. Apr 16 23:27:45.137265 containerd[1656]: time="2026-04-16T23:27:45.137136117Z" level=info msg="StartContainer for \"bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63\" returns successfully" Apr 16 23:27:45.239704 systemd[1]: cri-containerd-bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63.scope: Deactivated successfully. 
Apr 16 23:27:45.241649 containerd[1656]: time="2026-04-16T23:27:45.241611514Z" level=info msg="received container exit event container_id:\"bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63\" id:\"bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63\" pid:3646 exited_at:{seconds:1776382065 nanos:241390474}" Apr 16 23:27:45.259954 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63-rootfs.mount: Deactivated successfully. Apr 16 23:27:45.364703 kubelet[2901]: E0416 23:27:45.364502 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zhpzf" podUID="8cac77ff-82cd-4c79-8746-89081cc748b0" Apr 16 23:27:45.452305 containerd[1656]: time="2026-04-16T23:27:45.452265273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Apr 16 23:27:47.364162 kubelet[2901]: E0416 23:27:47.364111 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zhpzf" podUID="8cac77ff-82cd-4c79-8746-89081cc748b0" Apr 16 23:27:47.606955 containerd[1656]: time="2026-04-16T23:27:47.606370293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:47.607705 containerd[1656]: time="2026-04-16T23:27:47.607654416Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Apr 16 23:27:47.609195 containerd[1656]: time="2026-04-16T23:27:47.609163419Z" level=info msg="ImageCreate event 
name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:47.611727 containerd[1656]: time="2026-04-16T23:27:47.611694105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:47.612440 containerd[1656]: time="2026-04-16T23:27:47.612400746Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.160092713s" Apr 16 23:27:47.612440 containerd[1656]: time="2026-04-16T23:27:47.612436186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Apr 16 23:27:47.620052 containerd[1656]: time="2026-04-16T23:27:47.619894523Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Apr 16 23:27:47.635691 containerd[1656]: time="2026-04-16T23:27:47.635639559Z" level=info msg="Container 9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:47.648311 containerd[1656]: time="2026-04-16T23:27:47.648256988Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a\"" Apr 16 23:27:47.648944 containerd[1656]: time="2026-04-16T23:27:47.648900789Z" 
level=info msg="StartContainer for \"9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a\"" Apr 16 23:27:47.650667 containerd[1656]: time="2026-04-16T23:27:47.650628153Z" level=info msg="connecting to shim 9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a" address="unix:///run/containerd/s/3568c88b46408aaa674822b1a3a8490759c49fc014618ebdfaeaa674b0be4801" protocol=ttrpc version=3 Apr 16 23:27:47.670080 systemd[1]: Started cri-containerd-9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a.scope - libcontainer container 9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a. Apr 16 23:27:47.731464 containerd[1656]: time="2026-04-16T23:27:47.731417057Z" level=info msg="StartContainer for \"9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a\" returns successfully" Apr 16 23:27:48.140254 containerd[1656]: time="2026-04-16T23:27:48.140197747Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 16 23:27:48.142117 systemd[1]: cri-containerd-9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a.scope: Deactivated successfully. Apr 16 23:27:48.142721 systemd[1]: cri-containerd-9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a.scope: Consumed 507ms CPU time, 192.7M memory peak, 171.3M written to disk. 
Apr 16 23:27:48.145323 containerd[1656]: time="2026-04-16T23:27:48.145267798Z" level=info msg="received container exit event container_id:\"9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a\" id:\"9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a\" pid:3705 exited_at:{seconds:1776382068 nanos:145064398}" Apr 16 23:27:48.163564 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a-rootfs.mount: Deactivated successfully. Apr 16 23:27:48.196124 kubelet[2901]: I0416 23:27:48.196067 2901 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Apr 16 23:27:48.251673 systemd[1]: Created slice kubepods-burstable-pod169c0663_4b7e_43a8_8244_c185c534f7be.slice - libcontainer container kubepods-burstable-pod169c0663_4b7e_43a8_8244_c185c534f7be.slice. Apr 16 23:27:48.258630 systemd[1]: Created slice kubepods-besteffort-pod50b472ca_47d9_4e73_b333_2bd45cc28f36.slice - libcontainer container kubepods-besteffort-pod50b472ca_47d9_4e73_b333_2bd45cc28f36.slice. Apr 16 23:27:48.266948 systemd[1]: Created slice kubepods-burstable-pod828b6488_b286_419f_a0a8_9005759cd92d.slice - libcontainer container kubepods-burstable-pod828b6488_b286_419f_a0a8_9005759cd92d.slice. Apr 16 23:27:48.275505 systemd[1]: Created slice kubepods-besteffort-pod8ee87be8_da6a_4477_8f26_c0de20ec6969.slice - libcontainer container kubepods-besteffort-pod8ee87be8_da6a_4477_8f26_c0de20ec6969.slice. Apr 16 23:27:48.282875 systemd[1]: Created slice kubepods-besteffort-pod124108c4_421d_4c57_8b62_11516c0f88f5.slice - libcontainer container kubepods-besteffort-pod124108c4_421d_4c57_8b62_11516c0f88f5.slice. Apr 16 23:27:48.290291 systemd[1]: Created slice kubepods-besteffort-poddb7bac57_1d40_4d74_a39c_beae008d27e7.slice - libcontainer container kubepods-besteffort-poddb7bac57_1d40_4d74_a39c_beae008d27e7.slice. 
Apr 16 23:27:48.294913 systemd[1]: Created slice kubepods-besteffort-pod2ce7a087_c66f_4222_b574_7deffd6b3781.slice - libcontainer container kubepods-besteffort-pod2ce7a087_c66f_4222_b574_7deffd6b3781.slice. Apr 16 23:27:48.343510 kubelet[2901]: I0416 23:27:48.343469 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-ca-bundle\") pod \"whisker-75565fcbc4-qbr5q\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " pod="calico-system/whisker-75565fcbc4-qbr5q" Apr 16 23:27:48.344006 kubelet[2901]: I0416 23:27:48.343551 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt2wv\" (UniqueName: \"kubernetes.io/projected/169c0663-4b7e-43a8-8244-c185c534f7be-kube-api-access-tt2wv\") pod \"coredns-66bc5c9577-pm682\" (UID: \"169c0663-4b7e-43a8-8244-c185c534f7be\") " pod="kube-system/coredns-66bc5c9577-pm682" Apr 16 23:27:48.344006 kubelet[2901]: I0416 23:27:48.343973 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b472ca-47d9-4e73-b333-2bd45cc28f36-tigera-ca-bundle\") pod \"calico-kube-controllers-56489995f-wm9gf\" (UID: \"50b472ca-47d9-4e73-b333-2bd45cc28f36\") " pod="calico-system/calico-kube-controllers-56489995f-wm9gf" Apr 16 23:27:48.344006 kubelet[2901]: I0416 23:27:48.344001 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7bac57-1d40-4d74-a39c-beae008d27e7-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-lx7nq\" (UID: \"db7bac57-1d40-4d74-a39c-beae008d27e7\") " pod="calico-system/goldmane-cccfbd5cf-lx7nq" Apr 16 23:27:48.344139 kubelet[2901]: I0416 23:27:48.344062 2901 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx56x\" (UniqueName: \"kubernetes.io/projected/50b472ca-47d9-4e73-b333-2bd45cc28f36-kube-api-access-xx56x\") pod \"calico-kube-controllers-56489995f-wm9gf\" (UID: \"50b472ca-47d9-4e73-b333-2bd45cc28f36\") " pod="calico-system/calico-kube-controllers-56489995f-wm9gf" Apr 16 23:27:48.344139 kubelet[2901]: I0416 23:27:48.344107 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzvt\" (UniqueName: \"kubernetes.io/projected/828b6488-b286-419f-a0a8-9005759cd92d-kube-api-access-twzvt\") pod \"coredns-66bc5c9577-sl2t6\" (UID: \"828b6488-b286-419f-a0a8-9005759cd92d\") " pod="kube-system/coredns-66bc5c9577-sl2t6" Apr 16 23:27:48.344184 kubelet[2901]: I0416 23:27:48.344162 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db7bac57-1d40-4d74-a39c-beae008d27e7-config\") pod \"goldmane-cccfbd5cf-lx7nq\" (UID: \"db7bac57-1d40-4d74-a39c-beae008d27e7\") " pod="calico-system/goldmane-cccfbd5cf-lx7nq" Apr 16 23:27:48.344240 kubelet[2901]: I0416 23:27:48.344202 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-backend-key-pair\") pod \"whisker-75565fcbc4-qbr5q\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " pod="calico-system/whisker-75565fcbc4-qbr5q" Apr 16 23:27:48.344271 kubelet[2901]: I0416 23:27:48.344250 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8ee87be8-da6a-4477-8f26-c0de20ec6969-calico-apiserver-certs\") pod \"calico-apiserver-5bf56bfb56-hzvn2\" (UID: \"8ee87be8-da6a-4477-8f26-c0de20ec6969\") " 
pod="calico-system/calico-apiserver-5bf56bfb56-hzvn2" Apr 16 23:27:48.344328 kubelet[2901]: I0416 23:27:48.344305 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdlzh\" (UniqueName: \"kubernetes.io/projected/8ee87be8-da6a-4477-8f26-c0de20ec6969-kube-api-access-xdlzh\") pod \"calico-apiserver-5bf56bfb56-hzvn2\" (UID: \"8ee87be8-da6a-4477-8f26-c0de20ec6969\") " pod="calico-system/calico-apiserver-5bf56bfb56-hzvn2" Apr 16 23:27:48.344362 kubelet[2901]: I0416 23:27:48.344347 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/828b6488-b286-419f-a0a8-9005759cd92d-config-volume\") pod \"coredns-66bc5c9577-sl2t6\" (UID: \"828b6488-b286-419f-a0a8-9005759cd92d\") " pod="kube-system/coredns-66bc5c9577-sl2t6" Apr 16 23:27:48.344396 kubelet[2901]: I0416 23:27:48.344383 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/124108c4-421d-4c57-8b62-11516c0f88f5-calico-apiserver-certs\") pod \"calico-apiserver-5bf56bfb56-gl7td\" (UID: \"124108c4-421d-4c57-8b62-11516c0f88f5\") " pod="calico-system/calico-apiserver-5bf56bfb56-gl7td" Apr 16 23:27:48.344426 kubelet[2901]: I0416 23:27:48.344414 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh5f5\" (UniqueName: \"kubernetes.io/projected/124108c4-421d-4c57-8b62-11516c0f88f5-kube-api-access-rh5f5\") pod \"calico-apiserver-5bf56bfb56-gl7td\" (UID: \"124108c4-421d-4c57-8b62-11516c0f88f5\") " pod="calico-system/calico-apiserver-5bf56bfb56-gl7td" Apr 16 23:27:48.344491 kubelet[2901]: I0416 23:27:48.344476 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/db7bac57-1d40-4d74-a39c-beae008d27e7-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-lx7nq\" (UID: \"db7bac57-1d40-4d74-a39c-beae008d27e7\") " pod="calico-system/goldmane-cccfbd5cf-lx7nq" Apr 16 23:27:48.344520 kubelet[2901]: I0416 23:27:48.344508 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/169c0663-4b7e-43a8-8244-c185c534f7be-config-volume\") pod \"coredns-66bc5c9577-pm682\" (UID: \"169c0663-4b7e-43a8-8244-c185c534f7be\") " pod="kube-system/coredns-66bc5c9577-pm682" Apr 16 23:27:48.344590 kubelet[2901]: I0416 23:27:48.344575 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hq7j\" (UniqueName: \"kubernetes.io/projected/db7bac57-1d40-4d74-a39c-beae008d27e7-kube-api-access-6hq7j\") pod \"goldmane-cccfbd5cf-lx7nq\" (UID: \"db7bac57-1d40-4d74-a39c-beae008d27e7\") " pod="calico-system/goldmane-cccfbd5cf-lx7nq" Apr 16 23:27:48.344616 kubelet[2901]: I0416 23:27:48.344596 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-nginx-config\") pod \"whisker-75565fcbc4-qbr5q\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " pod="calico-system/whisker-75565fcbc4-qbr5q" Apr 16 23:27:48.344651 kubelet[2901]: I0416 23:27:48.344620 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhn4\" (UniqueName: \"kubernetes.io/projected/2ce7a087-c66f-4222-b574-7deffd6b3781-kube-api-access-dnhn4\") pod \"whisker-75565fcbc4-qbr5q\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " pod="calico-system/whisker-75565fcbc4-qbr5q" Apr 16 23:27:48.487405 containerd[1656]: time="2026-04-16T23:27:48.487358216Z" level=info msg="CreateContainer within sandbox 
\"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Apr 16 23:27:48.497114 containerd[1656]: time="2026-04-16T23:27:48.497069038Z" level=info msg="Container 22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:48.508811 containerd[1656]: time="2026-04-16T23:27:48.508749145Z" level=info msg="CreateContainer within sandbox \"01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24\"" Apr 16 23:27:48.509342 containerd[1656]: time="2026-04-16T23:27:48.509309426Z" level=info msg="StartContainer for \"22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24\"" Apr 16 23:27:48.511023 containerd[1656]: time="2026-04-16T23:27:48.510994750Z" level=info msg="connecting to shim 22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24" address="unix:///run/containerd/s/3568c88b46408aaa674822b1a3a8490759c49fc014618ebdfaeaa674b0be4801" protocol=ttrpc version=3 Apr 16 23:27:48.534000 systemd[1]: Started cri-containerd-22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24.scope - libcontainer container 22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24. 
Apr 16 23:27:48.559439 containerd[1656]: time="2026-04-16T23:27:48.559396260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-pm682,Uid:169c0663-4b7e-43a8-8244-c185c534f7be,Namespace:kube-system,Attempt:0,}" Apr 16 23:27:48.564595 containerd[1656]: time="2026-04-16T23:27:48.564544552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56489995f-wm9gf,Uid:50b472ca-47d9-4e73-b333-2bd45cc28f36,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:48.574720 containerd[1656]: time="2026-04-16T23:27:48.574683335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sl2t6,Uid:828b6488-b286-419f-a0a8-9005759cd92d,Namespace:kube-system,Attempt:0,}" Apr 16 23:27:48.584556 containerd[1656]: time="2026-04-16T23:27:48.584511877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-hzvn2,Uid:8ee87be8-da6a-4477-8f26-c0de20ec6969,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:48.589345 containerd[1656]: time="2026-04-16T23:27:48.588848007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-gl7td,Uid:124108c4-421d-4c57-8b62-11516c0f88f5,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:48.595953 containerd[1656]: time="2026-04-16T23:27:48.595893343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-lx7nq,Uid:db7bac57-1d40-4d74-a39c-beae008d27e7,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:48.602788 containerd[1656]: time="2026-04-16T23:27:48.602743239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-75565fcbc4-qbr5q,Uid:2ce7a087-c66f-4222-b574-7deffd6b3781,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:48.668831 containerd[1656]: time="2026-04-16T23:27:48.668777229Z" level=info msg="StartContainer for \"22b5204654f5b412f7cac8cac5363f0d6bb43a38ff30f9d5a5db83ea004ede24\" returns successfully" Apr 16 23:27:48.708490 containerd[1656]: time="2026-04-16T23:27:48.708353559Z" 
level=error msg="Failed to destroy network for sandbox \"dbc3b6191bc76d692de09e024ab34daab63dfee0f4bbbe9b5602ee4a159c197d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.711110 systemd[1]: run-netns-cni\x2de24a0648\x2dd07e\x2d819e\x2d4cdc\x2dc79ba26d8bd3.mount: Deactivated successfully. Apr 16 23:27:48.712504 containerd[1656]: time="2026-04-16T23:27:48.712386928Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-pm682,Uid:169c0663-4b7e-43a8-8244-c185c534f7be,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbc3b6191bc76d692de09e024ab34daab63dfee0f4bbbe9b5602ee4a159c197d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.713109 kubelet[2901]: E0416 23:27:48.712677 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbc3b6191bc76d692de09e024ab34daab63dfee0f4bbbe9b5602ee4a159c197d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.713109 kubelet[2901]: E0416 23:27:48.712775 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbc3b6191bc76d692de09e024ab34daab63dfee0f4bbbe9b5602ee4a159c197d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-pm682" Apr 16 23:27:48.713109 kubelet[2901]: E0416 
23:27:48.712799 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbc3b6191bc76d692de09e024ab34daab63dfee0f4bbbe9b5602ee4a159c197d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-pm682" Apr 16 23:27:48.713432 kubelet[2901]: E0416 23:27:48.712848 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-pm682_kube-system(169c0663-4b7e-43a8-8244-c185c534f7be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-pm682_kube-system(169c0663-4b7e-43a8-8244-c185c534f7be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dbc3b6191bc76d692de09e024ab34daab63dfee0f4bbbe9b5602ee4a159c197d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-pm682" podUID="169c0663-4b7e-43a8-8244-c185c534f7be" Apr 16 23:27:48.722888 containerd[1656]: time="2026-04-16T23:27:48.722816752Z" level=error msg="Failed to destroy network for sandbox \"94437d87f80093b74a1ffeac827618fd07c2a92d53186fa459f08b6d36baf2f7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.725220 containerd[1656]: time="2026-04-16T23:27:48.725161597Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sl2t6,Uid:828b6488-b286-419f-a0a8-9005759cd92d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"94437d87f80093b74a1ffeac827618fd07c2a92d53186fa459f08b6d36baf2f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.725314 systemd[1]: run-netns-cni\x2df48ad36e\x2d4a3b\x2d76a3\x2dcb9a\x2d395e79b8f668.mount: Deactivated successfully. Apr 16 23:27:48.725436 kubelet[2901]: E0416 23:27:48.725390 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94437d87f80093b74a1ffeac827618fd07c2a92d53186fa459f08b6d36baf2f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.725473 kubelet[2901]: E0416 23:27:48.725443 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94437d87f80093b74a1ffeac827618fd07c2a92d53186fa459f08b6d36baf2f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-sl2t6" Apr 16 23:27:48.725473 kubelet[2901]: E0416 23:27:48.725464 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94437d87f80093b74a1ffeac827618fd07c2a92d53186fa459f08b6d36baf2f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-sl2t6" Apr 16 23:27:48.725696 kubelet[2901]: E0416 23:27:48.725511 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-66bc5c9577-sl2t6_kube-system(828b6488-b286-419f-a0a8-9005759cd92d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-sl2t6_kube-system(828b6488-b286-419f-a0a8-9005759cd92d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94437d87f80093b74a1ffeac827618fd07c2a92d53186fa459f08b6d36baf2f7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-sl2t6" podUID="828b6488-b286-419f-a0a8-9005759cd92d" Apr 16 23:27:48.733873 containerd[1656]: time="2026-04-16T23:27:48.733827177Z" level=error msg="Failed to destroy network for sandbox \"b3e38f63c829707b740e01fc63b73914afac41cf118b2f819f45ba691978256a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.737197 containerd[1656]: time="2026-04-16T23:27:48.737134664Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56489995f-wm9gf,Uid:50b472ca-47d9-4e73-b333-2bd45cc28f36,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3e38f63c829707b740e01fc63b73914afac41cf118b2f819f45ba691978256a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.738175 systemd[1]: run-netns-cni\x2ddd217d4e\x2d194c\x2de723\x2d1a06\x2dc0218637ccbb.mount: Deactivated successfully. 
Apr 16 23:27:48.741770 kubelet[2901]: E0416 23:27:48.740528 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3e38f63c829707b740e01fc63b73914afac41cf118b2f819f45ba691978256a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.741770 kubelet[2901]: E0416 23:27:48.740585 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3e38f63c829707b740e01fc63b73914afac41cf118b2f819f45ba691978256a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56489995f-wm9gf" Apr 16 23:27:48.741770 kubelet[2901]: E0416 23:27:48.740604 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3e38f63c829707b740e01fc63b73914afac41cf118b2f819f45ba691978256a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56489995f-wm9gf" Apr 16 23:27:48.741949 kubelet[2901]: E0416 23:27:48.740661 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-56489995f-wm9gf_calico-system(50b472ca-47d9-4e73-b333-2bd45cc28f36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-56489995f-wm9gf_calico-system(50b472ca-47d9-4e73-b333-2bd45cc28f36)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"b3e38f63c829707b740e01fc63b73914afac41cf118b2f819f45ba691978256a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-56489995f-wm9gf" podUID="50b472ca-47d9-4e73-b333-2bd45cc28f36" Apr 16 23:27:48.753043 containerd[1656]: time="2026-04-16T23:27:48.752988220Z" level=error msg="Failed to destroy network for sandbox \"c1bc75be380a9ec68b8ae6f461c8e1e3600983581527430e5caf268b1ac07647\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.756251 systemd[1]: run-netns-cni\x2d0bc81d37\x2d70ae\x2daddc\x2d7fd7\x2d811de7569b57.mount: Deactivated successfully. Apr 16 23:27:48.757423 containerd[1656]: time="2026-04-16T23:27:48.757381870Z" level=error msg="Failed to destroy network for sandbox \"17a56172db109881728e5e3581a1e211d3a7e6b62d03aee8248efaa8d57601e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.758552 containerd[1656]: time="2026-04-16T23:27:48.758468513Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-lx7nq,Uid:db7bac57-1d40-4d74-a39c-beae008d27e7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1bc75be380a9ec68b8ae6f461c8e1e3600983581527430e5caf268b1ac07647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.758934 kubelet[2901]: E0416 23:27:48.758897 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"c1bc75be380a9ec68b8ae6f461c8e1e3600983581527430e5caf268b1ac07647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.760153 kubelet[2901]: E0416 23:27:48.760123 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1bc75be380a9ec68b8ae6f461c8e1e3600983581527430e5caf268b1ac07647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-lx7nq" Apr 16 23:27:48.760271 kubelet[2901]: E0416 23:27:48.760252 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1bc75be380a9ec68b8ae6f461c8e1e3600983581527430e5caf268b1ac07647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-lx7nq" Apr 16 23:27:48.760485 kubelet[2901]: E0416 23:27:48.760372 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-lx7nq_calico-system(db7bac57-1d40-4d74-a39c-beae008d27e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-lx7nq_calico-system(db7bac57-1d40-4d74-a39c-beae008d27e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1bc75be380a9ec68b8ae6f461c8e1e3600983581527430e5caf268b1ac07647\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/goldmane-cccfbd5cf-lx7nq" podUID="db7bac57-1d40-4d74-a39c-beae008d27e7" Apr 16 23:27:48.761024 containerd[1656]: time="2026-04-16T23:27:48.760972879Z" level=error msg="Failed to destroy network for sandbox \"ad0bf0e22f214613cc36262fba5f32e7017bbf97cb862c254009eb571bf276e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.761231 containerd[1656]: time="2026-04-16T23:27:48.761096119Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-gl7td,Uid:124108c4-421d-4c57-8b62-11516c0f88f5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"17a56172db109881728e5e3581a1e211d3a7e6b62d03aee8248efaa8d57601e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.762193 kubelet[2901]: E0416 23:27:48.762155 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17a56172db109881728e5e3581a1e211d3a7e6b62d03aee8248efaa8d57601e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.762265 kubelet[2901]: E0416 23:27:48.762205 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17a56172db109881728e5e3581a1e211d3a7e6b62d03aee8248efaa8d57601e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-5bf56bfb56-gl7td" Apr 16 23:27:48.762265 kubelet[2901]: E0416 23:27:48.762222 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17a56172db109881728e5e3581a1e211d3a7e6b62d03aee8248efaa8d57601e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5bf56bfb56-gl7td" Apr 16 23:27:48.762347 kubelet[2901]: E0416 23:27:48.762267 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5bf56bfb56-gl7td_calico-system(124108c4-421d-4c57-8b62-11516c0f88f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5bf56bfb56-gl7td_calico-system(124108c4-421d-4c57-8b62-11516c0f88f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17a56172db109881728e5e3581a1e211d3a7e6b62d03aee8248efaa8d57601e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5bf56bfb56-gl7td" podUID="124108c4-421d-4c57-8b62-11516c0f88f5" Apr 16 23:27:48.763238 containerd[1656]: time="2026-04-16T23:27:48.763144924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-hzvn2,Uid:8ee87be8-da6a-4477-8f26-c0de20ec6969,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0bf0e22f214613cc36262fba5f32e7017bbf97cb862c254009eb571bf276e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.763345 
kubelet[2901]: E0416 23:27:48.763314 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0bf0e22f214613cc36262fba5f32e7017bbf97cb862c254009eb571bf276e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.763801 kubelet[2901]: E0416 23:27:48.763360 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0bf0e22f214613cc36262fba5f32e7017bbf97cb862c254009eb571bf276e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5bf56bfb56-hzvn2" Apr 16 23:27:48.763801 kubelet[2901]: E0416 23:27:48.763376 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad0bf0e22f214613cc36262fba5f32e7017bbf97cb862c254009eb571bf276e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5bf56bfb56-hzvn2" Apr 16 23:27:48.763801 kubelet[2901]: E0416 23:27:48.763458 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5bf56bfb56-hzvn2_calico-system(8ee87be8-da6a-4477-8f26-c0de20ec6969)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5bf56bfb56-hzvn2_calico-system(8ee87be8-da6a-4477-8f26-c0de20ec6969)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad0bf0e22f214613cc36262fba5f32e7017bbf97cb862c254009eb571bf276e4\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5bf56bfb56-hzvn2" podUID="8ee87be8-da6a-4477-8f26-c0de20ec6969" Apr 16 23:27:48.772365 containerd[1656]: time="2026-04-16T23:27:48.772316664Z" level=error msg="Failed to destroy network for sandbox \"9c5e18dc477e2c9a51fb8f8fb948673a8f0710de9e2cd773ccf2d59992425f95\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.774082 containerd[1656]: time="2026-04-16T23:27:48.774045388Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-75565fcbc4-qbr5q,Uid:2ce7a087-c66f-4222-b574-7deffd6b3781,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c5e18dc477e2c9a51fb8f8fb948673a8f0710de9e2cd773ccf2d59992425f95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.774443 kubelet[2901]: E0416 23:27:48.774400 2901 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c5e18dc477e2c9a51fb8f8fb948673a8f0710de9e2cd773ccf2d59992425f95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 16 23:27:48.774500 kubelet[2901]: E0416 23:27:48.774460 2901 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c5e18dc477e2c9a51fb8f8fb948673a8f0710de9e2cd773ccf2d59992425f95\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-75565fcbc4-qbr5q" Apr 16 23:27:48.774500 kubelet[2901]: E0416 23:27:48.774486 2901 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c5e18dc477e2c9a51fb8f8fb948673a8f0710de9e2cd773ccf2d59992425f95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-75565fcbc4-qbr5q" Apr 16 23:27:48.774561 kubelet[2901]: E0416 23:27:48.774533 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-75565fcbc4-qbr5q_calico-system(2ce7a087-c66f-4222-b574-7deffd6b3781)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-75565fcbc4-qbr5q_calico-system(2ce7a087-c66f-4222-b574-7deffd6b3781)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c5e18dc477e2c9a51fb8f8fb948673a8f0710de9e2cd773ccf2d59992425f95\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-75565fcbc4-qbr5q" podUID="2ce7a087-c66f-4222-b574-7deffd6b3781" Apr 16 23:27:49.370104 systemd[1]: Created slice kubepods-besteffort-pod8cac77ff_82cd_4c79_8746_89081cc748b0.slice - libcontainer container kubepods-besteffort-pod8cac77ff_82cd_4c79_8746_89081cc748b0.slice. 
Apr 16 23:27:49.375479 containerd[1656]: time="2026-04-16T23:27:49.375356596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zhpzf,Uid:8cac77ff-82cd-4c79-8746-89081cc748b0,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:49.526959 kubelet[2901]: I0416 23:27:49.526894 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-z86qk" podStartSLOduration=2.763568856 podStartE2EDuration="12.526879021s" podCreationTimestamp="2026-04-16 23:27:37 +0000 UTC" firstStartedPulling="2026-04-16 23:27:37.849994183 +0000 UTC m=+19.587728949" lastFinishedPulling="2026-04-16 23:27:47.613304348 +0000 UTC m=+29.351039114" observedRunningTime="2026-04-16 23:27:49.525842098 +0000 UTC m=+31.263576864" watchObservedRunningTime="2026-04-16 23:27:49.526879021 +0000 UTC m=+31.264613787" Apr 16 23:27:49.553447 kubelet[2901]: I0416 23:27:49.553401 2901 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-nginx-config\") pod \"2ce7a087-c66f-4222-b574-7deffd6b3781\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " Apr 16 23:27:49.553614 kubelet[2901]: I0416 23:27:49.553465 2901 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-backend-key-pair\") pod \"2ce7a087-c66f-4222-b574-7deffd6b3781\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " Apr 16 23:27:49.553614 kubelet[2901]: I0416 23:27:49.553491 2901 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnhn4\" (UniqueName: \"kubernetes.io/projected/2ce7a087-c66f-4222-b574-7deffd6b3781-kube-api-access-dnhn4\") pod \"2ce7a087-c66f-4222-b574-7deffd6b3781\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " Apr 16 23:27:49.553614 kubelet[2901]: I0416 23:27:49.553527 2901 
reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-ca-bundle\") pod \"2ce7a087-c66f-4222-b574-7deffd6b3781\" (UID: \"2ce7a087-c66f-4222-b574-7deffd6b3781\") " Apr 16 23:27:49.554008 kubelet[2901]: I0416 23:27:49.553710 2901 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "2ce7a087-c66f-4222-b574-7deffd6b3781" (UID: "2ce7a087-c66f-4222-b574-7deffd6b3781"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:27:49.554351 kubelet[2901]: I0416 23:27:49.554316 2901 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2ce7a087-c66f-4222-b574-7deffd6b3781" (UID: "2ce7a087-c66f-4222-b574-7deffd6b3781"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:27:49.556244 kubelet[2901]: I0416 23:27:49.556206 2901 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2ce7a087-c66f-4222-b574-7deffd6b3781" (UID: "2ce7a087-c66f-4222-b574-7deffd6b3781"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:27:49.556542 kubelet[2901]: I0416 23:27:49.556511 2901 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce7a087-c66f-4222-b574-7deffd6b3781-kube-api-access-dnhn4" (OuterVolumeSpecName: "kube-api-access-dnhn4") pod "2ce7a087-c66f-4222-b574-7deffd6b3781" (UID: "2ce7a087-c66f-4222-b574-7deffd6b3781"). 
InnerVolumeSpecName "kube-api-access-dnhn4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:27:49.638082 systemd[1]: run-netns-cni\x2d8fa5b97d\x2db780\x2db932\x2dbb7e\x2d1380534c51dc.mount: Deactivated successfully. Apr 16 23:27:49.638164 systemd[1]: run-netns-cni\x2d2d37e347\x2d7c66\x2df706\x2db2df\x2d3646a5f73831.mount: Deactivated successfully. Apr 16 23:27:49.638227 systemd[1]: run-netns-cni\x2d20344209\x2da0f5\x2d61c7\x2d16bf\x2d7d6dc7dc45b8.mount: Deactivated successfully. Apr 16 23:27:49.638274 systemd[1]: var-lib-kubelet-pods-2ce7a087\x2dc66f\x2d4222\x2db574\x2d7deffd6b3781-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddnhn4.mount: Deactivated successfully. Apr 16 23:27:49.638333 systemd[1]: var-lib-kubelet-pods-2ce7a087\x2dc66f\x2d4222\x2db574\x2d7deffd6b3781-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Apr 16 23:27:49.654315 kubelet[2901]: I0416 23:27:49.654243 2901 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-ca-bundle\") on node \"ci-4459-2-4-n-b2725589f5\" DevicePath \"\"" Apr 16 23:27:49.654315 kubelet[2901]: I0416 23:27:49.654311 2901 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2ce7a087-c66f-4222-b574-7deffd6b3781-nginx-config\") on node \"ci-4459-2-4-n-b2725589f5\" DevicePath \"\"" Apr 16 23:27:49.654432 kubelet[2901]: I0416 23:27:49.654330 2901 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2ce7a087-c66f-4222-b574-7deffd6b3781-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-b2725589f5\" DevicePath \"\"" Apr 16 23:27:49.654432 kubelet[2901]: I0416 23:27:49.654339 2901 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnhn4\" (UniqueName: 
\"kubernetes.io/projected/2ce7a087-c66f-4222-b574-7deffd6b3781-kube-api-access-dnhn4\") on node \"ci-4459-2-4-n-b2725589f5\" DevicePath \"\"" Apr 16 23:27:49.678693 systemd-networkd[1517]: calif2084682417: Link UP Apr 16 23:27:49.678882 systemd-networkd[1517]: calif2084682417: Gained carrier Apr 16 23:27:49.695506 containerd[1656]: 2026-04-16 23:27:49.396 [ERROR][4030] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 16 23:27:49.695506 containerd[1656]: 2026-04-16 23:27:49.448 [INFO][4030] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0 csi-node-driver- calico-system 8cac77ff-82cd-4c79-8746-89081cc748b0 700 0 2026-04-16 23:27:37 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-b2725589f5 csi-node-driver-zhpzf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif2084682417 [] [] }} ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-" Apr 16 23:27:49.695506 containerd[1656]: 2026-04-16 23:27:49.448 [INFO][4030] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.695506 containerd[1656]: 2026-04-16 23:27:49.498 [INFO][4043] 
ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" HandleID="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Workload="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.511 [INFO][4043] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" HandleID="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Workload="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000515790), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"csi-node-driver-zhpzf", "timestamp":"2026-04-16 23:27:49.498752077 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000cadc0)} Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.511 [INFO][4043] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.512 [INFO][4043] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.512 [INFO][4043] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.517 [INFO][4043] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.523 [INFO][4043] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.531 [INFO][4043] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.533 [INFO][4043] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.695924 containerd[1656]: 2026-04-16 23:27:49.536 [INFO][4043] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.536 [INFO][4043] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.538 [INFO][4043] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6 Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.542 [INFO][4043] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.548 [INFO][4043] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.114.65/26] block=192.168.114.64/26 handle="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.548 [INFO][4043] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.65/26] handle="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.548 [INFO][4043] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 23:27:49.696111 containerd[1656]: 2026-04-16 23:27:49.548 [INFO][4043] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.65/26] IPv6=[] ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" HandleID="k8s-pod-network.a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Workload="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.696231 containerd[1656]: 2026-04-16 23:27:49.550 [INFO][4030] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8cac77ff-82cd-4c79-8746-89081cc748b0", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"csi-node-driver-zhpzf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.114.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif2084682417", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:27:49.696282 containerd[1656]: 2026-04-16 23:27:49.551 [INFO][4030] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.65/32] ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.696282 containerd[1656]: 2026-04-16 23:27:49.551 [INFO][4030] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2084682417 ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.696282 containerd[1656]: 2026-04-16 23:27:49.679 [INFO][4030] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.696338 
containerd[1656]: 2026-04-16 23:27:49.680 [INFO][4030] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8cac77ff-82cd-4c79-8746-89081cc748b0", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6", Pod:"csi-node-driver-zhpzf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.114.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif2084682417", MAC:"1e:54:5e:b2:82:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:27:49.696385 containerd[1656]: 
2026-04-16 23:27:49.693 [INFO][4030] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" Namespace="calico-system" Pod="csi-node-driver-zhpzf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-csi--node--driver--zhpzf-eth0" Apr 16 23:27:49.716178 containerd[1656]: time="2026-04-16T23:27:49.716124331Z" level=info msg="connecting to shim a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6" address="unix:///run/containerd/s/30f2760014ff9ce216bed896d1f091f7ac4a8f0f8f8d87806c42db3ab245c7f5" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:49.739994 systemd[1]: Started cri-containerd-a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6.scope - libcontainer container a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6. Apr 16 23:27:49.763513 containerd[1656]: time="2026-04-16T23:27:49.763460759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zhpzf,Uid:8cac77ff-82cd-4c79-8746-89081cc748b0,Namespace:calico-system,Attempt:0,} returns sandbox id \"a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6\"" Apr 16 23:27:49.765065 containerd[1656]: time="2026-04-16T23:27:49.765036642Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 16 23:27:50.372270 systemd[1]: Removed slice kubepods-besteffort-pod2ce7a087_c66f_4222_b574_7deffd6b3781.slice - libcontainer container kubepods-besteffort-pod2ce7a087_c66f_4222_b574_7deffd6b3781.slice. Apr 16 23:27:50.488643 kubelet[2901]: I0416 23:27:50.488146 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:27:50.564034 systemd[1]: Created slice kubepods-besteffort-pod73d08be4_71b4_48bd_8b60_f57c7a8b9866.slice - libcontainer container kubepods-besteffort-pod73d08be4_71b4_48bd_8b60_f57c7a8b9866.slice. 
Apr 16 23:27:50.602322 systemd-networkd[1517]: vxlan.calico: Link UP Apr 16 23:27:50.604399 systemd-networkd[1517]: vxlan.calico: Gained carrier Apr 16 23:27:50.661301 kubelet[2901]: I0416 23:27:50.661225 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/73d08be4-71b4-48bd-8b60-f57c7a8b9866-nginx-config\") pod \"whisker-79f7858647-n48n2\" (UID: \"73d08be4-71b4-48bd-8b60-f57c7a8b9866\") " pod="calico-system/whisker-79f7858647-n48n2" Apr 16 23:27:50.661301 kubelet[2901]: I0416 23:27:50.661279 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73d08be4-71b4-48bd-8b60-f57c7a8b9866-whisker-ca-bundle\") pod \"whisker-79f7858647-n48n2\" (UID: \"73d08be4-71b4-48bd-8b60-f57c7a8b9866\") " pod="calico-system/whisker-79f7858647-n48n2" Apr 16 23:27:50.661301 kubelet[2901]: I0416 23:27:50.661308 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5ls\" (UniqueName: \"kubernetes.io/projected/73d08be4-71b4-48bd-8b60-f57c7a8b9866-kube-api-access-kt5ls\") pod \"whisker-79f7858647-n48n2\" (UID: \"73d08be4-71b4-48bd-8b60-f57c7a8b9866\") " pod="calico-system/whisker-79f7858647-n48n2" Apr 16 23:27:50.661629 kubelet[2901]: I0416 23:27:50.661332 2901 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/73d08be4-71b4-48bd-8b60-f57c7a8b9866-whisker-backend-key-pair\") pod \"whisker-79f7858647-n48n2\" (UID: \"73d08be4-71b4-48bd-8b60-f57c7a8b9866\") " pod="calico-system/whisker-79f7858647-n48n2" Apr 16 23:27:50.873552 containerd[1656]: time="2026-04-16T23:27:50.873485403Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-79f7858647-n48n2,Uid:73d08be4-71b4-48bd-8b60-f57c7a8b9866,Namespace:calico-system,Attempt:0,}" Apr 16 23:27:50.999299 systemd-networkd[1517]: cali2851f23d8b5: Link UP Apr 16 23:27:50.999498 systemd-networkd[1517]: cali2851f23d8b5: Gained carrier Apr 16 23:27:51.017673 containerd[1656]: 2026-04-16 23:27:50.911 [INFO][4325] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0 whisker-79f7858647- calico-system 73d08be4-71b4-48bd-8b60-f57c7a8b9866 871 0 2026-04-16 23:27:50 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79f7858647 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-b2725589f5 whisker-79f7858647-n48n2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2851f23d8b5 [] [] }} ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-" Apr 16 23:27:51.017673 containerd[1656]: 2026-04-16 23:27:50.911 [INFO][4325] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.017673 containerd[1656]: 2026-04-16 23:27:50.940 [INFO][4341] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" HandleID="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Workload="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.018143 
containerd[1656]: 2026-04-16 23:27:50.952 [INFO][4341] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" HandleID="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Workload="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400040edc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"whisker-79f7858647-n48n2", "timestamp":"2026-04-16 23:27:50.940925797 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004086e0)} Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.952 [INFO][4341] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.953 [INFO][4341] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.953 [INFO][4341] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.957 [INFO][4341] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.963 [INFO][4341] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.970 [INFO][4341] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.973 [INFO][4341] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018143 containerd[1656]: 2026-04-16 23:27:50.976 [INFO][4341] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.977 [INFO][4341] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.979 [INFO][4341] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.985 [INFO][4341] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.994 [INFO][4341] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.114.66/26] block=192.168.114.64/26 handle="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.994 [INFO][4341] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.66/26] handle="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.994 [INFO][4341] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 23:27:51.018326 containerd[1656]: 2026-04-16 23:27:50.994 [INFO][4341] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.66/26] IPv6=[] ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" HandleID="k8s-pod-network.96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Workload="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.018448 containerd[1656]: 2026-04-16 23:27:50.996 [INFO][4325] cni-plugin/k8s.go 418: Populated endpoint ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0", GenerateName:"whisker-79f7858647-", Namespace:"calico-system", SelfLink:"", UID:"73d08be4-71b4-48bd-8b60-f57c7a8b9866", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79f7858647", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"whisker-79f7858647-n48n2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.114.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2851f23d8b5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:27:51.018448 containerd[1656]: 2026-04-16 23:27:50.997 [INFO][4325] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.66/32] ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.018515 containerd[1656]: 2026-04-16 23:27:50.997 [INFO][4325] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2851f23d8b5 ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.018515 containerd[1656]: 2026-04-16 23:27:50.999 [INFO][4325] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.018554 containerd[1656]: 2026-04-16 23:27:51.000 [INFO][4325] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0", GenerateName:"whisker-79f7858647-", Namespace:"calico-system", SelfLink:"", UID:"73d08be4-71b4-48bd-8b60-f57c7a8b9866", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79f7858647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a", Pod:"whisker-79f7858647-n48n2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.114.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2851f23d8b5", MAC:"be:8b:36:30:7f:9b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:27:51.018596 containerd[1656]: 2026-04-16 23:27:51.013 [INFO][4325] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" Namespace="calico-system" Pod="whisker-79f7858647-n48n2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-whisker--79f7858647--n48n2-eth0" Apr 16 23:27:51.049706 containerd[1656]: time="2026-04-16T23:27:51.049654844Z" level=info msg="connecting to shim 96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a" address="unix:///run/containerd/s/2b90b4d32bf7ae84bfba56cffbeb8a7ed9f7cc7c09655107829865160fec55a6" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:27:51.058213 systemd-networkd[1517]: calif2084682417: Gained IPv6LL Apr 16 23:27:51.097938 systemd[1]: Started cri-containerd-96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a.scope - libcontainer container 96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a. Apr 16 23:27:51.149600 containerd[1656]: time="2026-04-16T23:27:51.149547111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79f7858647-n48n2,Uid:73d08be4-71b4-48bd-8b60-f57c7a8b9866,Namespace:calico-system,Attempt:0,} returns sandbox id \"96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a\"" Apr 16 23:27:51.149862 containerd[1656]: time="2026-04-16T23:27:51.149827312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Apr 16 23:27:51.150919 containerd[1656]: time="2026-04-16T23:27:51.150238913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:51.152579 containerd[1656]: time="2026-04-16T23:27:51.152550318Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:51.154241 containerd[1656]: time="2026-04-16T23:27:51.154210882Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:51.154928 containerd[1656]: time="2026-04-16T23:27:51.154902203Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 1.389833441s" Apr 16 23:27:51.155023 containerd[1656]: time="2026-04-16T23:27:51.155006564Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Apr 16 23:27:51.156122 containerd[1656]: time="2026-04-16T23:27:51.156095686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 16 23:27:51.160328 containerd[1656]: time="2026-04-16T23:27:51.160291416Z" level=info msg="CreateContainer within sandbox \"a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 16 23:27:51.172942 containerd[1656]: time="2026-04-16T23:27:51.172895324Z" level=info msg="Container d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:51.181916 containerd[1656]: time="2026-04-16T23:27:51.181847425Z" level=info msg="CreateContainer within sandbox \"a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f\"" Apr 16 23:27:51.183042 containerd[1656]: time="2026-04-16T23:27:51.183014627Z" level=info msg="StartContainer for \"d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f\"" Apr 16 
23:27:51.184648 containerd[1656]: time="2026-04-16T23:27:51.184620111Z" level=info msg="connecting to shim d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f" address="unix:///run/containerd/s/30f2760014ff9ce216bed896d1f091f7ac4a8f0f8f8d87806c42db3ab245c7f5" protocol=ttrpc version=3 Apr 16 23:27:51.202915 systemd[1]: Started cri-containerd-d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f.scope - libcontainer container d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f. Apr 16 23:27:51.262074 containerd[1656]: time="2026-04-16T23:27:51.261972647Z" level=info msg="StartContainer for \"d85c037b29d6013b5a92045672e73625f1826d96f2cf0c1f1f9f45d77e0ad65f\" returns successfully" Apr 16 23:27:52.081861 systemd-networkd[1517]: vxlan.calico: Gained IPv6LL Apr 16 23:27:52.368557 kubelet[2901]: I0416 23:27:52.368334 2901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce7a087-c66f-4222-b574-7deffd6b3781" path="/var/lib/kubelet/pods/2ce7a087-c66f-4222-b574-7deffd6b3781/volumes" Apr 16 23:27:52.424760 containerd[1656]: time="2026-04-16T23:27:52.424328570Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:52.426336 containerd[1656]: time="2026-04-16T23:27:52.426296455Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Apr 16 23:27:52.427566 containerd[1656]: time="2026-04-16T23:27:52.427540178Z" level=info msg="ImageCreate event name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:52.430046 containerd[1656]: time="2026-04-16T23:27:52.430003023Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
Apr 16 23:27:52.430744 containerd[1656]: time="2026-04-16T23:27:52.430706065Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.274578739s" Apr 16 23:27:52.430864 containerd[1656]: time="2026-04-16T23:27:52.430823905Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Apr 16 23:27:52.432047 containerd[1656]: time="2026-04-16T23:27:52.431884668Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 16 23:27:52.435706 containerd[1656]: time="2026-04-16T23:27:52.435666316Z" level=info msg="CreateContainer within sandbox \"96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 16 23:27:52.447891 containerd[1656]: time="2026-04-16T23:27:52.447043102Z" level=info msg="Container acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:52.458284 containerd[1656]: time="2026-04-16T23:27:52.458229768Z" level=info msg="CreateContainer within sandbox \"96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7\"" Apr 16 23:27:52.458963 containerd[1656]: time="2026-04-16T23:27:52.458936729Z" level=info msg="StartContainer for \"acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7\"" Apr 16 23:27:52.460811 containerd[1656]: time="2026-04-16T23:27:52.460774293Z" level=info msg="connecting to shim 
acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7" address="unix:///run/containerd/s/2b90b4d32bf7ae84bfba56cffbeb8a7ed9f7cc7c09655107829865160fec55a6" protocol=ttrpc version=3 Apr 16 23:27:52.482473 systemd[1]: Started cri-containerd-acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7.scope - libcontainer container acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7. Apr 16 23:27:52.521268 containerd[1656]: time="2026-04-16T23:27:52.521231431Z" level=info msg="StartContainer for \"acaf947456de4dc5480819d9c5af4b4a0e92cad0edaaae2376a666e704df1ed7\" returns successfully" Apr 16 23:27:52.786051 systemd-networkd[1517]: cali2851f23d8b5: Gained IPv6LL Apr 16 23:27:53.801754 containerd[1656]: time="2026-04-16T23:27:53.801586903Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:53.802956 containerd[1656]: time="2026-04-16T23:27:53.802918986Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Apr 16 23:27:53.804953 containerd[1656]: time="2026-04-16T23:27:53.804912390Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:53.808395 containerd[1656]: time="2026-04-16T23:27:53.808319198Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:53.810178 containerd[1656]: time="2026-04-16T23:27:53.809244360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.377331692s" Apr 16 23:27:53.810178 containerd[1656]: time="2026-04-16T23:27:53.809279360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Apr 16 23:27:53.815948 containerd[1656]: time="2026-04-16T23:27:53.815896095Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 16 23:27:53.822694 containerd[1656]: time="2026-04-16T23:27:53.822646471Z" level=info msg="CreateContainer within sandbox \"a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 16 23:27:53.844383 containerd[1656]: time="2026-04-16T23:27:53.844328600Z" level=info msg="Container a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:53.857257 containerd[1656]: time="2026-04-16T23:27:53.857185629Z" level=info msg="CreateContainer within sandbox \"a753926917643b9cb5d250fd0a42d5e24f6918c7ba005404baee1194bcbdd2e6\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde\"" Apr 16 23:27:53.858771 containerd[1656]: time="2026-04-16T23:27:53.857688470Z" level=info msg="StartContainer for \"a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde\"" Apr 16 23:27:53.861679 containerd[1656]: time="2026-04-16T23:27:53.861630999Z" level=info msg="connecting to shim a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde" address="unix:///run/containerd/s/30f2760014ff9ce216bed896d1f091f7ac4a8f0f8f8d87806c42db3ab245c7f5" protocol=ttrpc version=3 Apr 16 23:27:53.898026 
systemd[1]: Started cri-containerd-a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde.scope - libcontainer container a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde. Apr 16 23:27:53.956280 containerd[1656]: time="2026-04-16T23:27:53.956212375Z" level=info msg="StartContainer for \"a17f3fcd8330932040dd7412947fb12de0623053c752b0f87bb90b3efc763bde\" returns successfully" Apr 16 23:27:54.412231 kubelet[2901]: I0416 23:27:54.412199 2901 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 16 23:27:54.412231 kubelet[2901]: I0416 23:27:54.412232 2901 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 16 23:27:54.521741 kubelet[2901]: I0416 23:27:54.521676 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zhpzf" podStartSLOduration=13.471872411 podStartE2EDuration="17.521641781s" podCreationTimestamp="2026-04-16 23:27:37 +0000 UTC" firstStartedPulling="2026-04-16 23:27:49.764710562 +0000 UTC m=+31.502445328" lastFinishedPulling="2026-04-16 23:27:53.814479932 +0000 UTC m=+35.552214698" observedRunningTime="2026-04-16 23:27:54.5214857 +0000 UTC m=+36.259220426" watchObservedRunningTime="2026-04-16 23:27:54.521641781 +0000 UTC m=+36.259376587" Apr 16 23:27:55.262078 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2503525855.mount: Deactivated successfully. 
Apr 16 23:27:55.283559 containerd[1656]: time="2026-04-16T23:27:55.283499873Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:55.284764 containerd[1656]: time="2026-04-16T23:27:55.284371555Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Apr 16 23:27:55.285499 containerd[1656]: time="2026-04-16T23:27:55.285468358Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:55.288101 containerd[1656]: time="2026-04-16T23:27:55.288063804Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:27:55.288857 containerd[1656]: time="2026-04-16T23:27:55.288818285Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 1.472739869s" Apr 16 23:27:55.288923 containerd[1656]: time="2026-04-16T23:27:55.288857645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Apr 16 23:27:55.298786 containerd[1656]: time="2026-04-16T23:27:55.298743748Z" level=info msg="CreateContainer within sandbox \"96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 16 23:27:55.306917 
containerd[1656]: time="2026-04-16T23:27:55.306865566Z" level=info msg="Container f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:27:55.315753 containerd[1656]: time="2026-04-16T23:27:55.315692467Z" level=info msg="CreateContainer within sandbox \"96b1e2c12ca3169ed11a1ea1c1893b24a953afdbb45089f6cf5bba4bc3804d3a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429\"" Apr 16 23:27:55.316320 containerd[1656]: time="2026-04-16T23:27:55.316252508Z" level=info msg="StartContainer for \"f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429\"" Apr 16 23:27:55.317642 containerd[1656]: time="2026-04-16T23:27:55.317607471Z" level=info msg="connecting to shim f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429" address="unix:///run/containerd/s/2b90b4d32bf7ae84bfba56cffbeb8a7ed9f7cc7c09655107829865160fec55a6" protocol=ttrpc version=3 Apr 16 23:27:55.338898 systemd[1]: Started cri-containerd-f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429.scope - libcontainer container f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429. 
Apr 16 23:27:55.380161 containerd[1656]: time="2026-04-16T23:27:55.380121133Z" level=info msg="StartContainer for \"f3765c7e66764dfccfbd9265d74da63d2b91dadb6d3c47152fc276b1a34ac429\" returns successfully" Apr 16 23:27:55.528528 kubelet[2901]: I0416 23:27:55.528388 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-79f7858647-n48n2" podStartSLOduration=1.3888334150000001 podStartE2EDuration="5.52837355s" podCreationTimestamp="2026-04-16 23:27:50 +0000 UTC" firstStartedPulling="2026-04-16 23:27:51.152696598 +0000 UTC m=+32.890431364" lastFinishedPulling="2026-04-16 23:27:55.292236733 +0000 UTC m=+37.029971499" observedRunningTime="2026-04-16 23:27:55.527621549 +0000 UTC m=+37.265356355" watchObservedRunningTime="2026-04-16 23:27:55.52837355 +0000 UTC m=+37.266108276" Apr 16 23:27:56.322130 kubelet[2901]: I0416 23:27:56.321950 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:28:00.368158 containerd[1656]: time="2026-04-16T23:28:00.368055437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-hzvn2,Uid:8ee87be8-da6a-4477-8f26-c0de20ec6969,Namespace:calico-system,Attempt:0,}" Apr 16 23:28:00.482297 systemd-networkd[1517]: cali34d54ab0629: Link UP Apr 16 23:28:00.485182 systemd-networkd[1517]: cali34d54ab0629: Gained carrier Apr 16 23:28:00.499899 containerd[1656]: 2026-04-16 23:28:00.409 [INFO][4661] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0 calico-apiserver-5bf56bfb56- calico-system 8ee87be8-da6a-4477-8f26-c0de20ec6969 814 0 2026-04-16 23:27:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5bf56bfb56 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} 
{k8s ci-4459-2-4-n-b2725589f5 calico-apiserver-5bf56bfb56-hzvn2 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali34d54ab0629 [] [] }} ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-" Apr 16 23:28:00.499899 containerd[1656]: 2026-04-16 23:28:00.409 [INFO][4661] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.499899 containerd[1656]: 2026-04-16 23:28:00.432 [INFO][4671] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" HandleID="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.443 [INFO][4671] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" HandleID="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f96d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"calico-apiserver-5bf56bfb56-hzvn2", "timestamp":"2026-04-16 23:28:00.432756945 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40003c9080)} Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.443 [INFO][4671] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.443 [INFO][4671] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.443 [INFO][4671] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.446 [INFO][4671] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.451 [INFO][4671] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.457 [INFO][4671] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.459 [INFO][4671] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500096 containerd[1656]: 2026-04-16 23:28:00.461 [INFO][4671] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.461 [INFO][4671] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.464 [INFO][4671] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03 Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.470 [INFO][4671] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.478 [INFO][4671] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.114.67/26] block=192.168.114.64/26 handle="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.478 [INFO][4671] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.67/26] handle="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.478 [INFO][4671] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 16 23:28:00.500290 containerd[1656]: 2026-04-16 23:28:00.478 [INFO][4671] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.67/26] IPv6=[] ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" HandleID="k8s-pod-network.c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.500413 containerd[1656]: 2026-04-16 23:28:00.480 [INFO][4661] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0", GenerateName:"calico-apiserver-5bf56bfb56-", Namespace:"calico-system", SelfLink:"", UID:"8ee87be8-da6a-4477-8f26-c0de20ec6969", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf56bfb56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"calico-apiserver-5bf56bfb56-hzvn2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.114.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali34d54ab0629", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:00.502119 containerd[1656]: 2026-04-16 23:28:00.480 [INFO][4661] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.67/32] ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.502119 containerd[1656]: 2026-04-16 23:28:00.480 [INFO][4661] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali34d54ab0629 ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.502119 containerd[1656]: 2026-04-16 23:28:00.483 [INFO][4661] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.502209 containerd[1656]: 2026-04-16 23:28:00.485 [INFO][4661] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0", GenerateName:"calico-apiserver-5bf56bfb56-", Namespace:"calico-system", SelfLink:"", UID:"8ee87be8-da6a-4477-8f26-c0de20ec6969", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf56bfb56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03", Pod:"calico-apiserver-5bf56bfb56-hzvn2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali34d54ab0629", MAC:"4e:b3:89:7e:ec:c7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:00.502316 containerd[1656]: 2026-04-16 23:28:00.495 [INFO][4661] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-hzvn2" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--hzvn2-eth0" Apr 16 23:28:00.529966 containerd[1656]: time="2026-04-16T23:28:00.529912286Z" level=info 
msg="connecting to shim c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03" address="unix:///run/containerd/s/d274bfd223f6d29c402d7ff8260b8a7c70fef4116c100d6fe6dc373941e555e3" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:28:00.556897 systemd[1]: Started cri-containerd-c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03.scope - libcontainer container c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03. Apr 16 23:28:00.587296 containerd[1656]: time="2026-04-16T23:28:00.587223656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-hzvn2,Uid:8ee87be8-da6a-4477-8f26-c0de20ec6969,Namespace:calico-system,Attempt:0,} returns sandbox id \"c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03\"" Apr 16 23:28:00.589833 containerd[1656]: time="2026-04-16T23:28:00.589792062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 16 23:28:01.366362 containerd[1656]: time="2026-04-16T23:28:01.366246108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-gl7td,Uid:124108c4-421d-4c57-8b62-11516c0f88f5,Namespace:calico-system,Attempt:0,}" Apr 16 23:28:01.479232 systemd-networkd[1517]: cali1008942f2a9: Link UP Apr 16 23:28:01.479589 systemd-networkd[1517]: cali1008942f2a9: Gained carrier Apr 16 23:28:01.497912 containerd[1656]: 2026-04-16 23:28:01.405 [INFO][4761] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0 calico-apiserver-5bf56bfb56- calico-system 124108c4-421d-4c57-8b62-11516c0f88f5 810 0 2026-04-16 23:27:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5bf56bfb56 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s 
ci-4459-2-4-n-b2725589f5 calico-apiserver-5bf56bfb56-gl7td eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1008942f2a9 [] [] }} ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-" Apr 16 23:28:01.497912 containerd[1656]: 2026-04-16 23:28:01.405 [INFO][4761] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.497912 containerd[1656]: 2026-04-16 23:28:01.428 [INFO][4776] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" HandleID="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.439 [INFO][4776] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" HandleID="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000503cb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"calico-apiserver-5bf56bfb56-gl7td", "timestamp":"2026-04-16 23:28:01.42881509 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004e8160)} Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.439 [INFO][4776] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.439 [INFO][4776] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.439 [INFO][4776] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.442 [INFO][4776] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.447 [INFO][4776] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.455 [INFO][4776] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.457 [INFO][4776] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498342 containerd[1656]: 2026-04-16 23:28:01.460 [INFO][4776] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.460 [INFO][4776] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.462 [INFO][4776] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.467 [INFO][4776] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.474 [INFO][4776] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.114.68/26] block=192.168.114.64/26 handle="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.474 [INFO][4776] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.68/26] handle="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.475 [INFO][4776] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 16 23:28:01.498531 containerd[1656]: 2026-04-16 23:28:01.475 [INFO][4776] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.68/26] IPv6=[] ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" HandleID="k8s-pod-network.0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.498652 containerd[1656]: 2026-04-16 23:28:01.476 [INFO][4761] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0", GenerateName:"calico-apiserver-5bf56bfb56-", Namespace:"calico-system", SelfLink:"", UID:"124108c4-421d-4c57-8b62-11516c0f88f5", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf56bfb56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"calico-apiserver-5bf56bfb56-gl7td", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.114.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1008942f2a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:01.498698 containerd[1656]: 2026-04-16 23:28:01.476 [INFO][4761] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.68/32] ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.498698 containerd[1656]: 2026-04-16 23:28:01.477 [INFO][4761] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1008942f2a9 ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.498698 containerd[1656]: 2026-04-16 23:28:01.480 [INFO][4761] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.498801 containerd[1656]: 2026-04-16 23:28:01.480 [INFO][4761] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0", GenerateName:"calico-apiserver-5bf56bfb56-", Namespace:"calico-system", SelfLink:"", UID:"124108c4-421d-4c57-8b62-11516c0f88f5", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf56bfb56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c", Pod:"calico-apiserver-5bf56bfb56-gl7td", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1008942f2a9", MAC:"8a:74:93:14:9e:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:01.498864 containerd[1656]: 2026-04-16 23:28:01.495 [INFO][4761] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" Namespace="calico-system" Pod="calico-apiserver-5bf56bfb56-gl7td" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--apiserver--5bf56bfb56--gl7td-eth0" Apr 16 23:28:01.519598 containerd[1656]: time="2026-04-16T23:28:01.519230976Z" level=info 
msg="connecting to shim 0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c" address="unix:///run/containerd/s/924024d1d5d8cbeae7e3a5ca7c7fe5ed8681b91f79c23af8929c32078d683760" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:28:01.544894 systemd[1]: Started cri-containerd-0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c.scope - libcontainer container 0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c. Apr 16 23:28:01.575621 containerd[1656]: time="2026-04-16T23:28:01.575581024Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf56bfb56-gl7td,Uid:124108c4-421d-4c57-8b62-11516c0f88f5,Namespace:calico-system,Attempt:0,} returns sandbox id \"0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c\"" Apr 16 23:28:02.368702 containerd[1656]: time="2026-04-16T23:28:02.368658588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56489995f-wm9gf,Uid:50b472ca-47d9-4e73-b333-2bd45cc28f36,Namespace:calico-system,Attempt:0,}" Apr 16 23:28:02.371224 containerd[1656]: time="2026-04-16T23:28:02.371055193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-lx7nq,Uid:db7bac57-1d40-4d74-a39c-beae008d27e7,Namespace:calico-system,Attempt:0,}" Apr 16 23:28:02.449949 systemd-networkd[1517]: cali34d54ab0629: Gained IPv6LL Apr 16 23:28:02.534479 systemd-networkd[1517]: calibd967cb2f65: Link UP Apr 16 23:28:02.535553 systemd-networkd[1517]: calibd967cb2f65: Gained carrier Apr 16 23:28:02.565136 containerd[1656]: 2026-04-16 23:28:02.413 [INFO][4855] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0 calico-kube-controllers-56489995f- calico-system 50b472ca-47d9-4e73-b333-2bd45cc28f36 812 0 2026-04-16 23:27:37 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers 
pod-template-hash:56489995f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-b2725589f5 calico-kube-controllers-56489995f-wm9gf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibd967cb2f65 [] [] }} ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-" Apr 16 23:28:02.565136 containerd[1656]: 2026-04-16 23:28:02.414 [INFO][4855] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.565136 containerd[1656]: 2026-04-16 23:28:02.448 [INFO][4887] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" HandleID="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.469 [INFO][4887] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" HandleID="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000408020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-b2725589f5", 
"pod":"calico-kube-controllers-56489995f-wm9gf", "timestamp":"2026-04-16 23:28:02.44895581 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40004bcdc0)} Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.469 [INFO][4887] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.469 [INFO][4887] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.469 [INFO][4887] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.473 [INFO][4887] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.482 [INFO][4887] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.488 [INFO][4887] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.491 [INFO][4887] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565521 containerd[1656]: 2026-04-16 23:28:02.493 [INFO][4887] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.494 [INFO][4887] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 
handle="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.496 [INFO][4887] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378 Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.504 [INFO][4887] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.525 [INFO][4887] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.114.69/26] block=192.168.114.64/26 handle="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.525 [INFO][4887] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.69/26] handle="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.525 [INFO][4887] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 16 23:28:02.565709 containerd[1656]: 2026-04-16 23:28:02.525 [INFO][4887] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.69/26] IPv6=[] ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" HandleID="k8s-pod-network.7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Workload="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.566925 containerd[1656]: 2026-04-16 23:28:02.531 [INFO][4855] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0", GenerateName:"calico-kube-controllers-56489995f-", Namespace:"calico-system", SelfLink:"", UID:"50b472ca-47d9-4e73-b333-2bd45cc28f36", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56489995f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"calico-kube-controllers-56489995f-wm9gf", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.114.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibd967cb2f65", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:02.566987 containerd[1656]: 2026-04-16 23:28:02.531 [INFO][4855] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.69/32] ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.566987 containerd[1656]: 2026-04-16 23:28:02.531 [INFO][4855] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibd967cb2f65 ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.566987 containerd[1656]: 2026-04-16 23:28:02.537 [INFO][4855] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.567049 containerd[1656]: 2026-04-16 23:28:02.544 [INFO][4855] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" Pod="calico-kube-controllers-56489995f-wm9gf" 
WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0", GenerateName:"calico-kube-controllers-56489995f-", Namespace:"calico-system", SelfLink:"", UID:"50b472ca-47d9-4e73-b333-2bd45cc28f36", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56489995f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378", Pod:"calico-kube-controllers-56489995f-wm9gf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.114.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibd967cb2f65", MAC:"32:72:45:39:8f:d2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:02.567094 containerd[1656]: 2026-04-16 23:28:02.562 [INFO][4855] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" Namespace="calico-system" 
Pod="calico-kube-controllers-56489995f-wm9gf" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-calico--kube--controllers--56489995f--wm9gf-eth0" Apr 16 23:28:02.577917 systemd-networkd[1517]: cali1008942f2a9: Gained IPv6LL Apr 16 23:28:02.596621 containerd[1656]: time="2026-04-16T23:28:02.596574426Z" level=info msg="connecting to shim 7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378" address="unix:///run/containerd/s/887446078bf06cbad779112ed0c4be68e44a860fe7578cc8c96fdb6ca7f9ab11" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:28:02.630894 systemd[1]: Started cri-containerd-7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378.scope - libcontainer container 7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378. Apr 16 23:28:02.640296 systemd-networkd[1517]: calie259fe5cc07: Link UP Apr 16 23:28:02.641673 systemd-networkd[1517]: calie259fe5cc07: Gained carrier Apr 16 23:28:02.663179 containerd[1656]: 2026-04-16 23:28:02.423 [INFO][4867] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0 goldmane-cccfbd5cf- calico-system db7bac57-1d40-4d74-a39c-beae008d27e7 815 0 2026-04-16 23:27:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-b2725589f5 goldmane-cccfbd5cf-lx7nq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie259fe5cc07 [] [] }} ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-" Apr 16 23:28:02.663179 containerd[1656]: 2026-04-16 23:28:02.423 [INFO][4867] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.663179 containerd[1656]: 2026-04-16 23:28:02.473 [INFO][4893] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" HandleID="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Workload="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.486 [INFO][4893] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" HandleID="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Workload="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005127c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"goldmane-cccfbd5cf-lx7nq", "timestamp":"2026-04-16 23:28:02.473642466 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000142c60)} Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.486 [INFO][4893] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.525 [INFO][4893] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.526 [INFO][4893] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.576 [INFO][4893] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.585 [INFO][4893] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.594 [INFO][4893] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.598 [INFO][4893] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663391 containerd[1656]: 2026-04-16 23:28:02.603 [INFO][4893] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.603 [INFO][4893] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.609 [INFO][4893] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.619 [INFO][4893] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.632 [INFO][4893] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.114.70/26] block=192.168.114.64/26 handle="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.632 [INFO][4893] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.70/26] handle="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.632 [INFO][4893] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 23:28:02.663580 containerd[1656]: 2026-04-16 23:28:02.632 [INFO][4893] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.70/26] IPv6=[] ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" HandleID="k8s-pod-network.8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Workload="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.663773 containerd[1656]: 2026-04-16 23:28:02.635 [INFO][4867] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"db7bac57-1d40-4d74-a39c-beae008d27e7", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"goldmane-cccfbd5cf-lx7nq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.114.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie259fe5cc07", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:02.663773 containerd[1656]: 2026-04-16 23:28:02.635 [INFO][4867] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.70/32] ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.663885 containerd[1656]: 2026-04-16 23:28:02.635 [INFO][4867] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie259fe5cc07 ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.663885 containerd[1656]: 2026-04-16 23:28:02.641 [INFO][4867] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.663923 containerd[1656]: 2026-04-16 23:28:02.646 [INFO][4867] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"db7bac57-1d40-4d74-a39c-beae008d27e7", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae", Pod:"goldmane-cccfbd5cf-lx7nq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.114.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie259fe5cc07", MAC:"92:2f:56:b4:1c:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:02.663974 containerd[1656]: 2026-04-16 23:28:02.661 [INFO][4867] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" Namespace="calico-system" Pod="goldmane-cccfbd5cf-lx7nq" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-goldmane--cccfbd5cf--lx7nq-eth0" Apr 16 23:28:02.677562 containerd[1656]: time="2026-04-16T23:28:02.677510970Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:02.680938 containerd[1656]: time="2026-04-16T23:28:02.680905698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56489995f-wm9gf,Uid:50b472ca-47d9-4e73-b333-2bd45cc28f36,Namespace:calico-system,Attempt:0,} returns sandbox id \"7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378\"" Apr 16 23:28:02.681889 containerd[1656]: time="2026-04-16T23:28:02.681863460Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Apr 16 23:28:02.683126 containerd[1656]: time="2026-04-16T23:28:02.683082983Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:02.688692 containerd[1656]: time="2026-04-16T23:28:02.688639995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:02.689343 containerd[1656]: time="2026-04-16T23:28:02.689306077Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 2.099471935s" Apr 16 
23:28:02.689343 containerd[1656]: time="2026-04-16T23:28:02.689340277Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 16 23:28:02.691297 containerd[1656]: time="2026-04-16T23:28:02.691180001Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 16 23:28:02.696192 containerd[1656]: time="2026-04-16T23:28:02.695959692Z" level=info msg="CreateContainer within sandbox \"c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 16 23:28:02.713766 containerd[1656]: time="2026-04-16T23:28:02.713697492Z" level=info msg="connecting to shim 8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae" address="unix:///run/containerd/s/35bcf98ac920f5f858c309bec7c61acf95d8cccfe97a3cd3ff5c1a7b16b53b2d" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:28:02.716619 containerd[1656]: time="2026-04-16T23:28:02.716046418Z" level=info msg="Container 0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:28:02.730284 containerd[1656]: time="2026-04-16T23:28:02.730236810Z" level=info msg="CreateContainer within sandbox \"c51df9574a2e58d4018ee910ac13521d52f036e3ea5b06aeafc6d4cda2d95d03\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9\"" Apr 16 23:28:02.730928 containerd[1656]: time="2026-04-16T23:28:02.730900011Z" level=info msg="StartContainer for \"0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9\"" Apr 16 23:28:02.734606 containerd[1656]: time="2026-04-16T23:28:02.733887098Z" level=info msg="connecting to shim 0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9" 
address="unix:///run/containerd/s/d274bfd223f6d29c402d7ff8260b8a7c70fef4116c100d6fe6dc373941e555e3" protocol=ttrpc version=3 Apr 16 23:28:02.740925 systemd[1]: Started cri-containerd-8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae.scope - libcontainer container 8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae. Apr 16 23:28:02.759942 systemd[1]: Started cri-containerd-0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9.scope - libcontainer container 0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9. Apr 16 23:28:02.791433 containerd[1656]: time="2026-04-16T23:28:02.791389429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-lx7nq,Uid:db7bac57-1d40-4d74-a39c-beae008d27e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae\"" Apr 16 23:28:02.808145 containerd[1656]: time="2026-04-16T23:28:02.808103787Z" level=info msg="StartContainer for \"0b71bbcd10c7ec006cd4d7988c3146df4b80217d93158a37a858c6c557b562f9\" returns successfully" Apr 16 23:28:03.062839 containerd[1656]: time="2026-04-16T23:28:03.062771406Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:03.064121 containerd[1656]: time="2026-04-16T23:28:03.064046049Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 16 23:28:03.067098 containerd[1656]: time="2026-04-16T23:28:03.067036896Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 375.773175ms" Apr 16 23:28:03.067098 
containerd[1656]: time="2026-04-16T23:28:03.067076856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 16 23:28:03.068614 containerd[1656]: time="2026-04-16T23:28:03.068572699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 16 23:28:03.074604 containerd[1656]: time="2026-04-16T23:28:03.074556273Z" level=info msg="CreateContainer within sandbox \"0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 16 23:28:03.084726 containerd[1656]: time="2026-04-16T23:28:03.084680816Z" level=info msg="Container 714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:28:03.094668 containerd[1656]: time="2026-04-16T23:28:03.094611759Z" level=info msg="CreateContainer within sandbox \"0df12dcdca0c37bbbf3ccdbd350e3ec0a7497483cc004ece6ff77b197ea2b16c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315\"" Apr 16 23:28:03.095389 containerd[1656]: time="2026-04-16T23:28:03.095295320Z" level=info msg="StartContainer for \"714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315\"" Apr 16 23:28:03.096767 containerd[1656]: time="2026-04-16T23:28:03.096719483Z" level=info msg="connecting to shim 714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315" address="unix:///run/containerd/s/924024d1d5d8cbeae7e3a5ca7c7fe5ed8681b91f79c23af8929c32078d683760" protocol=ttrpc version=3 Apr 16 23:28:03.115094 systemd[1]: Started cri-containerd-714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315.scope - libcontainer container 714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315. 
Apr 16 23:28:03.150801 containerd[1656]: time="2026-04-16T23:28:03.150758166Z" level=info msg="StartContainer for \"714d832feeefced128d4b545df68f4ee3a6946ef5e20b32abe21d4b28eb8c315\" returns successfully" Apr 16 23:28:03.367623 containerd[1656]: time="2026-04-16T23:28:03.367227339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-pm682,Uid:169c0663-4b7e-43a8-8244-c185c534f7be,Namespace:kube-system,Attempt:0,}" Apr 16 23:28:03.497789 systemd-networkd[1517]: cali42b9ca70b9e: Link UP Apr 16 23:28:03.498867 systemd-networkd[1517]: cali42b9ca70b9e: Gained carrier Apr 16 23:28:03.515642 containerd[1656]: 2026-04-16 23:28:03.410 [INFO][5124] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0 coredns-66bc5c9577- kube-system 169c0663-4b7e-43a8-8244-c185c534f7be 806 0 2026-04-16 23:27:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-b2725589f5 coredns-66bc5c9577-pm682 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali42b9ca70b9e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-" Apr 16 23:28:03.515642 containerd[1656]: 2026-04-16 23:28:03.410 [INFO][5124] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.515642 containerd[1656]: 
2026-04-16 23:28:03.437 [INFO][5138] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" HandleID="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Workload="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.451 [INFO][5138] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" HandleID="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Workload="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400058db10), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"coredns-66bc5c9577-pm682", "timestamp":"2026-04-16 23:28:03.437027857 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000265340)} Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.451 [INFO][5138] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.451 [INFO][5138] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.452 [INFO][5138] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.455 [INFO][5138] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.463 [INFO][5138] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.469 [INFO][5138] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.472 [INFO][5138] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516037 containerd[1656]: 2026-04-16 23:28:03.475 [INFO][5138] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.475 [INFO][5138] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.477 [INFO][5138] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578 Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.482 [INFO][5138] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.490 [INFO][5138] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.114.71/26] block=192.168.114.64/26 handle="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.491 [INFO][5138] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.71/26] handle="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.491 [INFO][5138] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 16 23:28:03.516240 containerd[1656]: 2026-04-16 23:28:03.491 [INFO][5138] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.71/26] IPv6=[] ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" HandleID="k8s-pod-network.b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Workload="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.516365 containerd[1656]: 2026-04-16 23:28:03.494 [INFO][5124] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"169c0663-4b7e-43a8-8244-c185c534f7be", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"coredns-66bc5c9577-pm682", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali42b9ca70b9e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:03.516365 containerd[1656]: 2026-04-16 23:28:03.494 [INFO][5124] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.71/32] ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.516365 containerd[1656]: 2026-04-16 23:28:03.494 [INFO][5124] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali42b9ca70b9e 
ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.516365 containerd[1656]: 2026-04-16 23:28:03.498 [INFO][5124] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.516365 containerd[1656]: 2026-04-16 23:28:03.499 [INFO][5124] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"169c0663-4b7e-43a8-8244-c185c534f7be", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", 
ContainerID:"b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578", Pod:"coredns-66bc5c9577-pm682", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali42b9ca70b9e", MAC:"f6:1d:5f:67:ce:79", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:03.516535 containerd[1656]: 2026-04-16 23:28:03.513 [INFO][5124] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" Namespace="kube-system" Pod="coredns-66bc5c9577-pm682" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--pm682-eth0" Apr 16 23:28:03.548758 containerd[1656]: time="2026-04-16T23:28:03.547253868Z" level=info msg="connecting to shim b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578" address="unix:///run/containerd/s/b94940e1fe7d77a04cedc81d4d6ae74b84dd933fdbf58013cd7ab326a89ea423" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:28:03.574673 kubelet[2901]: I0416 23:28:03.574583 2901 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="calico-system/calico-apiserver-5bf56bfb56-hzvn2" podStartSLOduration=26.47289463 podStartE2EDuration="28.57456617s" podCreationTimestamp="2026-04-16 23:27:35 +0000 UTC" firstStartedPulling="2026-04-16 23:28:00.58921878 +0000 UTC m=+42.326953546" lastFinishedPulling="2026-04-16 23:28:02.69089024 +0000 UTC m=+44.428625086" observedRunningTime="2026-04-16 23:28:03.573838849 +0000 UTC m=+45.311573655" watchObservedRunningTime="2026-04-16 23:28:03.57456617 +0000 UTC m=+45.312300896" Apr 16 23:28:03.607967 systemd[1]: Started cri-containerd-b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578.scope - libcontainer container b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578. Apr 16 23:28:03.649312 containerd[1656]: time="2026-04-16T23:28:03.649241260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-pm682,Uid:169c0663-4b7e-43a8-8244-c185c534f7be,Namespace:kube-system,Attempt:0,} returns sandbox id \"b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578\"" Apr 16 23:28:03.655948 containerd[1656]: time="2026-04-16T23:28:03.655905515Z" level=info msg="CreateContainer within sandbox \"b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 16 23:28:03.668738 containerd[1656]: time="2026-04-16T23:28:03.666910380Z" level=info msg="Container 739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:28:03.676410 containerd[1656]: time="2026-04-16T23:28:03.676364402Z" level=info msg="CreateContainer within sandbox \"b6dce817432d582741d485a3c91dea84be532c63d15831d649a08bc0fed1c578\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62\"" Apr 16 23:28:03.677214 containerd[1656]: time="2026-04-16T23:28:03.677188204Z" level=info msg="StartContainer for 
\"739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62\"" Apr 16 23:28:03.678294 containerd[1656]: time="2026-04-16T23:28:03.678255486Z" level=info msg="connecting to shim 739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62" address="unix:///run/containerd/s/b94940e1fe7d77a04cedc81d4d6ae74b84dd933fdbf58013cd7ab326a89ea423" protocol=ttrpc version=3 Apr 16 23:28:03.699920 systemd[1]: Started cri-containerd-739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62.scope - libcontainer container 739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62. Apr 16 23:28:03.730902 containerd[1656]: time="2026-04-16T23:28:03.730703765Z" level=info msg="StartContainer for \"739df398eb7cd308352b2764880f4faf03622f356c5686b54b7fca8812461f62\" returns successfully" Apr 16 23:28:04.114961 systemd-networkd[1517]: calibd967cb2f65: Gained IPv6LL Apr 16 23:28:04.370993 containerd[1656]: time="2026-04-16T23:28:04.370775581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sl2t6,Uid:828b6488-b286-419f-a0a8-9005759cd92d,Namespace:kube-system,Attempt:0,}" Apr 16 23:28:04.542859 systemd-networkd[1517]: cali3c1157d98b6: Link UP Apr 16 23:28:04.543496 systemd-networkd[1517]: cali3c1157d98b6: Gained carrier Apr 16 23:28:04.562562 kubelet[2901]: I0416 23:28:04.561349 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5bf56bfb56-gl7td" podStartSLOduration=28.069860423 podStartE2EDuration="29.561328335s" podCreationTimestamp="2026-04-16 23:27:35 +0000 UTC" firstStartedPulling="2026-04-16 23:28:01.576952547 +0000 UTC m=+43.314687313" lastFinishedPulling="2026-04-16 23:28:03.068420419 +0000 UTC m=+44.806155225" observedRunningTime="2026-04-16 23:28:03.595096017 +0000 UTC m=+45.332830783" watchObservedRunningTime="2026-04-16 23:28:04.561328335 +0000 UTC m=+46.299063101" Apr 16 23:28:04.562827 systemd-networkd[1517]: cali42b9ca70b9e: Gained IPv6LL Apr 16 23:28:04.563431 
containerd[1656]: 2026-04-16 23:28:04.424 [INFO][5253] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0 coredns-66bc5c9577- kube-system 828b6488-b286-419f-a0a8-9005759cd92d 813 0 2026-04-16 23:27:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-b2725589f5 coredns-66bc5c9577-sl2t6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3c1157d98b6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.424 [INFO][5253] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.463 [INFO][5268] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" HandleID="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Workload="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.479 [INFO][5268] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" 
HandleID="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Workload="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f8890), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-b2725589f5", "pod":"coredns-66bc5c9577-sl2t6", "timestamp":"2026-04-16 23:28:04.463874593 +0000 UTC"}, Hostname:"ci-4459-2-4-n-b2725589f5", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40001702c0)} Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.480 [INFO][5268] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.480 [INFO][5268] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.480 [INFO][5268] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-b2725589f5' Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.487 [INFO][5268] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.494 [INFO][5268] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.501 [INFO][5268] ipam/ipam.go 526: Trying affinity for 192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.504 [INFO][5268] ipam/ipam.go 160: Attempting to load block cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.508 [INFO][5268] 
ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.114.64/26 host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.508 [INFO][5268] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.114.64/26 handle="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.511 [INFO][5268] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.523 [INFO][5268] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.114.64/26 handle="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.536 [INFO][5268] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.114.72/26] block=192.168.114.64/26 handle="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.536 [INFO][5268] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.114.72/26] handle="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" host="ci-4459-2-4-n-b2725589f5" Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.536 [INFO][5268] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 16 23:28:04.563431 containerd[1656]: 2026-04-16 23:28:04.536 [INFO][5268] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.114.72/26] IPv6=[] ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" HandleID="k8s-pod-network.9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Workload="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 23:28:04.565189 containerd[1656]: 2026-04-16 23:28:04.539 [INFO][5253] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"828b6488-b286-419f-a0a8-9005759cd92d", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"", Pod:"coredns-66bc5c9577-sl2t6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali3c1157d98b6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:04.565189 containerd[1656]: 2026-04-16 23:28:04.539 [INFO][5253] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.72/32] ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 23:28:04.565189 containerd[1656]: 2026-04-16 23:28:04.539 [INFO][5253] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3c1157d98b6 ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 23:28:04.565189 containerd[1656]: 2026-04-16 23:28:04.543 [INFO][5253] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 
23:28:04.565189 containerd[1656]: 2026-04-16 23:28:04.544 [INFO][5253] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"828b6488-b286-419f-a0a8-9005759cd92d", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2026, time.April, 16, 23, 27, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-b2725589f5", ContainerID:"9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b", Pod:"coredns-66bc5c9577-sl2t6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3c1157d98b6", MAC:"fe:f0:cd:3c:7c:54", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, 
Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 16 23:28:04.565365 containerd[1656]: 2026-04-16 23:28:04.560 [INFO][5253] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" Namespace="kube-system" Pod="coredns-66bc5c9577-sl2t6" WorkloadEndpoint="ci--4459--2--4--n--b2725589f5-k8s-coredns--66bc5c9577--sl2t6-eth0" Apr 16 23:28:04.573843 kubelet[2901]: I0416 23:28:04.573800 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:28:04.574541 kubelet[2901]: I0416 23:28:04.573820 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:28:04.619368 kubelet[2901]: I0416 23:28:04.619312 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-pm682" podStartSLOduration=40.619277346 podStartE2EDuration="40.619277346s" podCreationTimestamp="2026-04-16 23:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:04.599542421 +0000 UTC m=+46.337277227" watchObservedRunningTime="2026-04-16 23:28:04.619277346 +0000 UTC m=+46.357012112" Apr 16 23:28:04.623523 containerd[1656]: time="2026-04-16T23:28:04.623312276Z" level=info msg="connecting to shim 9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b" 
address="unix:///run/containerd/s/f2af1b17238f4894b4106390921988a4ecbfc154d5de00e5570653accbb96856" namespace=k8s.io protocol=ttrpc version=3 Apr 16 23:28:04.682072 systemd[1]: Started cri-containerd-9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b.scope - libcontainer container 9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b. Apr 16 23:28:04.691445 systemd-networkd[1517]: calie259fe5cc07: Gained IPv6LL Apr 16 23:28:04.734279 containerd[1656]: time="2026-04-16T23:28:04.734232048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sl2t6,Uid:828b6488-b286-419f-a0a8-9005759cd92d,Namespace:kube-system,Attempt:0,} returns sandbox id \"9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b\"" Apr 16 23:28:04.743307 containerd[1656]: time="2026-04-16T23:28:04.743266108Z" level=info msg="CreateContainer within sandbox \"9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 16 23:28:04.758430 containerd[1656]: time="2026-04-16T23:28:04.757974902Z" level=info msg="Container 9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:28:04.771102 containerd[1656]: time="2026-04-16T23:28:04.770815531Z" level=info msg="CreateContainer within sandbox \"9f8c998bd02b67565164b76359c0aaa4700e28fc37c1141f2b2ca3c9b3c96d3b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523\"" Apr 16 23:28:04.771852 containerd[1656]: time="2026-04-16T23:28:04.771778893Z" level=info msg="StartContainer for \"9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523\"" Apr 16 23:28:04.773642 containerd[1656]: time="2026-04-16T23:28:04.773557697Z" level=info msg="connecting to shim 9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523" 
address="unix:///run/containerd/s/f2af1b17238f4894b4106390921988a4ecbfc154d5de00e5570653accbb96856" protocol=ttrpc version=3 Apr 16 23:28:04.798959 systemd[1]: Started cri-containerd-9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523.scope - libcontainer container 9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523. Apr 16 23:28:04.843036 containerd[1656]: time="2026-04-16T23:28:04.842997295Z" level=info msg="StartContainer for \"9111484c6273d61160ae434eab636416868177d4be876c9b4943e31f1bc1f523\" returns successfully" Apr 16 23:28:05.194827 containerd[1656]: time="2026-04-16T23:28:05.194451095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:05.196110 containerd[1656]: time="2026-04-16T23:28:05.196054578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Apr 16 23:28:05.197575 containerd[1656]: time="2026-04-16T23:28:05.197502701Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:05.200476 containerd[1656]: time="2026-04-16T23:28:05.200433108Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:05.201284 containerd[1656]: time="2026-04-16T23:28:05.201150670Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 
2.130400166s" Apr 16 23:28:05.201284 containerd[1656]: time="2026-04-16T23:28:05.201181710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Apr 16 23:28:05.202554 containerd[1656]: time="2026-04-16T23:28:05.202522993Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 16 23:28:05.213441 containerd[1656]: time="2026-04-16T23:28:05.213400378Z" level=info msg="CreateContainer within sandbox \"7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 16 23:28:05.223432 containerd[1656]: time="2026-04-16T23:28:05.223386320Z" level=info msg="Container 1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:28:05.232405 containerd[1656]: time="2026-04-16T23:28:05.232345221Z" level=info msg="CreateContainer within sandbox \"7702fbb024e1bb84230f76e004c7d1328edd807a69f3939e70db1aabbe58e378\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09\"" Apr 16 23:28:05.233157 containerd[1656]: time="2026-04-16T23:28:05.233124542Z" level=info msg="StartContainer for \"1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09\"" Apr 16 23:28:05.234324 containerd[1656]: time="2026-04-16T23:28:05.234289545Z" level=info msg="connecting to shim 1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09" address="unix:///run/containerd/s/887446078bf06cbad779112ed0c4be68e44a860fe7578cc8c96fdb6ca7f9ab11" protocol=ttrpc version=3 Apr 16 23:28:05.250900 systemd[1]: Started cri-containerd-1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09.scope - libcontainer container 1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09. 
Apr 16 23:28:05.295144 containerd[1656]: time="2026-04-16T23:28:05.295043083Z" level=info msg="StartContainer for \"1f782d20998f2ed8c60de6fdaf3ab0671fcdf99126fd03d33a7520c59d9a1f09\" returns successfully" Apr 16 23:28:05.603707 kubelet[2901]: I0416 23:28:05.603502 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-56489995f-wm9gf" podStartSLOduration=26.084452216 podStartE2EDuration="28.603483185s" podCreationTimestamp="2026-04-16 23:27:37 +0000 UTC" firstStartedPulling="2026-04-16 23:28:02.683064063 +0000 UTC m=+44.420798829" lastFinishedPulling="2026-04-16 23:28:05.202095032 +0000 UTC m=+46.939829798" observedRunningTime="2026-04-16 23:28:05.601998901 +0000 UTC m=+47.339733667" watchObservedRunningTime="2026-04-16 23:28:05.603483185 +0000 UTC m=+47.341217991" Apr 16 23:28:05.623789 kubelet[2901]: I0416 23:28:05.623063 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-sl2t6" podStartSLOduration=41.623045429 podStartE2EDuration="41.623045429s" podCreationTimestamp="2026-04-16 23:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:05.621574066 +0000 UTC m=+47.359308832" watchObservedRunningTime="2026-04-16 23:28:05.623045429 +0000 UTC m=+47.360780155" Apr 16 23:28:06.546493 systemd-networkd[1517]: cali3c1157d98b6: Gained IPv6LL Apr 16 23:28:06.907243 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3441634391.mount: Deactivated successfully. 
Apr 16 23:28:07.149441 containerd[1656]: time="2026-04-16T23:28:07.149391661Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:07.154163 containerd[1656]: time="2026-04-16T23:28:07.154101872Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Apr 16 23:28:07.155507 containerd[1656]: time="2026-04-16T23:28:07.154549193Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:07.157605 containerd[1656]: time="2026-04-16T23:28:07.157368719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 16 23:28:07.158066 containerd[1656]: time="2026-04-16T23:28:07.158009640Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 1.955449967s" Apr 16 23:28:07.158066 containerd[1656]: time="2026-04-16T23:28:07.158050040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Apr 16 23:28:07.162793 containerd[1656]: time="2026-04-16T23:28:07.162757571Z" level=info msg="CreateContainer within sandbox \"8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 16 23:28:07.171317 containerd[1656]: time="2026-04-16T23:28:07.171268591Z" 
level=info msg="Container cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:28:07.181971 containerd[1656]: time="2026-04-16T23:28:07.181905495Z" level=info msg="CreateContainer within sandbox \"8c5fb72f0d0a7d05cb30af30887f83aeb28f1caa6e682bd9c121c86bcc85ddae\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf\"" Apr 16 23:28:07.182941 containerd[1656]: time="2026-04-16T23:28:07.182906057Z" level=info msg="StartContainer for \"cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf\"" Apr 16 23:28:07.184528 containerd[1656]: time="2026-04-16T23:28:07.184482661Z" level=info msg="connecting to shim cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf" address="unix:///run/containerd/s/35bcf98ac920f5f858c309bec7c61acf95d8cccfe97a3cd3ff5c1a7b16b53b2d" protocol=ttrpc version=3 Apr 16 23:28:07.205927 systemd[1]: Started cri-containerd-cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf.scope - libcontainer container cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf. 
Apr 16 23:28:07.242279 containerd[1656]: time="2026-04-16T23:28:07.242233352Z" level=info msg="StartContainer for \"cd29fb55c0cb71f1d7f2f27f8a10e17d17aa878170bd4e73108f18277b410faf\" returns successfully" Apr 16 23:28:36.233124 kubelet[2901]: I0416 23:28:36.232867 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:28:36.259574 kubelet[2901]: I0416 23:28:36.258174 2901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-lx7nq" podStartSLOduration=56.893070097 podStartE2EDuration="1m1.258159945s" podCreationTimestamp="2026-04-16 23:27:35 +0000 UTC" firstStartedPulling="2026-04-16 23:28:02.793774594 +0000 UTC m=+44.531509360" lastFinishedPulling="2026-04-16 23:28:07.158864442 +0000 UTC m=+48.896599208" observedRunningTime="2026-04-16 23:28:07.607055262 +0000 UTC m=+49.344790068" watchObservedRunningTime="2026-04-16 23:28:36.258159945 +0000 UTC m=+77.995894711" Apr 16 23:28:38.403187 kubelet[2901]: I0416 23:28:38.403087 2901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:30:24.169865 systemd[1]: Started sshd@7-10.0.3.226:22-50.85.169.122:56464.service - OpenSSH per-connection server daemon (50.85.169.122:56464). Apr 16 23:30:24.288810 sshd[6095]: Accepted publickey for core from 50.85.169.122 port 56464 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:24.291373 sshd-session[6095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:24.301125 systemd-logind[1635]: New session 8 of user core. Apr 16 23:30:24.314369 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 16 23:30:24.429717 sshd[6104]: Connection closed by 50.85.169.122 port 56464 Apr 16 23:30:24.430452 sshd-session[6095]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:24.434506 systemd[1]: sshd@7-10.0.3.226:22-50.85.169.122:56464.service: Deactivated successfully. 
Apr 16 23:30:24.436478 systemd[1]: session-8.scope: Deactivated successfully. Apr 16 23:30:24.437283 systemd-logind[1635]: Session 8 logged out. Waiting for processes to exit. Apr 16 23:30:24.438889 systemd-logind[1635]: Removed session 8. Apr 16 23:30:29.455461 systemd[1]: Started sshd@8-10.0.3.226:22-50.85.169.122:40654.service - OpenSSH per-connection server daemon (50.85.169.122:40654). Apr 16 23:30:29.561801 sshd[6155]: Accepted publickey for core from 50.85.169.122 port 40654 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:29.562818 sshd-session[6155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:29.567649 systemd-logind[1635]: New session 9 of user core. Apr 16 23:30:29.579053 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 16 23:30:29.674331 sshd[6158]: Connection closed by 50.85.169.122 port 40654 Apr 16 23:30:29.673425 sshd-session[6155]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:29.677101 systemd[1]: sshd@8-10.0.3.226:22-50.85.169.122:40654.service: Deactivated successfully. Apr 16 23:30:29.679154 systemd[1]: session-9.scope: Deactivated successfully. Apr 16 23:30:29.680771 systemd-logind[1635]: Session 9 logged out. Waiting for processes to exit. Apr 16 23:30:29.682173 systemd-logind[1635]: Removed session 9. Apr 16 23:30:34.699318 systemd[1]: Started sshd@9-10.0.3.226:22-50.85.169.122:40656.service - OpenSSH per-connection server daemon (50.85.169.122:40656). Apr 16 23:30:34.807313 sshd[6182]: Accepted publickey for core from 50.85.169.122 port 40656 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:34.808718 sshd-session[6182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:34.812846 systemd-logind[1635]: New session 10 of user core. Apr 16 23:30:34.822936 systemd[1]: Started session-10.scope - Session 10 of User core. 
Apr 16 23:30:34.918665 sshd[6185]: Connection closed by 50.85.169.122 port 40656 Apr 16 23:30:34.919069 sshd-session[6182]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:34.923260 systemd[1]: sshd@9-10.0.3.226:22-50.85.169.122:40656.service: Deactivated successfully. Apr 16 23:30:34.926169 systemd[1]: session-10.scope: Deactivated successfully. Apr 16 23:30:34.927975 systemd-logind[1635]: Session 10 logged out. Waiting for processes to exit. Apr 16 23:30:34.929135 systemd-logind[1635]: Removed session 10. Apr 16 23:30:39.947725 systemd[1]: Started sshd@10-10.0.3.226:22-50.85.169.122:54726.service - OpenSSH per-connection server daemon (50.85.169.122:54726). Apr 16 23:30:40.056976 sshd[6247]: Accepted publickey for core from 50.85.169.122 port 54726 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:40.058344 sshd-session[6247]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:40.063825 systemd-logind[1635]: New session 11 of user core. Apr 16 23:30:40.069902 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 16 23:30:40.165522 sshd[6250]: Connection closed by 50.85.169.122 port 54726 Apr 16 23:30:40.165889 sshd-session[6247]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:40.169879 systemd[1]: sshd@10-10.0.3.226:22-50.85.169.122:54726.service: Deactivated successfully. Apr 16 23:30:40.171899 systemd[1]: session-11.scope: Deactivated successfully. Apr 16 23:30:40.173390 systemd-logind[1635]: Session 11 logged out. Waiting for processes to exit. Apr 16 23:30:40.175577 systemd-logind[1635]: Removed session 11. Apr 16 23:30:45.191646 systemd[1]: Started sshd@11-10.0.3.226:22-50.85.169.122:54730.service - OpenSSH per-connection server daemon (50.85.169.122:54730). 
Apr 16 23:30:45.303274 sshd[6287]: Accepted publickey for core from 50.85.169.122 port 54730 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:45.304635 sshd-session[6287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:45.308721 systemd-logind[1635]: New session 12 of user core. Apr 16 23:30:45.317905 systemd[1]: Started session-12.scope - Session 12 of User core. Apr 16 23:30:45.416289 sshd[6290]: Connection closed by 50.85.169.122 port 54730 Apr 16 23:30:45.416946 sshd-session[6287]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:45.420542 systemd[1]: sshd@11-10.0.3.226:22-50.85.169.122:54730.service: Deactivated successfully. Apr 16 23:30:45.423446 systemd[1]: session-12.scope: Deactivated successfully. Apr 16 23:30:45.424149 systemd-logind[1635]: Session 12 logged out. Waiting for processes to exit. Apr 16 23:30:45.425424 systemd-logind[1635]: Removed session 12. Apr 16 23:30:45.443501 systemd[1]: Started sshd@12-10.0.3.226:22-50.85.169.122:54746.service - OpenSSH per-connection server daemon (50.85.169.122:54746). Apr 16 23:30:45.545703 sshd[6305]: Accepted publickey for core from 50.85.169.122 port 54746 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:45.546891 sshd-session[6305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:45.552119 systemd-logind[1635]: New session 13 of user core. Apr 16 23:30:45.561907 systemd[1]: Started session-13.scope - Session 13 of User core. Apr 16 23:30:45.689495 sshd[6308]: Connection closed by 50.85.169.122 port 54746 Apr 16 23:30:45.689238 sshd-session[6305]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:45.693355 systemd[1]: sshd@12-10.0.3.226:22-50.85.169.122:54746.service: Deactivated successfully. Apr 16 23:30:45.698847 systemd[1]: session-13.scope: Deactivated successfully. Apr 16 23:30:45.700995 systemd-logind[1635]: Session 13 logged out. 
Waiting for processes to exit. Apr 16 23:30:45.717439 systemd[1]: Started sshd@13-10.0.3.226:22-50.85.169.122:54760.service - OpenSSH per-connection server daemon (50.85.169.122:54760). Apr 16 23:30:45.718491 systemd-logind[1635]: Removed session 13. Apr 16 23:30:45.828664 sshd[6320]: Accepted publickey for core from 50.85.169.122 port 54760 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:45.830831 sshd-session[6320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:45.837273 systemd-logind[1635]: New session 14 of user core. Apr 16 23:30:45.844932 systemd[1]: Started session-14.scope - Session 14 of User core. Apr 16 23:30:45.947192 sshd[6323]: Connection closed by 50.85.169.122 port 54760 Apr 16 23:30:45.947589 sshd-session[6320]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:45.951350 systemd[1]: sshd@13-10.0.3.226:22-50.85.169.122:54760.service: Deactivated successfully. Apr 16 23:30:45.953344 systemd[1]: session-14.scope: Deactivated successfully. Apr 16 23:30:45.954048 systemd-logind[1635]: Session 14 logged out. Waiting for processes to exit. Apr 16 23:30:45.955123 systemd-logind[1635]: Removed session 14. Apr 16 23:30:50.972343 systemd[1]: Started sshd@14-10.0.3.226:22-50.85.169.122:35214.service - OpenSSH per-connection server daemon (50.85.169.122:35214). Apr 16 23:30:51.083170 sshd[6360]: Accepted publickey for core from 50.85.169.122 port 35214 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:51.084539 sshd-session[6360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:51.089309 systemd-logind[1635]: New session 15 of user core. Apr 16 23:30:51.102137 systemd[1]: Started session-15.scope - Session 15 of User core. 
Apr 16 23:30:51.197128 sshd[6363]: Connection closed by 50.85.169.122 port 35214 Apr 16 23:30:51.197909 sshd-session[6360]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:51.201583 systemd[1]: sshd@14-10.0.3.226:22-50.85.169.122:35214.service: Deactivated successfully. Apr 16 23:30:51.203720 systemd[1]: session-15.scope: Deactivated successfully. Apr 16 23:30:51.204571 systemd-logind[1635]: Session 15 logged out. Waiting for processes to exit. Apr 16 23:30:51.205771 systemd-logind[1635]: Removed session 15. Apr 16 23:30:51.226475 systemd[1]: Started sshd@15-10.0.3.226:22-50.85.169.122:35228.service - OpenSSH per-connection server daemon (50.85.169.122:35228). Apr 16 23:30:51.336223 sshd[6377]: Accepted publickey for core from 50.85.169.122 port 35228 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:51.338016 sshd-session[6377]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:51.346613 systemd-logind[1635]: New session 16 of user core. Apr 16 23:30:51.354132 systemd[1]: Started session-16.scope - Session 16 of User core. Apr 16 23:30:51.495982 sshd[6380]: Connection closed by 50.85.169.122 port 35228 Apr 16 23:30:51.496327 sshd-session[6377]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:51.500660 systemd[1]: sshd@15-10.0.3.226:22-50.85.169.122:35228.service: Deactivated successfully. Apr 16 23:30:51.502400 systemd[1]: session-16.scope: Deactivated successfully. Apr 16 23:30:51.503196 systemd-logind[1635]: Session 16 logged out. Waiting for processes to exit. Apr 16 23:30:51.504182 systemd-logind[1635]: Removed session 16. Apr 16 23:30:51.523977 systemd[1]: Started sshd@16-10.0.3.226:22-50.85.169.122:35238.service - OpenSSH per-connection server daemon (50.85.169.122:35238). 
Apr 16 23:30:51.635786 sshd[6391]: Accepted publickey for core from 50.85.169.122 port 35238 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:51.636624 sshd-session[6391]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:51.640998 systemd-logind[1635]: New session 17 of user core. Apr 16 23:30:51.650904 systemd[1]: Started session-17.scope - Session 17 of User core. Apr 16 23:30:52.277202 sshd[6394]: Connection closed by 50.85.169.122 port 35238 Apr 16 23:30:52.277928 sshd-session[6391]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:52.284720 systemd[1]: sshd@16-10.0.3.226:22-50.85.169.122:35238.service: Deactivated successfully. Apr 16 23:30:52.288654 systemd[1]: session-17.scope: Deactivated successfully. Apr 16 23:30:52.291131 systemd-logind[1635]: Session 17 logged out. Waiting for processes to exit. Apr 16 23:30:52.303320 systemd[1]: Started sshd@17-10.0.3.226:22-50.85.169.122:35240.service - OpenSSH per-connection server daemon (50.85.169.122:35240). Apr 16 23:30:52.305427 systemd-logind[1635]: Removed session 17. Apr 16 23:30:52.412898 sshd[6419]: Accepted publickey for core from 50.85.169.122 port 35240 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:52.414219 sshd-session[6419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:52.418412 systemd-logind[1635]: New session 18 of user core. Apr 16 23:30:52.424889 systemd[1]: Started session-18.scope - Session 18 of User core. Apr 16 23:30:52.622139 sshd[6422]: Connection closed by 50.85.169.122 port 35240 Apr 16 23:30:52.622905 sshd-session[6419]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:52.626621 systemd[1]: sshd@17-10.0.3.226:22-50.85.169.122:35240.service: Deactivated successfully. Apr 16 23:30:52.628544 systemd[1]: session-18.scope: Deactivated successfully. Apr 16 23:30:52.631045 systemd-logind[1635]: Session 18 logged out. 
Waiting for processes to exit. Apr 16 23:30:52.632527 systemd-logind[1635]: Removed session 18. Apr 16 23:30:52.646893 systemd[1]: Started sshd@18-10.0.3.226:22-50.85.169.122:35244.service - OpenSSH per-connection server daemon (50.85.169.122:35244). Apr 16 23:30:52.759860 sshd[6433]: Accepted publickey for core from 50.85.169.122 port 35244 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:52.761341 sshd-session[6433]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:52.765418 systemd-logind[1635]: New session 19 of user core. Apr 16 23:30:52.777261 systemd[1]: Started session-19.scope - Session 19 of User core. Apr 16 23:30:52.871871 sshd[6436]: Connection closed by 50.85.169.122 port 35244 Apr 16 23:30:52.872706 sshd-session[6433]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:52.876334 systemd[1]: sshd@18-10.0.3.226:22-50.85.169.122:35244.service: Deactivated successfully. Apr 16 23:30:52.878075 systemd[1]: session-19.scope: Deactivated successfully. Apr 16 23:30:52.880220 systemd-logind[1635]: Session 19 logged out. Waiting for processes to exit. Apr 16 23:30:52.881376 systemd-logind[1635]: Removed session 19. Apr 16 23:30:57.898862 systemd[1]: Started sshd@19-10.0.3.226:22-50.85.169.122:35246.service - OpenSSH per-connection server daemon (50.85.169.122:35246). Apr 16 23:30:58.007829 sshd[6484]: Accepted publickey for core from 50.85.169.122 port 35246 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:30:58.009142 sshd-session[6484]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:30:58.013243 systemd-logind[1635]: New session 20 of user core. Apr 16 23:30:58.020902 systemd[1]: Started session-20.scope - Session 20 of User core. 
Apr 16 23:30:58.116109 sshd[6487]: Connection closed by 50.85.169.122 port 35246 Apr 16 23:30:58.116621 sshd-session[6484]: pam_unix(sshd:session): session closed for user core Apr 16 23:30:58.120276 systemd[1]: sshd@19-10.0.3.226:22-50.85.169.122:35246.service: Deactivated successfully. Apr 16 23:30:58.121957 systemd[1]: session-20.scope: Deactivated successfully. Apr 16 23:30:58.122738 systemd-logind[1635]: Session 20 logged out. Waiting for processes to exit. Apr 16 23:30:58.124392 systemd-logind[1635]: Removed session 20. Apr 16 23:31:03.142480 systemd[1]: Started sshd@20-10.0.3.226:22-50.85.169.122:41902.service - OpenSSH per-connection server daemon (50.85.169.122:41902). Apr 16 23:31:03.253660 sshd[6529]: Accepted publickey for core from 50.85.169.122 port 41902 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:31:03.255121 sshd-session[6529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:31:03.259154 systemd-logind[1635]: New session 21 of user core. Apr 16 23:31:03.269894 systemd[1]: Started session-21.scope - Session 21 of User core. Apr 16 23:31:03.363617 sshd[6532]: Connection closed by 50.85.169.122 port 41902 Apr 16 23:31:03.364401 sshd-session[6529]: pam_unix(sshd:session): session closed for user core Apr 16 23:31:03.367893 systemd[1]: sshd@20-10.0.3.226:22-50.85.169.122:41902.service: Deactivated successfully. Apr 16 23:31:03.370039 systemd[1]: session-21.scope: Deactivated successfully. Apr 16 23:31:03.371104 systemd-logind[1635]: Session 21 logged out. Waiting for processes to exit. Apr 16 23:31:03.372684 systemd-logind[1635]: Removed session 21. Apr 16 23:31:08.391905 systemd[1]: Started sshd@21-10.0.3.226:22-50.85.169.122:41904.service - OpenSSH per-connection server daemon (50.85.169.122:41904). 
Apr 16 23:31:08.503564 sshd[6568]: Accepted publickey for core from 50.85.169.122 port 41904 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:31:08.504812 sshd-session[6568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:31:08.509539 systemd-logind[1635]: New session 22 of user core. Apr 16 23:31:08.516893 systemd[1]: Started session-22.scope - Session 22 of User core. Apr 16 23:31:08.612420 sshd[6571]: Connection closed by 50.85.169.122 port 41904 Apr 16 23:31:08.612930 sshd-session[6568]: pam_unix(sshd:session): session closed for user core Apr 16 23:31:08.616509 systemd[1]: sshd@21-10.0.3.226:22-50.85.169.122:41904.service: Deactivated successfully. Apr 16 23:31:08.618186 systemd[1]: session-22.scope: Deactivated successfully. Apr 16 23:31:08.619648 systemd-logind[1635]: Session 22 logged out. Waiting for processes to exit. Apr 16 23:31:08.621364 systemd-logind[1635]: Removed session 22. Apr 16 23:31:13.638294 systemd[1]: Started sshd@22-10.0.3.226:22-50.85.169.122:48782.service - OpenSSH per-connection server daemon (50.85.169.122:48782). Apr 16 23:31:13.752787 sshd[6609]: Accepted publickey for core from 50.85.169.122 port 48782 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:31:13.754061 sshd-session[6609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:31:13.761225 systemd-logind[1635]: New session 23 of user core. Apr 16 23:31:13.766882 systemd[1]: Started session-23.scope - Session 23 of User core. Apr 16 23:31:13.861657 sshd[6615]: Connection closed by 50.85.169.122 port 48782 Apr 16 23:31:13.862227 sshd-session[6609]: pam_unix(sshd:session): session closed for user core Apr 16 23:31:13.866302 systemd[1]: sshd@22-10.0.3.226:22-50.85.169.122:48782.service: Deactivated successfully. Apr 16 23:31:13.868007 systemd[1]: session-23.scope: Deactivated successfully. Apr 16 23:31:13.869949 systemd-logind[1635]: Session 23 logged out. 
Waiting for processes to exit. Apr 16 23:31:13.870716 systemd-logind[1635]: Removed session 23. Apr 16 23:31:18.887554 systemd[1]: Started sshd@23-10.0.3.226:22-50.85.169.122:48794.service - OpenSSH per-connection server daemon (50.85.169.122:48794). Apr 16 23:31:18.993908 sshd[6631]: Accepted publickey for core from 50.85.169.122 port 48794 ssh2: RSA SHA256:CwtdB64hxNxm9zZHfz0IxEQf83Y8sHuOR4DDhC2oQfg Apr 16 23:31:18.995090 sshd-session[6631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 16 23:31:18.999954 systemd-logind[1635]: New session 24 of user core. Apr 16 23:31:19.009886 systemd[1]: Started session-24.scope - Session 24 of User core. Apr 16 23:31:19.106926 sshd[6634]: Connection closed by 50.85.169.122 port 48794 Apr 16 23:31:19.107848 sshd-session[6631]: pam_unix(sshd:session): session closed for user core Apr 16 23:31:19.111577 systemd-logind[1635]: Session 24 logged out. Waiting for processes to exit. Apr 16 23:31:19.111971 systemd[1]: sshd@23-10.0.3.226:22-50.85.169.122:48794.service: Deactivated successfully. Apr 16 23:31:19.113943 systemd[1]: session-24.scope: Deactivated successfully. Apr 16 23:31:19.115529 systemd-logind[1635]: Removed session 24. Apr 16 23:31:57.325866 update_engine[1637]: I20260416 23:31:57.324851 1637 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Apr 16 23:31:57.325866 update_engine[1637]: I20260416 23:31:57.324944 1637 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Apr 16 23:31:57.325866 update_engine[1637]: I20260416 23:31:57.325330 1637 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Apr 16 23:31:57.325866 update_engine[1637]: I20260416 23:31:57.325708 1637 omaha_request_params.cc:62] Current group set to stable Apr 16 23:31:57.326595 update_engine[1637]: I20260416 23:31:57.326568 1637 update_attempter.cc:499] Already updated boot flags. Skipping. 
Apr 16 23:31:57.326661 update_engine[1637]: I20260416 23:31:57.326646 1637 update_attempter.cc:643] Scheduling an action processor start. Apr 16 23:31:57.326747 update_engine[1637]: I20260416 23:31:57.326717 1637 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Apr 16 23:31:57.326840 update_engine[1637]: I20260416 23:31:57.326824 1637 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Apr 16 23:31:57.326945 locksmithd[1683]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Apr 16 23:31:57.327161 update_engine[1637]: I20260416 23:31:57.327137 1637 omaha_request_action.cc:271] Posting an Omaha request to disabled Apr 16 23:31:57.327214 update_engine[1637]: I20260416 23:31:57.327199 1637 omaha_request_action.cc:272] Request: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.327214 update_engine[1637]: Apr 16 23:31:57.328756 update_engine[1637]: I20260416 23:31:57.327381 1637 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 16 23:31:57.329104 update_engine[1637]: I20260416 23:31:57.329072 1637 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 16 23:31:57.329903 update_engine[1637]: I20260416 23:31:57.329872 1637 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Apr 16 23:31:57.337649 update_engine[1637]: E20260416 23:31:57.337599 1637 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 16 23:31:57.337723 update_engine[1637]: I20260416 23:31:57.337682 1637 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Apr 16 23:32:07.306054 update_engine[1637]: I20260416 23:32:07.305927 1637 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 16 23:32:07.306407 update_engine[1637]: I20260416 23:32:07.306084 1637 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 16 23:32:07.306803 update_engine[1637]: I20260416 23:32:07.306777 1637 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 16 23:32:07.312432 update_engine[1637]: E20260416 23:32:07.312387 1637 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 16 23:32:07.312486 update_engine[1637]: I20260416 23:32:07.312476 1637 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Apr 16 23:32:13.599989 systemd[1]: cri-containerd-9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2.scope: Deactivated successfully. Apr 16 23:32:13.600789 containerd[1656]: time="2026-04-16T23:32:13.600618663Z" level=info msg="received container exit event container_id:\"9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2\" id:\"9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2\" pid:2757 exit_status:1 exited_at:{seconds:1776382333 nanos:600186462}" Apr 16 23:32:13.600971 systemd[1]: cri-containerd-9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2.scope: Consumed 3.919s CPU time, 63.6M memory peak. Apr 16 23:32:13.623650 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2-rootfs.mount: Deactivated successfully. 
Apr 16 23:32:13.625586 containerd[1656]: time="2026-04-16T23:32:13.625553999Z" level=info msg="received container exit event container_id:\"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\" id:\"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\" pid:3234 exit_status:1 exited_at:{seconds:1776382333 nanos:624913718}" Apr 16 23:32:13.625782 systemd[1]: cri-containerd-90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20.scope: Deactivated successfully. Apr 16 23:32:13.626070 systemd[1]: cri-containerd-90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20.scope: Consumed 15.857s CPU time, 103.8M memory peak. Apr 16 23:32:13.645845 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20-rootfs.mount: Deactivated successfully. Apr 16 23:32:13.845004 kubelet[2901]: E0416 23:32:13.844961 2901 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.226:57000->10.0.3.229:2379: read: connection timed out" Apr 16 23:32:14.128872 containerd[1656]: time="2026-04-16T23:32:14.128772864Z" level=warning msg="container event discarded" container=abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5 type=CONTAINER_CREATED_EVENT Apr 16 23:32:14.140170 containerd[1656]: time="2026-04-16T23:32:14.140072290Z" level=warning msg="container event discarded" container=abdda18253b9138e03951e79c780d1d9b26212e9471f7803fcb867910d3d72c5 type=CONTAINER_STARTED_EVENT Apr 16 23:32:14.162384 containerd[1656]: time="2026-04-16T23:32:14.162345420Z" level=warning msg="container event discarded" container=408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11 type=CONTAINER_CREATED_EVENT Apr 16 23:32:14.162384 containerd[1656]: time="2026-04-16T23:32:14.162371580Z" level=warning msg="container event discarded" container=408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11 
type=CONTAINER_STARTED_EVENT Apr 16 23:32:14.162384 containerd[1656]: time="2026-04-16T23:32:14.162381740Z" level=warning msg="container event discarded" container=018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391 type=CONTAINER_CREATED_EVENT Apr 16 23:32:14.162384 containerd[1656]: time="2026-04-16T23:32:14.162389300Z" level=warning msg="container event discarded" container=018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391 type=CONTAINER_STARTED_EVENT Apr 16 23:32:14.177604 containerd[1656]: time="2026-04-16T23:32:14.177577975Z" level=warning msg="container event discarded" container=21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a type=CONTAINER_CREATED_EVENT Apr 16 23:32:14.181715 kubelet[2901]: I0416 23:32:14.181616 2901 scope.go:117] "RemoveContainer" containerID="90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20" Apr 16 23:32:14.183967 containerd[1656]: time="2026-04-16T23:32:14.183643549Z" level=info msg="CreateContainer within sandbox \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Apr 16 23:32:14.184052 kubelet[2901]: I0416 23:32:14.183968 2901 scope.go:117] "RemoveContainer" containerID="9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2" Apr 16 23:32:14.185774 containerd[1656]: time="2026-04-16T23:32:14.185722074Z" level=info msg="CreateContainer within sandbox \"018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Apr 16 23:32:14.196361 containerd[1656]: time="2026-04-16T23:32:14.196292738Z" level=warning msg="container event discarded" container=c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51 type=CONTAINER_CREATED_EVENT Apr 16 23:32:14.196361 containerd[1656]: time="2026-04-16T23:32:14.196354858Z" level=warning msg="container event discarded" 
container=9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2 type=CONTAINER_CREATED_EVENT Apr 16 23:32:14.197131 containerd[1656]: time="2026-04-16T23:32:14.197103499Z" level=info msg="Container 2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:32:14.202833 containerd[1656]: time="2026-04-16T23:32:14.202768952Z" level=info msg="Container ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:32:14.206616 containerd[1656]: time="2026-04-16T23:32:14.206561041Z" level=info msg="CreateContainer within sandbox \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\"" Apr 16 23:32:14.207334 containerd[1656]: time="2026-04-16T23:32:14.207300923Z" level=info msg="StartContainer for \"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\"" Apr 16 23:32:14.208147 containerd[1656]: time="2026-04-16T23:32:14.208112244Z" level=info msg="connecting to shim 2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4" address="unix:///run/containerd/s/dde29e35f05d509134bcad93245f417c05a27854bad4c991013c05cf5bb9f722" protocol=ttrpc version=3 Apr 16 23:32:14.214458 containerd[1656]: time="2026-04-16T23:32:14.214394899Z" level=info msg="CreateContainer within sandbox \"018b165a78b4a0c07e32a0e211b57a20a3d8931033277781a528d4b1c8839391\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e\"" Apr 16 23:32:14.215028 containerd[1656]: time="2026-04-16T23:32:14.215002580Z" level=info msg="StartContainer for \"ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e\"" Apr 16 23:32:14.216437 containerd[1656]: time="2026-04-16T23:32:14.216371823Z" level=info 
msg="connecting to shim ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e" address="unix:///run/containerd/s/9001d3af19fab49d13b013a1164cf6c44da1ea6a8d728d37b979197742b5bdbf" protocol=ttrpc version=3 Apr 16 23:32:14.226887 systemd[1]: Started cri-containerd-2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4.scope - libcontainer container 2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4. Apr 16 23:32:14.230453 systemd[1]: Started cri-containerd-ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e.scope - libcontainer container ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e. Apr 16 23:32:14.246814 containerd[1656]: time="2026-04-16T23:32:14.246557572Z" level=warning msg="container event discarded" container=21406c0574fde85e829b01f388cf2c8d020c3df369d6d02c71cd27a2558d1b1a type=CONTAINER_STARTED_EVENT Apr 16 23:32:14.260657 containerd[1656]: time="2026-04-16T23:32:14.260615564Z" level=info msg="StartContainer for \"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\" returns successfully" Apr 16 23:32:14.275158 containerd[1656]: time="2026-04-16T23:32:14.275089237Z" level=warning msg="container event discarded" container=c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51 type=CONTAINER_STARTED_EVENT Apr 16 23:32:14.275158 containerd[1656]: time="2026-04-16T23:32:14.275132397Z" level=warning msg="container event discarded" container=9cd55e1f16d994885c004d62bbbaf851741b68316fd75541ff8e65c1969a14d2 type=CONTAINER_STARTED_EVENT Apr 16 23:32:14.279897 containerd[1656]: time="2026-04-16T23:32:14.279755367Z" level=info msg="StartContainer for \"ee7131b5b6cde29b9f20cb609d25077fae4884ad919edc1adab690b4ada52c4e\" returns successfully" Apr 16 23:32:17.301935 update_engine[1637]: I20260416 23:32:17.301807 1637 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 16 23:32:17.302521 update_engine[1637]: I20260416 23:32:17.301952 1637 libcurl_http_fetcher.cc:151] Setting up 
curl options for HTTP Apr 16 23:32:17.302547 update_engine[1637]: I20260416 23:32:17.302516 1637 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 16 23:32:17.307213 update_engine[1637]: E20260416 23:32:17.307137 1637 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 16 23:32:17.307343 update_engine[1637]: I20260416 23:32:17.307305 1637 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Apr 16 23:32:17.411592 systemd[1]: cri-containerd-2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4.scope: Deactivated successfully. Apr 16 23:32:17.412711 containerd[1656]: time="2026-04-16T23:32:17.412216012Z" level=info msg="received container exit event container_id:\"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\" id:\"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\" pid:6893 exit_status:1 exited_at:{seconds:1776382337 nanos:411998811}" Apr 16 23:32:17.430757 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4-rootfs.mount: Deactivated successfully. 
Apr 16 23:32:18.199567 kubelet[2901]: I0416 23:32:18.199526 2901 scope.go:117] "RemoveContainer" containerID="90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20" Apr 16 23:32:18.199997 kubelet[2901]: I0416 23:32:18.199748 2901 scope.go:117] "RemoveContainer" containerID="2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4" Apr 16 23:32:18.199997 kubelet[2901]: E0416 23:32:18.199915 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-5588576f44-d5vm6_tigera-operator(867ecad7-0cd8-46ca-b446-c47691f2f99d)\"" pod="tigera-operator/tigera-operator-5588576f44-d5vm6" podUID="867ecad7-0cd8-46ca-b446-c47691f2f99d" Apr 16 23:32:18.201327 containerd[1656]: time="2026-04-16T23:32:18.201287526Z" level=info msg="RemoveContainer for \"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\"" Apr 16 23:32:18.207965 containerd[1656]: time="2026-04-16T23:32:18.207922861Z" level=info msg="RemoveContainer for \"90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20\" returns successfully" Apr 16 23:32:19.086987 systemd[1]: cri-containerd-c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51.scope: Deactivated successfully. Apr 16 23:32:19.087283 systemd[1]: cri-containerd-c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51.scope: Consumed 3.234s CPU time, 25M memory peak. 
Apr 16 23:32:19.088627 containerd[1656]: time="2026-04-16T23:32:19.088586984Z" level=info msg="received container exit event container_id:\"c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51\" id:\"c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51\" pid:2750 exit_status:1 exited_at:{seconds:1776382339 nanos:88253504}" Apr 16 23:32:19.108699 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51-rootfs.mount: Deactivated successfully. Apr 16 23:32:19.205611 kubelet[2901]: I0416 23:32:19.205583 2901 scope.go:117] "RemoveContainer" containerID="c17bdcb83b207210961bf4922788cfe3c93ce6a70d684494baebf8c5137d1c51" Apr 16 23:32:19.207790 containerd[1656]: time="2026-04-16T23:32:19.207749975Z" level=info msg="CreateContainer within sandbox \"408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Apr 16 23:32:19.219762 containerd[1656]: time="2026-04-16T23:32:19.219513122Z" level=info msg="Container b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:32:19.228997 containerd[1656]: time="2026-04-16T23:32:19.228945104Z" level=info msg="CreateContainer within sandbox \"408b0ffba90698793147d628c9990186fdf5ebeafc66ce5823b513b83aee4a11\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44\"" Apr 16 23:32:19.229584 containerd[1656]: time="2026-04-16T23:32:19.229538465Z" level=info msg="StartContainer for \"b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44\"" Apr 16 23:32:19.230876 containerd[1656]: time="2026-04-16T23:32:19.230837108Z" level=info msg="connecting to shim b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44" address="unix:///run/containerd/s/d5b4dcdf25a0938c705e92b16d6bfdefa0c82b9feaabeed974a850c7a2df19d0" 
protocol=ttrpc version=3 Apr 16 23:32:19.247896 systemd[1]: Started cri-containerd-b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44.scope - libcontainer container b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44. Apr 16 23:32:19.284198 containerd[1656]: time="2026-04-16T23:32:19.284158069Z" level=info msg="StartContainer for \"b6bd70aeb00ced93e5eb84f08566584d093ab43abc2543147ebd05a080197f44\" returns successfully" Apr 16 23:32:23.845512 kubelet[2901]: E0416 23:32:23.845416 2901 controller.go:195] "Failed to update lease" err="Put \"https://10.0.3.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-b2725589f5?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 16 23:32:24.949821 containerd[1656]: time="2026-04-16T23:32:24.949690675Z" level=warning msg="container event discarded" container=8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443 type=CONTAINER_CREATED_EVENT Apr 16 23:32:24.949821 containerd[1656]: time="2026-04-16T23:32:24.949805035Z" level=warning msg="container event discarded" container=8839ecd38effab01e7d43f076db94c50f245b6f19a1a0f7a6ecca723c8212443 type=CONTAINER_STARTED_EVENT Apr 16 23:32:24.979197 containerd[1656]: time="2026-04-16T23:32:24.979075462Z" level=warning msg="container event discarded" container=56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff type=CONTAINER_CREATED_EVENT Apr 16 23:32:25.070531 containerd[1656]: time="2026-04-16T23:32:25.070401869Z" level=warning msg="container event discarded" container=56d2a7c87472ff4873cce554b26832d034e4f0ef6b11b0edcdad85eb7f02fdff type=CONTAINER_STARTED_EVENT Apr 16 23:32:25.426218 containerd[1656]: time="2026-04-16T23:32:25.426095718Z" level=warning msg="container event discarded" container=47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32 type=CONTAINER_CREATED_EVENT Apr 16 23:32:25.426218 containerd[1656]: time="2026-04-16T23:32:25.426181238Z" 
level=warning msg="container event discarded" container=47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32 type=CONTAINER_STARTED_EVENT Apr 16 23:32:27.306855 update_engine[1637]: I20260416 23:32:27.306771 1637 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 16 23:32:27.306855 update_engine[1637]: I20260416 23:32:27.306860 1637 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 16 23:32:27.307561 update_engine[1637]: I20260416 23:32:27.307526 1637 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 16 23:32:27.313260 update_engine[1637]: E20260416 23:32:27.313191 1637 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 16 23:32:27.313350 update_engine[1637]: I20260416 23:32:27.313302 1637 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Apr 16 23:32:27.313350 update_engine[1637]: I20260416 23:32:27.313313 1637 omaha_request_action.cc:617] Omaha request response: Apr 16 23:32:27.313414 update_engine[1637]: E20260416 23:32:27.313395 1637 omaha_request_action.cc:636] Omaha request network transfer failed. Apr 16 23:32:27.313442 update_engine[1637]: I20260416 23:32:27.313418 1637 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Apr 16 23:32:27.313442 update_engine[1637]: I20260416 23:32:27.313424 1637 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 16 23:32:27.313442 update_engine[1637]: I20260416 23:32:27.313428 1637 update_attempter.cc:306] Processing Done. Apr 16 23:32:27.313502 update_engine[1637]: E20260416 23:32:27.313441 1637 update_attempter.cc:619] Update failed. 
Apr 16 23:32:27.313502 update_engine[1637]: I20260416 23:32:27.313446 1637 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Apr 16 23:32:27.313502 update_engine[1637]: I20260416 23:32:27.313450 1637 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Apr 16 23:32:27.313502 update_engine[1637]: I20260416 23:32:27.313454 1637 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Apr 16 23:32:27.313576 update_engine[1637]: I20260416 23:32:27.313519 1637 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Apr 16 23:32:27.313576 update_engine[1637]: I20260416 23:32:27.313541 1637 omaha_request_action.cc:271] Posting an Omaha request to disabled Apr 16 23:32:27.313576 update_engine[1637]: I20260416 23:32:27.313546 1637 omaha_request_action.cc:272] Request: Apr 16 23:32:27.313576 update_engine[1637]: Apr 16 23:32:27.313576 update_engine[1637]: Apr 16 23:32:27.313576 update_engine[1637]: Apr 16 23:32:27.313576 update_engine[1637]: Apr 16 23:32:27.313576 update_engine[1637]: Apr 16 23:32:27.313576 update_engine[1637]: Apr 16 23:32:27.313576 update_engine[1637]: I20260416 23:32:27.313551 1637 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 16 23:32:27.313576 update_engine[1637]: I20260416 23:32:27.313570 1637 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 16 23:32:27.313894 locksmithd[1683]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Apr 16 23:32:27.314142 update_engine[1637]: I20260416 23:32:27.313864 1637 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Apr 16 23:32:27.319775 update_engine[1637]: E20260416 23:32:27.319710 1637 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 16 23:32:27.319851 update_engine[1637]: I20260416 23:32:27.319803 1637 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Apr 16 23:32:27.319851 update_engine[1637]: I20260416 23:32:27.319825 1637 omaha_request_action.cc:617] Omaha request response: Apr 16 23:32:27.319851 update_engine[1637]: I20260416 23:32:27.319832 1637 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 16 23:32:27.319851 update_engine[1637]: I20260416 23:32:27.319837 1637 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 16 23:32:27.319851 update_engine[1637]: I20260416 23:32:27.319842 1637 update_attempter.cc:306] Processing Done. Apr 16 23:32:27.319851 update_engine[1637]: I20260416 23:32:27.319847 1637 update_attempter.cc:310] Error event sent. 
Apr 16 23:32:27.319977 update_engine[1637]: I20260416 23:32:27.319855 1637 update_check_scheduler.cc:74] Next update check in 44m56s Apr 16 23:32:27.320178 locksmithd[1683]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Apr 16 23:32:27.548327 containerd[1656]: time="2026-04-16T23:32:27.548219825Z" level=warning msg="container event discarded" container=90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20 type=CONTAINER_CREATED_EVENT Apr 16 23:32:27.597658 containerd[1656]: time="2026-04-16T23:32:27.597531857Z" level=warning msg="container event discarded" container=90f331b440777003f7f32808f4370c9e0f3b10ff3887aa615b1b04187ce51e20 type=CONTAINER_STARTED_EVENT Apr 16 23:32:30.365716 kubelet[2901]: I0416 23:32:30.365662 2901 scope.go:117] "RemoveContainer" containerID="2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4" Apr 16 23:32:30.368012 containerd[1656]: time="2026-04-16T23:32:30.367973239Z" level=info msg="CreateContainer within sandbox \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}" Apr 16 23:32:30.377047 containerd[1656]: time="2026-04-16T23:32:30.376996419Z" level=info msg="Container 6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f: CDI devices from CRI Config.CDIDevices: []" Apr 16 23:32:30.383874 containerd[1656]: time="2026-04-16T23:32:30.383834515Z" level=info msg="CreateContainer within sandbox \"47ad3769b34f0f65b5f551a2b670d00e7ce97f6f6a63fa5b765711f89bbf8f32\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f\"" Apr 16 23:32:30.384327 containerd[1656]: time="2026-04-16T23:32:30.384305836Z" level=info msg="StartContainer for \"6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f\"" Apr 16 23:32:30.385123 containerd[1656]: time="2026-04-16T23:32:30.385098438Z" level=info 
msg="connecting to shim 6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f" address="unix:///run/containerd/s/dde29e35f05d509134bcad93245f417c05a27854bad4c991013c05cf5bb9f722" protocol=ttrpc version=3 Apr 16 23:32:30.403897 systemd[1]: Started cri-containerd-6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f.scope - libcontainer container 6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f. Apr 16 23:32:30.428395 containerd[1656]: time="2026-04-16T23:32:30.428354616Z" level=info msg="StartContainer for \"6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f\" returns successfully" Apr 16 23:32:33.846292 kubelet[2901]: E0416 23:32:33.845856 2901 controller.go:195] "Failed to update lease" err="Put \"https://10.0.3.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-b2725589f5?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 16 23:32:37.817870 containerd[1656]: time="2026-04-16T23:32:37.817781343Z" level=warning msg="container event discarded" container=4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12 type=CONTAINER_CREATED_EVENT Apr 16 23:32:37.817870 containerd[1656]: time="2026-04-16T23:32:37.817849903Z" level=warning msg="container event discarded" container=4e514b67ac7b826b98387cc908b2d48668901fa81495b690124d50cb614bff12 type=CONTAINER_STARTED_EVENT Apr 16 23:32:37.859198 containerd[1656]: time="2026-04-16T23:32:37.859084556Z" level=warning msg="container event discarded" container=01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5 type=CONTAINER_CREATED_EVENT Apr 16 23:32:37.859198 containerd[1656]: time="2026-04-16T23:32:37.859124917Z" level=warning msg="container event discarded" container=01715b1a40fe240de199f630aa0166d2baa43c2f96e1bd81005582e620e413d5 type=CONTAINER_STARTED_EVENT Apr 16 23:32:39.548792 containerd[1656]: time="2026-04-16T23:32:39.548701039Z" level=warning msg="container event 
discarded" container=a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c type=CONTAINER_CREATED_EVENT Apr 16 23:32:39.606796 containerd[1656]: time="2026-04-16T23:32:39.606740891Z" level=warning msg="container event discarded" container=a7cf827bdcdfeee51c7456834281d62854cff91ef2b07d91e3c5fc1ee26a6f6c type=CONTAINER_STARTED_EVENT Apr 16 23:32:40.855337 containerd[1656]: time="2026-04-16T23:32:40.855187171Z" level=warning msg="container event discarded" container=49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651 type=CONTAINER_CREATED_EVENT Apr 16 23:32:40.944221 containerd[1656]: time="2026-04-16T23:32:40.944131893Z" level=warning msg="container event discarded" container=49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651 type=CONTAINER_STARTED_EVENT Apr 16 23:32:41.103881 containerd[1656]: time="2026-04-16T23:32:41.103782536Z" level=warning msg="container event discarded" container=49d07664ed9625f412133d3cb1b018833f2c44a81a5261fd6115b39fffca2651 type=CONTAINER_STOPPED_EVENT Apr 16 23:32:41.612326 systemd[1]: cri-containerd-6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f.scope: Deactivated successfully. Apr 16 23:32:41.612912 containerd[1656]: time="2026-04-16T23:32:41.612873934Z" level=info msg="received container exit event container_id:\"6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f\" id:\"6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f\" pid:7060 exit_status:1 exited_at:{seconds:1776382361 nanos:612584773}" Apr 16 23:32:41.632413 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f-rootfs.mount: Deactivated successfully. 
Apr 16 23:32:42.264593 kubelet[2901]: I0416 23:32:42.264563 2901 scope.go:117] "RemoveContainer" containerID="2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4" Apr 16 23:32:42.265011 kubelet[2901]: I0416 23:32:42.264871 2901 scope.go:117] "RemoveContainer" containerID="6c2fb7f420c26f5ba2fdafdd8b74cbf089e13899bf1baf218e2ca8108db0ae0f" Apr 16 23:32:42.265040 kubelet[2901]: E0416 23:32:42.265004 2901 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=tigera-operator pod=tigera-operator-5588576f44-d5vm6_tigera-operator(867ecad7-0cd8-46ca-b446-c47691f2f99d)\"" pod="tigera-operator/tigera-operator-5588576f44-d5vm6" podUID="867ecad7-0cd8-46ca-b446-c47691f2f99d" Apr 16 23:32:42.266533 containerd[1656]: time="2026-04-16T23:32:42.266441660Z" level=info msg="RemoveContainer for \"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\"" Apr 16 23:32:42.272340 containerd[1656]: time="2026-04-16T23:32:42.272266154Z" level=info msg="RemoveContainer for \"2642ce1dbd7c2427ebb055c54f8d9a1f07711befd3c9c9c2188e241e6fe0b2f4\" returns successfully" Apr 16 23:32:42.348424 kubelet[2901]: E0416 23:32:42.348282 2901 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-b2725589f5.18a6fa50e24b93b6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-b2725589f5,UID:a1e3414c0e0f231e941f0dc3f4920754,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-b2725589f5,},FirstTimestamp:2026-04-16 23:32:08.34517087 +0000 UTC 
m=+290.082905676,LastTimestamp:2026-04-16 23:32:08.34517087 +0000 UTC m=+290.082905676,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-b2725589f5,}" Apr 16 23:32:43.847394 kubelet[2901]: E0416 23:32:43.846850 2901 controller.go:195] "Failed to update lease" err="Put \"https://10.0.3.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-b2725589f5?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 16 23:32:45.001781 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec Apr 16 23:32:45.053581 containerd[1656]: time="2026-04-16T23:32:45.053513559Z" level=warning msg="container event discarded" container=bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63 type=CONTAINER_CREATED_EVENT Apr 16 23:32:45.147017 containerd[1656]: time="2026-04-16T23:32:45.146945812Z" level=warning msg="container event discarded" container=bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63 type=CONTAINER_STARTED_EVENT Apr 16 23:32:45.458420 containerd[1656]: time="2026-04-16T23:32:45.458363160Z" level=warning msg="container event discarded" container=bd2d34bf94e40c0282fd65e9e0b2cdddf2d07e9c316bc07f7af909708f95de63 type=CONTAINER_STOPPED_EVENT Apr 16 23:32:47.657905 containerd[1656]: time="2026-04-16T23:32:47.657820963Z" level=warning msg="container event discarded" container=9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a type=CONTAINER_CREATED_EVENT Apr 16 23:32:47.741211 containerd[1656]: time="2026-04-16T23:32:47.741148672Z" level=warning msg="container event discarded" container=9160d41bb051fce55de2beadc68df47da7a5ab96d75631cda553b905104e3c5a type=CONTAINER_STARTED_EVENT