Jan 23 00:03:16.781426 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 23 00:03:16.781449 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Thu Jan 22 22:21:53 -00 2026
Jan 23 00:03:16.781459 kernel: KASLR enabled
Jan 23 00:03:16.781465 kernel: efi: EFI v2.7 by EDK II
Jan 23 00:03:16.781470 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Jan 23 00:03:16.781476 kernel: random: crng init done
Jan 23 00:03:16.781483 kernel: secureboot: Secure boot disabled
Jan 23 00:03:16.781489 kernel: ACPI: Early table checksum verification disabled
Jan 23 00:03:16.781495 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 23 00:03:16.781501 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 23 00:03:16.781520 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781527 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781533 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781538 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781546 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781552 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781559 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781565 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781578 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781593 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 00:03:16.781600 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 23 00:03:16.781606 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 23 00:03:16.781612 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 23 00:03:16.781618 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 23 00:03:16.781624 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 23 00:03:16.781630 kernel: Zone ranges:
Jan 23 00:03:16.781637 kernel:   DMA    [mem 0x0000000040000000-0x00000000ffffffff]
Jan 23 00:03:16.781643 kernel:   DMA32  empty
Jan 23 00:03:16.781649 kernel:   Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 23 00:03:16.781656 kernel:   Device empty
Jan 23 00:03:16.781661 kernel: Movable zone start for each node
Jan 23 00:03:16.781667 kernel: Early memory node ranges
Jan 23 00:03:16.781673 kernel:   node   0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 23 00:03:16.781679 kernel:   node   0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 23 00:03:16.781685 kernel:   node   0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 23 00:03:16.781691 kernel:   node   0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 23 00:03:16.781698 kernel:   node   0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 23 00:03:16.781704 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 23 00:03:16.781711 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 23 00:03:16.781728 kernel: psci: probing for conduit method from ACPI.
Jan 23 00:03:16.781739 kernel: psci: PSCIv1.3 detected in firmware.
Jan 23 00:03:16.781746 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 23 00:03:16.781752 kernel: psci: Trusted OS migration not required
Jan 23 00:03:16.781760 kernel: psci: SMC Calling Convention v1.1
Jan 23 00:03:16.781767 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 23 00:03:16.781773 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 23 00:03:16.781779 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 23 00:03:16.781786 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 23 00:03:16.781792 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 23 00:03:16.781798 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 23 00:03:16.781805 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 23 00:03:16.781811 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 23 00:03:16.781817 kernel: Detected PIPT I-cache on CPU0
Jan 23 00:03:16.781824 kernel: CPU features: detected: GIC system register CPU interface
Jan 23 00:03:16.781830 kernel: CPU features: detected: Spectre-v4
Jan 23 00:03:16.781838 kernel: CPU features: detected: Spectre-BHB
Jan 23 00:03:16.781844 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 23 00:03:16.781850 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 23 00:03:16.781857 kernel: CPU features: detected: ARM erratum 1418040
Jan 23 00:03:16.781863 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 23 00:03:16.781869 kernel: alternatives: applying boot alternatives
Jan 23 00:03:16.781877 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=38aa0560e146398cb8c3378a56d449784f1c7652139d7b61279d764fcc4c793a
Jan 23 00:03:16.781884 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 23 00:03:16.781890 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 00:03:16.781901 kernel: Fallback order for Node 0: 0
Jan 23 00:03:16.781909 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4194304
Jan 23 00:03:16.781915 kernel: Policy zone: Normal
Jan 23 00:03:16.781922 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 00:03:16.781928 kernel: software IO TLB: area num 4.
Jan 23 00:03:16.781934 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 23 00:03:16.781941 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 23 00:03:16.781947 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 00:03:16.781954 kernel: rcu: RCU event tracing is enabled.
Jan 23 00:03:16.781961 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 23 00:03:16.781967 kernel: Trampoline variant of Tasks RCU enabled.
Jan 23 00:03:16.781973 kernel: Tracing variant of Tasks RCU enabled.
Jan 23 00:03:16.781980 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 00:03:16.781987 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 23 00:03:16.781994 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 00:03:16.782000 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 00:03:16.782007 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 23 00:03:16.782013 kernel: GICv3: 256 SPIs implemented
Jan 23 00:03:16.782019 kernel: GICv3: 0 Extended SPIs implemented
Jan 23 00:03:16.782025 kernel: Root IRQ handler: gic_handle_irq
Jan 23 00:03:16.782032 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 23 00:03:16.782039 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 23 00:03:16.782045 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 23 00:03:16.782051 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 23 00:03:16.782058 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 23 00:03:16.782065 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 23 00:03:16.782072 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 23 00:03:16.782078 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 23 00:03:16.782085 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 00:03:16.782091 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 00:03:16.782097 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 23 00:03:16.782104 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 23 00:03:16.782110 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 23 00:03:16.782117 kernel: arm-pv: using stolen time PV
Jan 23 00:03:16.782124 kernel: Console: colour dummy device 80x25
Jan 23 00:03:16.782132 kernel: ACPI: Core revision 20240827
Jan 23 00:03:16.782138 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 23 00:03:16.782145 kernel: pid_max: default: 32768 minimum: 301
Jan 23 00:03:16.782152 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 23 00:03:16.782158 kernel: landlock: Up and running.
Jan 23 00:03:16.782164 kernel: SELinux: Initializing.
Jan 23 00:03:16.782171 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 00:03:16.782178 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 00:03:16.782184 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 00:03:16.782191 kernel: rcu: Max phase no-delay instances is 400.
Jan 23 00:03:16.782199 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 23 00:03:16.782206 kernel: Remapping and enabling EFI services.
Jan 23 00:03:16.782212 kernel: smp: Bringing up secondary CPUs ...
Jan 23 00:03:16.782219 kernel: Detected PIPT I-cache on CPU1
Jan 23 00:03:16.782226 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 23 00:03:16.782232 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 23 00:03:16.782239 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 00:03:16.782245 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 23 00:03:16.782252 kernel: Detected PIPT I-cache on CPU2
Jan 23 00:03:16.782264 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 23 00:03:16.782271 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 23 00:03:16.782278 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 00:03:16.782286 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 23 00:03:16.782293 kernel: Detected PIPT I-cache on CPU3
Jan 23 00:03:16.782300 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 23 00:03:16.782307 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 23 00:03:16.782314 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 00:03:16.782322 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 23 00:03:16.782329 kernel: smp: Brought up 1 node, 4 CPUs
Jan 23 00:03:16.782336 kernel: SMP: Total of 4 processors activated.
Jan 23 00:03:16.782342 kernel: CPU: All CPU(s) started at EL1
Jan 23 00:03:16.782349 kernel: CPU features: detected: 32-bit EL0 Support
Jan 23 00:03:16.782356 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 23 00:03:16.782363 kernel: CPU features: detected: Common not Private translations
Jan 23 00:03:16.782370 kernel: CPU features: detected: CRC32 instructions
Jan 23 00:03:16.782377 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 23 00:03:16.782385 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 23 00:03:16.782392 kernel: CPU features: detected: LSE atomic instructions
Jan 23 00:03:16.782399 kernel: CPU features: detected: Privileged Access Never
Jan 23 00:03:16.782406 kernel: CPU features: detected: RAS Extension Support
Jan 23 00:03:16.782413 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 23 00:03:16.782420 kernel: alternatives: applying system-wide alternatives
Jan 23 00:03:16.782426 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 23 00:03:16.782434 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Jan 23 00:03:16.782441 kernel: devtmpfs: initialized
Jan 23 00:03:16.782449 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 00:03:16.782456 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 23 00:03:16.782463 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 23 00:03:16.782470 kernel: 0 pages in range for non-PLT usage
Jan 23 00:03:16.782477 kernel: 508400 pages in range for PLT usage
Jan 23 00:03:16.782484 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 00:03:16.782491 kernel: SMBIOS 3.0.0 present.
Jan 23 00:03:16.782497 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 23 00:03:16.782504 kernel: DMI: Memory slots populated: 1/1
Jan 23 00:03:16.782512 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 00:03:16.782519 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 23 00:03:16.782526 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 00:03:16.782533 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 00:03:16.782540 kernel: audit: initializing netlink subsys (disabled)
Jan 23 00:03:16.782547 kernel: audit: type=2000 audit(0.040:1): state=initialized audit_enabled=0 res=1
Jan 23 00:03:16.782554 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 00:03:16.782561 kernel: cpuidle: using governor menu
Jan 23 00:03:16.782567 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 23 00:03:16.782576 kernel: ASID allocator initialised with 32768 entries
Jan 23 00:03:16.782583 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 00:03:16.782589 kernel: Serial: AMBA PL011 UART driver
Jan 23 00:03:16.782596 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 00:03:16.782603 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 00:03:16.782610 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 23 00:03:16.782617 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 23 00:03:16.782624 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 00:03:16.782631 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 00:03:16.782639 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 23 00:03:16.782646 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 23 00:03:16.782652 kernel: ACPI: Added _OSI(Module Device)
Jan 23 00:03:16.782659 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 00:03:16.782666 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 00:03:16.782673 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 00:03:16.782680 kernel: ACPI: Interpreter enabled
Jan 23 00:03:16.782686 kernel: ACPI: Using GIC for interrupt routing
Jan 23 00:03:16.782693 kernel: ACPI: MCFG table detected, 1 entries
Jan 23 00:03:16.782701 kernel: ACPI: CPU0 has been hot-added
Jan 23 00:03:16.782708 kernel: ACPI: CPU1 has been hot-added
Jan 23 00:03:16.782718 kernel: ACPI: CPU2 has been hot-added
Jan 23 00:03:16.782731 kernel: ACPI: CPU3 has been hot-added
Jan 23 00:03:16.782742 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 23 00:03:16.782751 kernel: printk: legacy console [ttyAMA0] enabled
Jan 23 00:03:16.782758 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 00:03:16.782945 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 23 00:03:16.783016 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 23 00:03:16.783077 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 23 00:03:16.783134 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 23 00:03:16.783190 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 23 00:03:16.783199 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 23 00:03:16.783206 kernel: PCI host bridge to bus 0000:00
Jan 23 00:03:16.783270 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 23 00:03:16.783326 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 23 00:03:16.783378 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 23 00:03:16.783429 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 00:03:16.783515 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 23 00:03:16.783587 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.783650 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 23 00:03:16.783711 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 23 00:03:16.783791 kernel: pci 0000:00:01.0:   bridge window [mem 0x12400000-0x124fffff]
Jan 23 00:03:16.783852 kernel: pci 0000:00:01.0:   bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 23 00:03:16.783923 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.783982 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 23 00:03:16.784040 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 23 00:03:16.784099 kernel: pci 0000:00:01.1:   bridge window [mem 0x12300000-0x123fffff]
Jan 23 00:03:16.784164 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.784224 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 23 00:03:16.784282 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 23 00:03:16.784354 kernel: pci 0000:00:01.2:   bridge window [mem 0x12200000-0x122fffff]
Jan 23 00:03:16.784413 kernel: pci 0000:00:01.2:   bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 23 00:03:16.784477 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.784536 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 23 00:03:16.784594 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 23 00:03:16.784654 kernel: pci 0000:00:01.3:   bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 23 00:03:16.784734 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.784798 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 23 00:03:16.784856 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 23 00:03:16.784913 kernel: pci 0000:00:01.4:   bridge window [mem 0x12100000-0x121fffff]
Jan 23 00:03:16.784970 kernel: pci 0000:00:01.4:   bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 23 00:03:16.785039 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.785100 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 23 00:03:16.785158 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 23 00:03:16.785216 kernel: pci 0000:00:01.5:   bridge window [mem 0x12000000-0x120fffff]
Jan 23 00:03:16.785273 kernel: pci 0000:00:01.5:   bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 23 00:03:16.785337 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.785396 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 23 00:03:16.785453 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 23 00:03:16.785536 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.785601 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 23 00:03:16.785659 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 23 00:03:16.785745 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.785812 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 23 00:03:16.785871 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 23 00:03:16.785937 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.786000 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 23 00:03:16.786057 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 23 00:03:16.786125 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.786184 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 23 00:03:16.786242 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 23 00:03:16.786307 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.786365 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 23 00:03:16.786423 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 23 00:03:16.786487 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.786546 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 23 00:03:16.786604 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 23 00:03:16.786668 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.786742 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 23 00:03:16.786809 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 23 00:03:16.786874 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.786932 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 23 00:03:16.786989 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 23 00:03:16.787053 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.787112 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 23 00:03:16.787172 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 23 00:03:16.787236 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.787294 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 23 00:03:16.787350 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 23 00:03:16.787413 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.787473 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 23 00:03:16.787531 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 23 00:03:16.787590 kernel: pci 0000:00:03.1:   bridge window [io 0xf000-0xffff]
Jan 23 00:03:16.787649 kernel: pci 0000:00:03.1:   bridge window [mem 0x11e00000-0x11ffffff]
Jan 23 00:03:16.787714 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.787793 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 23 00:03:16.787851 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 23 00:03:16.787909 kernel: pci 0000:00:03.2:   bridge window [io 0xe000-0xefff]
Jan 23 00:03:16.787966 kernel: pci 0000:00:03.2:   bridge window [mem 0x11c00000-0x11dfffff]
Jan 23 00:03:16.788041 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.788102 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 23 00:03:16.788158 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 23 00:03:16.788215 kernel: pci 0000:00:03.3:   bridge window [io 0xd000-0xdfff]
Jan 23 00:03:16.788271 kernel: pci 0000:00:03.3:   bridge window [mem 0x11a00000-0x11bfffff]
Jan 23 00:03:16.788335 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.788392 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 23 00:03:16.788452 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 23 00:03:16.788509 kernel: pci 0000:00:03.4:   bridge window [io 0xc000-0xcfff]
Jan 23 00:03:16.788566 kernel: pci 0000:00:03.4:   bridge window [mem 0x11800000-0x119fffff]
Jan 23 00:03:16.788629 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.788687 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 23 00:03:16.788762 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 23 00:03:16.788823 kernel: pci 0000:00:03.5:   bridge window [io 0xb000-0xbfff]
Jan 23 00:03:16.788884 kernel: pci 0000:00:03.5:   bridge window [mem 0x11600000-0x117fffff]
Jan 23 00:03:16.788949 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.789008 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 23 00:03:16.789066 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 23 00:03:16.789125 kernel: pci 0000:00:03.6:   bridge window [io 0xa000-0xafff]
Jan 23 00:03:16.789182 kernel: pci 0000:00:03.6:   bridge window [mem 0x11400000-0x115fffff]
Jan 23 00:03:16.789245 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.789303 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 23 00:03:16.789362 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 23 00:03:16.789419 kernel: pci 0000:00:03.7:   bridge window [io 0x9000-0x9fff]
Jan 23 00:03:16.789476 kernel: pci 0000:00:03.7:   bridge window [mem 0x11200000-0x113fffff]
Jan 23 00:03:16.789566 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.789630 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 23 00:03:16.789688 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 23 00:03:16.789756 kernel: pci 0000:00:04.0:   bridge window [io 0x8000-0x8fff]
Jan 23 00:03:16.789819 kernel: pci 0000:00:04.0:   bridge window [mem 0x11000000-0x111fffff]
Jan 23 00:03:16.789883 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.789941 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 23 00:03:16.789998 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 23 00:03:16.790055 kernel: pci 0000:00:04.1:   bridge window [io 0x7000-0x7fff]
Jan 23 00:03:16.790111 kernel: pci 0000:00:04.1:   bridge window [mem 0x10e00000-0x10ffffff]
Jan 23 00:03:16.790174 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.790242 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 23 00:03:16.790301 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 23 00:03:16.790358 kernel: pci 0000:00:04.2:   bridge window [io 0x6000-0x6fff]
Jan 23 00:03:16.790416 kernel: pci 0000:00:04.2:   bridge window [mem 0x10c00000-0x10dfffff]
Jan 23 00:03:16.790479 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.790541 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 23 00:03:16.790598 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 23 00:03:16.790657 kernel: pci 0000:00:04.3:   bridge window [io 0x5000-0x5fff]
Jan 23 00:03:16.790713 kernel: pci 0000:00:04.3:   bridge window [mem 0x10a00000-0x10bfffff]
Jan 23 00:03:16.790795 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.790854 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 23 00:03:16.790913 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 23 00:03:16.790970 kernel: pci 0000:00:04.4:   bridge window [io 0x4000-0x4fff]
Jan 23 00:03:16.791027 kernel: pci 0000:00:04.4:   bridge window [mem 0x10800000-0x109fffff]
Jan 23 00:03:16.791089 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.791147 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 23 00:03:16.791204 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 23 00:03:16.791266 kernel: pci 0000:00:04.5:   bridge window [io 0x3000-0x3fff]
Jan 23 00:03:16.791327 kernel: pci 0000:00:04.5:   bridge window [mem 0x10600000-0x107fffff]
Jan 23 00:03:16.791394 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.791457 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 23 00:03:16.791519 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 23 00:03:16.791585 kernel: pci 0000:00:04.6:   bridge window [io 0x2000-0x2fff]
Jan 23 00:03:16.791646 kernel: pci 0000:00:04.6:   bridge window [mem 0x10400000-0x105fffff]
Jan 23 00:03:16.791731 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.791794 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 23 00:03:16.791855 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 23 00:03:16.791935 kernel: pci 0000:00:04.7:   bridge window [io 0x1000-0x1fff]
Jan 23 00:03:16.791996 kernel: pci 0000:00:04.7:   bridge window [mem 0x10200000-0x103fffff]
Jan 23 00:03:16.792061 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 00:03:16.792120 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 23 00:03:16.792179 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 23 00:03:16.792242 kernel: pci 0000:00:05.0:   bridge window [io 0x0000-0x0fff]
Jan 23 00:03:16.792304 kernel: pci 0000:00:05.0:   bridge window [mem 0x10000000-0x101fffff]
Jan 23 00:03:16.792375 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 23 00:03:16.792437 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 23 00:03:16.792498 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 23 00:03:16.792559 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 23 00:03:16.792626 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 23 00:03:16.792689 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 23 00:03:16.792776 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 23 00:03:16.792841 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 23 00:03:16.792903 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 23 00:03:16.792992 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 00:03:16.793053 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 23 00:03:16.793125 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 00:03:16.793194 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 23 00:03:16.793256 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 23 00:03:16.793324 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 23 00:03:16.793395 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 23 00:03:16.793455 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 23 00:03:16.793527 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 23 00:03:16.793592 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 23 00:03:16.793650 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 23 00:03:16.793712 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 23 00:03:16.793780 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 23 00:03:16.793844 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 23 00:03:16.793908 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 23 00:03:16.793969 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 23 00:03:16.794028 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 23 00:03:16.794091 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 23 00:03:16.794149 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 23 00:03:16.794208 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 23 00:03:16.794271 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 23 00:03:16.794331 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 23 00:03:16.794389 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 23 00:03:16.794451 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 23 00:03:16.794510 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 23 00:03:16.794569 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 23 00:03:16.794631 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 23 00:03:16.794694 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 23 00:03:16.794788 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 23 00:03:16.794860 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 23 00:03:16.794920 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 23 00:03:16.794981 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 23 00:03:16.795047 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 23 00:03:16.795107 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 23 00:03:16.795167 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 23 00:03:16.795229 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 23 00:03:16.795288 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 23 00:03:16.795350 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 23 00:03:16.795418 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 23 00:03:16.795477 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 23 00:03:16.795537 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 23 00:03:16.795617 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 23 00:03:16.795676 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 23 00:03:16.795744 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 23 00:03:16.795812 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 23 00:03:16.795871 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 23 00:03:16.795928 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 23
00:03:16.795989 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 23 00:03:16.796048 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 23 00:03:16.796106 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 23 00:03:16.796167 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 23 00:03:16.796224 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 23 00:03:16.796281 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 23 00:03:16.796343 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 23 00:03:16.796406 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 23 00:03:16.796467 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 23 00:03:16.796530 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 23 00:03:16.796590 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 23 00:03:16.796655 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 23 00:03:16.796730 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 23 00:03:16.796795 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 23 00:03:16.796856 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Jan 23 00:03:16.796918 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 23 00:03:16.796976 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 23 00:03:16.797048 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 23 00:03:16.797110 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 23 00:03:16.797168 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 23 00:03:16.797226 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 23 00:03:16.797292 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 23 00:03:16.797352 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 23 00:03:16.797410 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 23 00:03:16.797470 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 23 00:03:16.797543 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 23 00:03:16.797607 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 23 00:03:16.797669 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 23 00:03:16.797748 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 23 00:03:16.797813 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 23 00:03:16.797877 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Jan 23 00:03:16.797938 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 23 00:03:16.797997 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 23 00:03:16.798058 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 23 00:03:16.798118 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 23 00:03:16.798175 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 23 00:03:16.798237 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 23 00:03:16.798295 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 23 00:03:16.798353 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 23 00:03:16.798413 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 23 00:03:16.798472 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 23 00:03:16.798530 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 23 00:03:16.798592 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 23 00:03:16.798650 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 23 00:03:16.798707 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 23 00:03:16.798776 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 23 00:03:16.798836 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 23 00:03:16.798896 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 23 00:03:16.798961 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 23 00:03:16.799020 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 23 00:03:16.799078 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 23 00:03:16.799139 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 23 00:03:16.799199 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 23 00:03:16.799256 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 23 00:03:16.799317 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 23 00:03:16.799378 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 23 00:03:16.799435 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 23 00:03:16.799495 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 23 00:03:16.799553 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 23 00:03:16.799611 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 23 00:03:16.799670 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 23 00:03:16.799736 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Jan 23 00:03:16.799800 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 23 00:03:16.799860 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 23 00:03:16.799919 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 23 00:03:16.799976 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 23 00:03:16.800036 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 23 00:03:16.800093 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 23 00:03:16.800155 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 23 00:03:16.800216 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 23 00:03:16.800275 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 23 00:03:16.800333 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 23 00:03:16.800391 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 23 00:03:16.800449 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 23 00:03:16.800508 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 23 00:03:16.800566 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 23 00:03:16.800624 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 23 00:03:16.800684 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 23 00:03:16.800752 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 23 00:03:16.800810 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Jan 23 00:03:16.800869 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 23 00:03:16.800927 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 23 00:03:16.800986 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 23 00:03:16.801044 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 23 00:03:16.801103 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 23 00:03:16.801163 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 23 00:03:16.801221 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 23 00:03:16.801279 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 23 00:03:16.801337 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 23 00:03:16.801395 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 23 00:03:16.801455 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 23 00:03:16.801528 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 23 00:03:16.801594 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 23 00:03:16.801655 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 23 00:03:16.801714 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 23 00:03:16.801795 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 23 00:03:16.801857 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 23 00:03:16.801916 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 23 00:03:16.801976 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 23 00:03:16.802034 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 23 00:03:16.802097 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 23 00:03:16.802156 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 23 00:03:16.802216 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 23 00:03:16.802274 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 23 00:03:16.802333 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 23 00:03:16.802392 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 23 00:03:16.802451 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 23 00:03:16.802510 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 23 00:03:16.802572 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 23 00:03:16.802631 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 23 00:03:16.802693 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 23 00:03:16.802775 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 23 00:03:16.802837 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 23 00:03:16.802895 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 23 00:03:16.802955 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 23 00:03:16.803013 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 23 00:03:16.803075 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Jan 23 00:03:16.803133 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 23 00:03:16.803192 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 23 00:03:16.803250 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 23 00:03:16.803309 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 23 00:03:16.803367 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 23 00:03:16.803426 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 23 00:03:16.803484 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 23 00:03:16.803544 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 23 00:03:16.803603 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 23 00:03:16.803662 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 23 00:03:16.803726 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 23 00:03:16.803794 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 23 00:03:16.803852 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 23 00:03:16.803911 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 23 00:03:16.803971 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 23 00:03:16.804030 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 23 00:03:16.804087 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 23 00:03:16.804145 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 23 00:03:16.804203 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 23 00:03:16.804261 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Jan 23 00:03:16.804319 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 23 00:03:16.804377 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Jan 23 00:03:16.804436 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 23 00:03:16.804495 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 23 00:03:16.804552 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 23 00:03:16.804611 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 23 00:03:16.804669 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 23 00:03:16.804737 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 23 00:03:16.804797 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 23 00:03:16.804855 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 23 00:03:16.804915 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 23 00:03:16.804974 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 23 00:03:16.805031 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 23 00:03:16.805090 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 23 00:03:16.805147 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 23 00:03:16.805207 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 23 00:03:16.805266 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 23 00:03:16.805324 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 23 00:03:16.805384 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 23 00:03:16.805443 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 23 00:03:16.805501 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 
00:03:16.805580 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.805640 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 23 00:03:16.805701 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.805789 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.805854 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 23 00:03:16.805914 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.805972 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.806031 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 23 00:03:16.806090 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.806149 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.806212 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 23 00:03:16.806270 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.806328 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.806388 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 23 00:03:16.806446 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.806504 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.806575 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 23 00:03:16.806636 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.806697 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.806771 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 23 00:03:16.806833 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.806892 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.806954 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 23 00:03:16.807012 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.807070 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.807130 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 23 00:03:16.807190 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.807248 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.807307 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 23 00:03:16.807366 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.807423 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.807482 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 23 00:03:16.807540 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.807598 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.807660 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 23 00:03:16.807717 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.807799 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.807860 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 23 00:03:16.807919 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.807977 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.808036 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Jan 23 00:03:16.808094 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.808154 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.808213 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 23 00:03:16.808271 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.808329 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.808389 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 23 00:03:16.808448 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.808506 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.808565 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 23 00:03:16.808628 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.808687 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.808768 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 23 00:03:16.808833 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 23 00:03:16.808893 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 23 00:03:16.808958 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 23 00:03:16.809020 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 23 00:03:16.809081 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 23 00:03:16.809141 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 23 00:03:16.809200 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 23 00:03:16.809260 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 23 00:03:16.809319 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Jan 23 00:03:16.809379 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 23 00:03:16.809438 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 23 00:03:16.809500 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 23 00:03:16.809576 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 23 00:03:16.809637 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 23 00:03:16.809697 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.809774 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.809836 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.809895 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.809955 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810012 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810074 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810133 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810192 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810253 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810313 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810375 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810435 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810495 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810555 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Jan 23 00:03:16.810615 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810673 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810748 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810812 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810873 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.810934 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.810995 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811072 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811132 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811192 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811256 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811317 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811379 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811439 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811498 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811557 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811618 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811679 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811747 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811809 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 23 00:03:16.811868 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 23 00:03:16.811935 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 23 00:03:16.811999 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 23 00:03:16.812066 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 23 00:03:16.812128 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 23 00:03:16.812189 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 23 00:03:16.812249 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 23 00:03:16.812314 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 23 00:03:16.812374 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 23 00:03:16.812435 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 23 00:03:16.812496 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 23 00:03:16.812563 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 23 00:03:16.812626 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 23 00:03:16.812712 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 23 00:03:16.812780 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 23 00:03:16.812840 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 23 00:03:16.812908 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 23 00:03:16.812971 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 23 00:03:16.813032 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 23 00:03:16.813090 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 23 00:03:16.813154 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Jan 23 00:03:16.813215 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 23 00:03:16.813273 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 23 00:03:16.813331 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 23 00:03:16.813390 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 23 00:03:16.813455 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 23 00:03:16.813530 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 23 00:03:16.813603 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 23 00:03:16.813666 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 23 00:03:16.813736 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 23 00:03:16.813800 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 23 00:03:16.813862 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 23 00:03:16.813924 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 23 00:03:16.813983 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 23 00:03:16.814040 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 23 00:03:16.814100 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 23 00:03:16.814159 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 23 00:03:16.814225 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 23 00:03:16.814287 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 23 00:03:16.814347 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 23 00:03:16.814405 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 23 00:03:16.814463 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 23 00:03:16.814521 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Jan 23 00:03:16.814579 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 23 00:03:16.814637 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 23 00:03:16.814697 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 23 00:03:16.814774 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 23 00:03:16.814835 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 23 00:03:16.814895 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 23 00:03:16.814953 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 23 00:03:16.815011 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 23 00:03:16.815074 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 23 00:03:16.815136 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 23 00:03:16.815196 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 23 00:03:16.815256 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 23 00:03:16.815314 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 23 00:03:16.815372 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 23 00:03:16.815433 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 23 00:03:16.815493 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 23 00:03:16.815554 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 23 00:03:16.815622 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 23 00:03:16.815683 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 23 00:03:16.815753 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 23 00:03:16.815816 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 23 00:03:16.815878 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 23 00:03:16.815936 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 23 00:03:16.815997 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 23 00:03:16.816057 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 23 00:03:16.816116 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 23 00:03:16.816174 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 23 00:03:16.816233 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 23 00:03:16.816292 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 23 00:03:16.816353 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 23 00:03:16.816412 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 23 00:03:16.816471 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 23 00:03:16.816529 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 23 00:03:16.816589 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 23 00:03:16.816647 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 23 00:03:16.816706 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 23 00:03:16.816785 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 23 00:03:16.816856 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 23 00:03:16.816917 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 23 00:03:16.816979 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 23 00:03:16.817037 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 23 00:03:16.817096 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 23 00:03:16.817153 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 23 00:03:16.817213 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 23 00:03:16.817272 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 23 00:03:16.817331 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 23 00:03:16.817392 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 23 00:03:16.817452 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 23 00:03:16.817523 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 23 00:03:16.817588 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 23 00:03:16.817648 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 23 00:03:16.817709 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 23 00:03:16.817798 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 23 00:03:16.817858 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 23 00:03:16.817921 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 23 00:03:16.817983 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 23 00:03:16.818043 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 23 00:03:16.818101 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 23 00:03:16.818159 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 23 00:03:16.818221 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 23 00:03:16.818281 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 23 00:03:16.818339 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 23 00:03:16.818399 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 23 00:03:16.818461 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 23 00:03:16.818521 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 23 00:03:16.818580 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 23 00:03:16.818637 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Jan 23 00:03:16.818697 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 23 00:03:16.818780 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 23 00:03:16.818843 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 23 00:03:16.818915 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 23 00:03:16.818978 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 23 00:03:16.819040 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 23 00:03:16.819100 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 23 00:03:16.819158 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 23 00:03:16.819220 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 23 00:03:16.819285 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 23 00:03:16.819347 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 23 00:03:16.819405 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 23 00:03:16.819469 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 23 00:03:16.819531 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 23 00:03:16.819591 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 23 00:03:16.819654 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 23 00:03:16.819725 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 23 00:03:16.819781 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 23 00:03:16.819834 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 23 00:03:16.819901 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 23 00:03:16.819957 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 23 00:03:16.820018 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 23 00:03:16.820073 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 23 00:03:16.820141 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 23 00:03:16.820202 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 23 00:03:16.820265 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 23 00:03:16.820326 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 23 00:03:16.820393 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 23 00:03:16.820450 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 23 00:03:16.820512 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 23 00:03:16.820566 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 23 00:03:16.820636 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 23 00:03:16.820691 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 23 00:03:16.820768 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 23 00:03:16.820827 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 23 00:03:16.820891 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 23 00:03:16.820946 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 23 00:03:16.821006 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 23 00:03:16.821064 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 23 00:03:16.821129 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 23 00:03:16.821185 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 23 00:03:16.821245 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 23 00:03:16.821300 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 23 00:03:16.821361 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 23 00:03:16.821419 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 23 00:03:16.821482 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 23 00:03:16.821555 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 23 00:03:16.821620 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 23 00:03:16.821677 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 23 00:03:16.821789 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 23 00:03:16.821854 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 23 00:03:16.821917 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 23 00:03:16.821973 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 23 00:03:16.822033 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 23 00:03:16.822087 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 23 00:03:16.822152 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 23 00:03:16.822206 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 23 00:03:16.822260 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 23 00:03:16.822321 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 23 00:03:16.822375 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 23 00:03:16.822441 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 23 00:03:16.822503 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 23 00:03:16.822567 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 23 00:03:16.822621 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 23 00:03:16.822683 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 23 00:03:16.822765 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 23 00:03:16.822825 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 23 00:03:16.822894 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 23 00:03:16.822952 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 23 00:03:16.823011 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 23 00:03:16.823078 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 23 00:03:16.823140 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 23 00:03:16.823197 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 23 00:03:16.823259 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 23 00:03:16.823315 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 23 00:03:16.823371 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 23 00:03:16.823433 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 23 00:03:16.823489 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 23 00:03:16.823546 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 23 00:03:16.823612 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 23 00:03:16.823667 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 23 00:03:16.823731 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 23 00:03:16.823797 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 23 00:03:16.823852 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 23 00:03:16.823915 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 23 00:03:16.823978 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 23 00:03:16.824034 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 23 00:03:16.824089 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Jan 23 00:03:16.824152 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 23 00:03:16.824212 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 23 00:03:16.824269 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 23 00:03:16.824335 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 23 00:03:16.824393 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 23 00:03:16.824474 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 23 00:03:16.824537 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 23 00:03:16.824596 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 23 00:03:16.824654 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 23 00:03:16.824740 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 23 00:03:16.824803 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 23 00:03:16.824857 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 23 00:03:16.824867 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 23 00:03:16.824875 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 23 00:03:16.824882 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 23 00:03:16.824892 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 23 00:03:16.824899 kernel: iommu: Default domain type: Translated Jan 23 00:03:16.824906 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 23 00:03:16.824914 kernel: efivars: Registered efivars operations Jan 23 00:03:16.824921 kernel: vgaarb: loaded Jan 23 00:03:16.824929 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 23 00:03:16.824936 kernel: VFS: Disk quotas dquot_6.6.0 Jan 23 00:03:16.824943 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 23 00:03:16.824951 kernel: pnp: PnP ACPI 
init Jan 23 00:03:16.825017 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 23 00:03:16.825030 kernel: pnp: PnP ACPI: found 1 devices Jan 23 00:03:16.825037 kernel: NET: Registered PF_INET protocol family Jan 23 00:03:16.825045 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 23 00:03:16.825052 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 23 00:03:16.825060 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 23 00:03:16.825067 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 23 00:03:16.825074 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 23 00:03:16.825082 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 23 00:03:16.825091 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 23 00:03:16.825098 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 23 00:03:16.825106 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 23 00:03:16.825172 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 23 00:03:16.825182 kernel: PCI: CLS 0 bytes, default 64 Jan 23 00:03:16.825189 kernel: kvm [1]: HYP mode not available Jan 23 00:03:16.825197 kernel: Initialise system trusted keyrings Jan 23 00:03:16.825204 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 23 00:03:16.825213 kernel: Key type asymmetric registered Jan 23 00:03:16.825220 kernel: Asymmetric key parser 'x509' registered Jan 23 00:03:16.825228 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 23 00:03:16.825235 kernel: io scheduler mq-deadline registered Jan 23 00:03:16.825242 kernel: io scheduler kyber registered Jan 23 00:03:16.825250 kernel: io scheduler bfq registered Jan 23 00:03:16.825258 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 23 
00:03:16.825324 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 23 00:03:16.825384 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 23 00:03:16.825446 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.825520 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 23 00:03:16.825586 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 23 00:03:16.825646 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.825708 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 23 00:03:16.825777 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 23 00:03:16.825838 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.825900 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 23 00:03:16.825962 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 23 00:03:16.826020 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.826080 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 23 00:03:16.826139 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 23 00:03:16.826197 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.826258 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 23 00:03:16.826319 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Jan 23 00:03:16.826379 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.826442 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 23 00:03:16.826502 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 23 00:03:16.826562 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.826625 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 23 00:03:16.826687 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 23 00:03:16.826755 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.826766 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 23 00:03:16.826834 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 23 00:03:16.826900 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 23 00:03:16.826959 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.827022 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 23 00:03:16.827082 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 23 00:03:16.827141 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.827201 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 23 00:03:16.827260 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 23 00:03:16.827319 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.827380 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 23 00:03:16.827438 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 23 00:03:16.827496 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.827561 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 23 00:03:16.827627 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 23 00:03:16.827692 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.827772 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 23 00:03:16.827842 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 23 00:03:16.827906 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.827969 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 23 00:03:16.828030 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 23 00:03:16.828087 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.828156 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 23 00:03:16.828220 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 23 00:03:16.828278 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.828304 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 23 00:03:16.828362 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 23 00:03:16.828422 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 23 00:03:16.828479 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.828542 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 23 00:03:16.828600 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 23 00:03:16.828658 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.828727 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 23 00:03:16.828795 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 23 00:03:16.828854 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.828916 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 23 00:03:16.828975 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 23 00:03:16.829033 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.829094 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 23 00:03:16.829152 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 23 00:03:16.829213 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.829275 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 23 00:03:16.829334 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 23 00:03:16.829393 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.829454 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 23 00:03:16.829528 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 23 00:03:16.829591 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.829654 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 23 00:03:16.829714 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 23 00:03:16.829795 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.829805 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 23 00:03:16.829866 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 23 00:03:16.829925 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 23 00:03:16.829985 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.830048 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 23 00:03:16.830107 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 23 00:03:16.830169 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.830232 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 23 00:03:16.830292 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 23 00:03:16.830350 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.830412 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 23 00:03:16.830470 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 23 00:03:16.830528 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.830589 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 23 00:03:16.830650 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 23 00:03:16.830707 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.830782 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 23 00:03:16.830843 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 23 00:03:16.830902 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.830962 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 23 00:03:16.831020 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 23 00:03:16.831079 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.831141 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 23 00:03:16.831200 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 23 00:03:16.831258 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.831318 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 23 00:03:16.831377 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 23 00:03:16.831435 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 00:03:16.831445 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 23 00:03:16.831454 kernel: ACPI: button: Power Button [PWRB] Jan 23 00:03:16.831518 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 23 00:03:16.831584 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 23 00:03:16.831595 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 23 00:03:16.831602 kernel: thunder_xcv, ver 1.0 Jan 23 00:03:16.831609 kernel: thunder_bgx, ver 1.0 Jan 23 00:03:16.831617 kernel: nicpf, ver 1.0 Jan 23 00:03:16.831624 kernel: nicvf, ver 1.0 Jan 23 00:03:16.831695 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 23 00:03:16.831766 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-23T00:03:16 UTC (1769126596) Jan 23 00:03:16.831777 kernel: hid: raw HID events driver (C) Jiri 
Kosina Jan 23 00:03:16.831784 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 23 00:03:16.831793 kernel: watchdog: NMI not fully supported Jan 23 00:03:16.831800 kernel: watchdog: Hard watchdog permanently disabled Jan 23 00:03:16.831808 kernel: NET: Registered PF_INET6 protocol family Jan 23 00:03:16.831815 kernel: Segment Routing with IPv6 Jan 23 00:03:16.831823 kernel: In-situ OAM (IOAM) with IPv6 Jan 23 00:03:16.831832 kernel: NET: Registered PF_PACKET protocol family Jan 23 00:03:16.831839 kernel: Key type dns_resolver registered Jan 23 00:03:16.831847 kernel: registered taskstats version 1 Jan 23 00:03:16.831854 kernel: Loading compiled-in X.509 certificates Jan 23 00:03:16.831862 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 380753d9165686712e58c1d21e00c0268e70f18f' Jan 23 00:03:16.831869 kernel: Demotion targets for Node 0: null Jan 23 00:03:16.831876 kernel: Key type .fscrypt registered Jan 23 00:03:16.831884 kernel: Key type fscrypt-provisioning registered Jan 23 00:03:16.831891 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 23 00:03:16.831900 kernel: ima: Allocated hash algorithm: sha1 Jan 23 00:03:16.831907 kernel: ima: No architecture policies found Jan 23 00:03:16.831915 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 23 00:03:16.831922 kernel: clk: Disabling unused clocks Jan 23 00:03:16.831930 kernel: PM: genpd: Disabling unused power domains Jan 23 00:03:16.831937 kernel: Warning: unable to open an initial console. Jan 23 00:03:16.831945 kernel: Freeing unused kernel memory: 39552K Jan 23 00:03:16.831952 kernel: Run /init as init process Jan 23 00:03:16.831960 kernel: with arguments: Jan 23 00:03:16.831968 kernel: /init Jan 23 00:03:16.831975 kernel: with environment: Jan 23 00:03:16.831982 kernel: HOME=/ Jan 23 00:03:16.831990 kernel: TERM=linux Jan 23 00:03:16.831998 systemd[1]: Successfully made /usr/ read-only. 
Jan 23 00:03:16.832009 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 23 00:03:16.832017 systemd[1]: Detected virtualization kvm.
Jan 23 00:03:16.832025 systemd[1]: Detected architecture arm64.
Jan 23 00:03:16.832034 systemd[1]: Running in initrd.
Jan 23 00:03:16.832041 systemd[1]: No hostname configured, using default hostname.
Jan 23 00:03:16.832049 systemd[1]: Hostname set to .
Jan 23 00:03:16.832057 systemd[1]: Initializing machine ID from VM UUID.
Jan 23 00:03:16.832065 systemd[1]: Queued start job for default target initrd.target.
Jan 23 00:03:16.832073 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 23 00:03:16.832088 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 23 00:03:16.832099 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 23 00:03:16.832107 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 23 00:03:16.832115 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 23 00:03:16.832126 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 23 00:03:16.832135 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 23 00:03:16.832144 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 23 00:03:16.832152 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 23 00:03:16.832160 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 23 00:03:16.832168 systemd[1]: Reached target paths.target - Path Units.
Jan 23 00:03:16.832176 systemd[1]: Reached target slices.target - Slice Units.
Jan 23 00:03:16.832186 systemd[1]: Reached target swap.target - Swaps.
Jan 23 00:03:16.832194 systemd[1]: Reached target timers.target - Timer Units.
Jan 23 00:03:16.832203 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 23 00:03:16.832211 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 23 00:03:16.832219 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 23 00:03:16.832228 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Jan 23 00:03:16.832236 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 23 00:03:16.832244 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 23 00:03:16.832253 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 23 00:03:16.832263 systemd[1]: Reached target sockets.target - Socket Units.
Jan 23 00:03:16.832271 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 23 00:03:16.832281 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 23 00:03:16.832289 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 23 00:03:16.832298 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Jan 23 00:03:16.832306 systemd[1]: Starting systemd-fsck-usr.service...
Jan 23 00:03:16.832314 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 23 00:03:16.832324 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 23 00:03:16.832332 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 23 00:03:16.832340 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 23 00:03:16.832349 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 23 00:03:16.832357 systemd[1]: Finished systemd-fsck-usr.service.
Jan 23 00:03:16.832387 systemd-journald[310]: Collecting audit messages is disabled.
Jan 23 00:03:16.832407 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 23 00:03:16.832416 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 23 00:03:16.832424 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 00:03:16.832434 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 23 00:03:16.832444 kernel: Bridge firewalling registered
Jan 23 00:03:16.832452 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 23 00:03:16.832460 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 23 00:03:16.832468 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 23 00:03:16.832477 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 23 00:03:16.832485 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 23 00:03:16.832495 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 23 00:03:16.832503 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 23 00:03:16.832511 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 23 00:03:16.832521 systemd-journald[310]: Journal started
Jan 23 00:03:16.832539 systemd-journald[310]: Runtime Journal (/run/log/journal/97f71726ea904e05a1fb285fb0d0118e) is 8M, max 319.5M, 311.5M free.
Jan 23 00:03:16.779053 systemd-modules-load[312]: Inserted module 'overlay'
Jan 23 00:03:16.795316 systemd-modules-load[312]: Inserted module 'br_netfilter'
Jan 23 00:03:16.836046 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 23 00:03:16.840026 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 23 00:03:16.842144 dracut-cmdline[341]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=38aa0560e146398cb8c3378a56d449784f1c7652139d7b61279d764fcc4c793a
Jan 23 00:03:16.856045 systemd-tmpfiles[362]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Jan 23 00:03:16.859655 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 23 00:03:16.863111 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 23 00:03:16.895325 systemd-resolved[394]: Positive Trust Anchors:
Jan 23 00:03:16.895345 systemd-resolved[394]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 23 00:03:16.895376 systemd-resolved[394]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 23 00:03:16.900870 systemd-resolved[394]: Defaulting to hostname 'linux'.
Jan 23 00:03:16.901935 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 23 00:03:16.904751 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 23 00:03:16.917747 kernel: SCSI subsystem initialized
Jan 23 00:03:16.921738 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 00:03:16.929772 kernel: iscsi: registered transport (tcp)
Jan 23 00:03:16.942758 kernel: iscsi: registered transport (qla4xxx)
Jan 23 00:03:16.942789 kernel: QLogic iSCSI HBA Driver
Jan 23 00:03:16.959085 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 23 00:03:16.980784 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 23 00:03:16.982250 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 23 00:03:17.029777 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 23 00:03:17.031999 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 23 00:03:17.095754 kernel: raid6: neonx8 gen() 15751 MB/s
Jan 23 00:03:17.112765 kernel: raid6: neonx4 gen() 15782 MB/s
Jan 23 00:03:17.129772 kernel: raid6: neonx2 gen() 13117 MB/s
Jan 23 00:03:17.146776 kernel: raid6: neonx1 gen() 10374 MB/s
Jan 23 00:03:17.163766 kernel: raid6: int64x8 gen() 6895 MB/s
Jan 23 00:03:17.180772 kernel: raid6: int64x4 gen() 7353 MB/s
Jan 23 00:03:17.197765 kernel: raid6: int64x2 gen() 6084 MB/s
Jan 23 00:03:17.214772 kernel: raid6: int64x1 gen() 5052 MB/s
Jan 23 00:03:17.214821 kernel: raid6: using algorithm neonx4 gen() 15782 MB/s
Jan 23 00:03:17.231766 kernel: raid6: .... xor() 12290 MB/s, rmw enabled
Jan 23 00:03:17.231813 kernel: raid6: using neon recovery algorithm
Jan 23 00:03:17.237140 kernel: xor: measuring software checksum speed
Jan 23 00:03:17.237192 kernel: 8regs : 21590 MB/sec
Jan 23 00:03:17.237758 kernel: 32regs : 21681 MB/sec
Jan 23 00:03:17.238849 kernel: arm64_neon : 27377 MB/sec
Jan 23 00:03:17.238894 kernel: xor: using function: arm64_neon (27377 MB/sec)
Jan 23 00:03:17.290756 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 23 00:03:17.297441 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 23 00:03:17.302161 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 23 00:03:17.332678 systemd-udevd[566]: Using default interface naming scheme 'v255'.
Jan 23 00:03:17.336673 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 23 00:03:17.339012 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 23 00:03:17.366241 dracut-pre-trigger[576]: rd.md=0: removing MD RAID activation
Jan 23 00:03:17.387900 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 23 00:03:17.390259 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 23 00:03:17.465478 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 23 00:03:17.468238 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 23 00:03:17.510754 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Jan 23 00:03:17.514148 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Jan 23 00:03:17.521337 kernel: ACPI: bus type USB registered
Jan 23 00:03:17.521383 kernel: usbcore: registered new interface driver usbfs
Jan 23 00:03:17.522308 kernel: usbcore: registered new interface driver hub
Jan 23 00:03:17.523420 kernel: usbcore: registered new device driver usb
Jan 23 00:03:17.523452 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 23 00:03:17.527787 kernel: GPT:17805311 != 104857599
Jan 23 00:03:17.527817 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 23 00:03:17.528776 kernel: GPT:17805311 != 104857599
Jan 23 00:03:17.528801 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 23 00:03:17.529792 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 23 00:03:17.554747 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jan 23 00:03:17.554932 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Jan 23 00:03:17.555014 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Jan 23 00:03:17.557104 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jan 23 00:03:17.557348 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Jan 23 00:03:17.557180 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 00:03:17.560417 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Jan 23 00:03:17.560555 kernel: hub 1-0:1.0: USB hub found
Jan 23 00:03:17.560828 kernel: hub 1-0:1.0: 4 ports detected
Jan 23 00:03:17.560914 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Jan 23 00:03:17.557371 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 23 00:03:17.564852 kernel: hub 2-0:1.0: USB hub found
Jan 23 00:03:17.564992 kernel: hub 2-0:1.0: 4 ports detected
Jan 23 00:03:17.563654 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 23 00:03:17.567285 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 23 00:03:17.601485 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Jan 23 00:03:17.609985 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Jan 23 00:03:17.611354 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 23 00:03:17.615028 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 23 00:03:17.624606 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 23 00:03:17.640562 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Jan 23 00:03:17.641803 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Jan 23 00:03:17.643774 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 23 00:03:17.646488 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 23 00:03:17.648465 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 23 00:03:17.651104 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 23 00:03:17.652742 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 23 00:03:17.670140 disk-uuid[669]: Primary Header is updated.
Jan 23 00:03:17.670140 disk-uuid[669]: Secondary Entries is updated.
Jan 23 00:03:17.670140 disk-uuid[669]: Secondary Header is updated.
Jan 23 00:03:17.677742 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 23 00:03:17.677994 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 23 00:03:17.800758 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Jan 23 00:03:17.932042 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Jan 23 00:03:17.932091 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Jan 23 00:03:17.933255 kernel: usbcore: registered new interface driver usbhid
Jan 23 00:03:17.933733 kernel: usbhid: USB HID core driver
Jan 23 00:03:18.037762 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Jan 23 00:03:18.163751 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Jan 23 00:03:18.215750 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Jan 23 00:03:18.691756 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 23 00:03:18.692080 disk-uuid[671]: The operation has completed successfully.
Jan 23 00:03:18.727711 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 23 00:03:18.727852 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 23 00:03:18.753707 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 23 00:03:18.778412 sh[690]: Success
Jan 23 00:03:18.790736 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 00:03:18.790774 kernel: device-mapper: uevent: version 1.0.3
Jan 23 00:03:18.792233 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Jan 23 00:03:18.798743 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Jan 23 00:03:18.852639 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 23 00:03:18.855609 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 23 00:03:18.870353 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jan 23 00:03:18.884744 kernel: BTRFS: device fsid 97a43946-ed04-45c1-a355-c0350e8b973e devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (702)
Jan 23 00:03:18.887585 kernel: BTRFS info (device dm-0): first mount of filesystem 97a43946-ed04-45c1-a355-c0350e8b973e
Jan 23 00:03:18.887606 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Jan 23 00:03:18.900061 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 23 00:03:18.900086 kernel: BTRFS info (device dm-0): enabling free space tree
Jan 23 00:03:18.902346 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 23 00:03:18.903621 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Jan 23 00:03:18.904771 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 23 00:03:18.905616 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 23 00:03:18.907134 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 23 00:03:18.933744 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (733)
Jan 23 00:03:18.935744 kernel: BTRFS info (device vda6): first mount of filesystem e9ae44b3-0aec-43ca-ad8b-9cf4e242132f
Jan 23 00:03:18.935782 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 23 00:03:18.941982 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 00:03:18.942033 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 00:03:18.946744 kernel: BTRFS info (device vda6): last unmount of filesystem e9ae44b3-0aec-43ca-ad8b-9cf4e242132f
Jan 23 00:03:18.948693 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 23 00:03:18.950794 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 23 00:03:19.018405 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 23 00:03:19.022405 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 23 00:03:19.069875 systemd-networkd[877]: lo: Link UP
Jan 23 00:03:19.069890 systemd-networkd[877]: lo: Gained carrier
Jan 23 00:03:19.070831 systemd-networkd[877]: Enumeration completed
Jan 23 00:03:19.070946 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 23 00:03:19.071263 systemd-networkd[877]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 23 00:03:19.071267 systemd-networkd[877]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 23 00:03:19.072122 systemd-networkd[877]: eth0: Link UP
Jan 23 00:03:19.072214 systemd-networkd[877]: eth0: Gained carrier
Jan 23 00:03:19.072223 systemd-networkd[877]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 23 00:03:19.072734 systemd[1]: Reached target network.target - Network.
Jan 23 00:03:19.094770 ignition[796]: Ignition 2.22.0
Jan 23 00:03:19.094782 ignition[796]: Stage: fetch-offline
Jan 23 00:03:19.094815 ignition[796]: no configs at "/usr/lib/ignition/base.d"
Jan 23 00:03:19.094822 ignition[796]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 00:03:19.097785 systemd-networkd[877]: eth0: DHCPv4 address 10.0.0.231/25, gateway 10.0.0.129 acquired from 10.0.0.129
Jan 23 00:03:19.094899 ignition[796]: parsed url from cmdline: ""
Jan 23 00:03:19.098049 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 23 00:03:19.094902 ignition[796]: no config URL provided
Jan 23 00:03:19.100771 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jan 23 00:03:19.094906 ignition[796]: reading system config file "/usr/lib/ignition/user.ign"
Jan 23 00:03:19.094912 ignition[796]: no config at "/usr/lib/ignition/user.ign"
Jan 23 00:03:19.094917 ignition[796]: failed to fetch config: resource requires networking
Jan 23 00:03:19.095127 ignition[796]: Ignition finished successfully
Jan 23 00:03:19.132224 ignition[891]: Ignition 2.22.0
Jan 23 00:03:19.132241 ignition[891]: Stage: fetch
Jan 23 00:03:19.132370 ignition[891]: no configs at "/usr/lib/ignition/base.d"
Jan 23 00:03:19.132379 ignition[891]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 00:03:19.132451 ignition[891]: parsed url from cmdline: ""
Jan 23 00:03:19.132454 ignition[891]: no config URL provided
Jan 23 00:03:19.132459 ignition[891]: reading system config file "/usr/lib/ignition/user.ign"
Jan 23 00:03:19.132464 ignition[891]: no config at "/usr/lib/ignition/user.ign"
Jan 23 00:03:19.132693 ignition[891]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 23 00:03:19.132937 ignition[891]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Jan 23 00:03:19.133056 ignition[891]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 23 00:03:20.132992 ignition[891]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 23 00:03:20.133224 ignition[891]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 23 00:03:20.549997 systemd-networkd[877]: eth0: Gained IPv6LL
Jan 23 00:03:21.133756 ignition[891]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 23 00:03:21.133812 ignition[891]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 23 00:03:22.133952 ignition[891]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 23 00:03:22.133975 ignition[891]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 23 00:03:22.508152 ignition[891]: GET result: OK
Jan 23 00:03:22.508296 ignition[891]: parsing config with SHA512: dea0f4885d434bd8bc85db790a49a88ee691464b4cb22fd743390abe67c052989fe984c90e0f6a1733d99ce23e3b29ee827b6d62c99d8fdf857ab2c7bde007ec
Jan 23 00:03:22.513892 unknown[891]: fetched base config from "system"
Jan 23 00:03:22.513903 unknown[891]: fetched base config from "system"
Jan 23 00:03:22.514260 ignition[891]: fetch: fetch complete
Jan 23 00:03:22.513908 unknown[891]: fetched user config from "openstack"
Jan 23 00:03:22.514264 ignition[891]: fetch: fetch passed
Jan 23 00:03:22.514313 ignition[891]: Ignition finished successfully
Jan 23 00:03:22.518034 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jan 23 00:03:22.520129 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jan 23 00:03:22.550818 ignition[899]: Ignition 2.22.0
Jan 23 00:03:22.550834 ignition[899]: Stage: kargs
Jan 23 00:03:22.550972 ignition[899]: no configs at "/usr/lib/ignition/base.d"
Jan 23 00:03:22.550984 ignition[899]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 00:03:22.554264 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jan 23 00:03:22.551686 ignition[899]: kargs: kargs passed
Jan 23 00:03:22.551743 ignition[899]: Ignition finished successfully
Jan 23 00:03:22.556430 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 23 00:03:22.591764 ignition[907]: Ignition 2.22.0
Jan 23 00:03:22.591777 ignition[907]: Stage: disks
Jan 23 00:03:22.591908 ignition[907]: no configs at "/usr/lib/ignition/base.d"
Jan 23 00:03:22.591916 ignition[907]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 00:03:22.594704 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 23 00:03:22.592645 ignition[907]: disks: disks passed
Jan 23 00:03:22.596916 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 23 00:03:22.592685 ignition[907]: Ignition finished successfully
Jan 23 00:03:22.598524 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 23 00:03:22.599988 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 23 00:03:22.601645 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 23 00:03:22.603091 systemd[1]: Reached target basic.target - Basic System.
Jan 23 00:03:22.605649 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 23 00:03:22.633370 systemd-fsck[917]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Jan 23 00:03:22.637278 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 23 00:03:22.640138 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 23 00:03:22.740758 kernel: EXT4-fs (vda9): mounted filesystem f31390ab-27e9-47d9-a374-053913301d53 r/w with ordered data mode. Quota mode: none.
Jan 23 00:03:22.740758 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 23 00:03:22.741913 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 23 00:03:22.744443 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 23 00:03:22.746839 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 23 00:03:22.747775 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Jan 23 00:03:22.759075 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Jan 23 00:03:22.760175 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 23 00:03:22.760239 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 23 00:03:22.762281 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 23 00:03:22.764275 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 23 00:03:22.778858 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (926)
Jan 23 00:03:22.778913 kernel: BTRFS info (device vda6): first mount of filesystem e9ae44b3-0aec-43ca-ad8b-9cf4e242132f
Jan 23 00:03:22.778924 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 23 00:03:22.784883 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 00:03:22.784925 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 00:03:22.786957 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 23 00:03:22.811754 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 00:03:22.813246 initrd-setup-root[954]: cut: /sysroot/etc/passwd: No such file or directory
Jan 23 00:03:22.818512 initrd-setup-root[961]: cut: /sysroot/etc/group: No such file or directory
Jan 23 00:03:22.822463 initrd-setup-root[968]: cut: /sysroot/etc/shadow: No such file or directory
Jan 23 00:03:22.826403 initrd-setup-root[975]: cut: /sysroot/etc/gshadow: No such file or directory
Jan 23 00:03:22.909142 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 23 00:03:22.911463 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 23 00:03:22.913111 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 23 00:03:22.934430 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 23 00:03:22.936069 kernel: BTRFS info (device vda6): last unmount of filesystem e9ae44b3-0aec-43ca-ad8b-9cf4e242132f
Jan 23 00:03:22.954911 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 23 00:03:22.966657 ignition[1045]: INFO : Ignition 2.22.0
Jan 23 00:03:22.966657 ignition[1045]: INFO : Stage: mount
Jan 23 00:03:22.968358 ignition[1045]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 00:03:22.968358 ignition[1045]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 00:03:22.968358 ignition[1045]: INFO : mount: mount passed
Jan 23 00:03:22.968358 ignition[1045]: INFO : Ignition finished successfully
Jan 23 00:03:22.969202 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 23 00:03:23.848748 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 00:03:25.853749 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 00:03:29.858757 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 00:03:29.863109 coreos-metadata[928]: Jan 23 00:03:29.863 WARN failed to locate config-drive, using the metadata service API instead
Jan 23 00:03:29.879760 coreos-metadata[928]: Jan 23 00:03:29.879 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Jan 23 00:03:31.577792 coreos-metadata[928]: Jan 23 00:03:31.577 INFO Fetch successful
Jan 23 00:03:31.579104 coreos-metadata[928]: Jan 23 00:03:31.578 INFO wrote hostname ci-4459-2-2-n-22c0b85714 to /sysroot/etc/hostname
Jan 23 00:03:31.580876 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Jan 23 00:03:31.582779 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Jan 23 00:03:31.585353 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 23 00:03:31.618739 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 23 00:03:31.637761 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1063)
Jan 23 00:03:31.637808 kernel: BTRFS info (device vda6): first mount of filesystem e9ae44b3-0aec-43ca-ad8b-9cf4e242132f
Jan 23 00:03:31.637820 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 23 00:03:31.643140 kernel: BTRFS info (device vda6): turning on async discard
Jan 23 00:03:31.643178 kernel: BTRFS info (device vda6): enabling free space tree
Jan 23 00:03:31.644675 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 23 00:03:31.677245 ignition[1081]: INFO : Ignition 2.22.0
Jan 23 00:03:31.677245 ignition[1081]: INFO : Stage: files
Jan 23 00:03:31.679306 ignition[1081]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 23 00:03:31.679306 ignition[1081]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 23 00:03:31.679306 ignition[1081]: DEBUG : files: compiled without relabeling support, skipping
Jan 23 00:03:31.679306 ignition[1081]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 23 00:03:31.679306 ignition[1081]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 23 00:03:31.685427 ignition[1081]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 23 00:03:31.685427 ignition[1081]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 23 00:03:31.685427 ignition[1081]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 23 00:03:31.685427 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Jan 23 00:03:31.685427 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Jan 23 00:03:31.682884 unknown[1081]: wrote ssh authorized keys file for user: core
Jan 23 00:03:31.741600 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 23 00:03:31.855088 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Jan 23 00:03:31.855088 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Jan 23 00:03:31.860254 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1
Jan 23 00:03:32.268370 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 23 00:03:33.311929 ignition[1081]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Jan 23 00:03:33.311929 ignition[1081]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jan 23 00:03:33.315816 ignition[1081]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 23 00:03:33.318762 ignition[1081]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 23 00:03:33.318762 ignition[1081]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 23 00:03:33.318762 ignition[1081]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Jan 23 00:03:33.318762 ignition[1081]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jan 23 00:03:33.318762 ignition[1081]: INFO : files: createResultFile: createFiles: op(e):
[started] writing file "/sysroot/etc/.ignition-result.json" Jan 23 00:03:33.318762 ignition[1081]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 23 00:03:33.318762 ignition[1081]: INFO : files: files passed Jan 23 00:03:33.318762 ignition[1081]: INFO : Ignition finished successfully Jan 23 00:03:33.319372 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 23 00:03:33.322079 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 23 00:03:33.323698 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 23 00:03:33.346459 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 23 00:03:33.346578 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 23 00:03:33.351454 initrd-setup-root-after-ignition[1112]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 00:03:33.351454 initrd-setup-root-after-ignition[1112]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 23 00:03:33.354233 initrd-setup-root-after-ignition[1116]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 00:03:33.353322 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 00:03:33.355679 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 23 00:03:33.358214 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 23 00:03:33.409886 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 23 00:03:33.410027 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 23 00:03:33.412379 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 23 00:03:33.413630 systemd[1]: Reached target initrd.target - Initrd Default Target. 
Jan 23 00:03:33.415403 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 23 00:03:33.416272 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 23 00:03:33.450439 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 00:03:33.452826 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 23 00:03:33.471856 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 23 00:03:33.472991 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 00:03:33.474915 systemd[1]: Stopped target timers.target - Timer Units. Jan 23 00:03:33.476485 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 23 00:03:33.476619 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 00:03:33.478835 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 23 00:03:33.480586 systemd[1]: Stopped target basic.target - Basic System. Jan 23 00:03:33.482107 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 23 00:03:33.483623 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 00:03:33.485386 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 23 00:03:33.487127 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 23 00:03:33.488739 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 23 00:03:33.490460 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 00:03:33.492183 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 23 00:03:33.493869 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 23 00:03:33.495397 systemd[1]: Stopped target swap.target - Swaps. 
Jan 23 00:03:33.496737 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 23 00:03:33.496876 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 23 00:03:33.498993 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 23 00:03:33.500799 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 00:03:33.502621 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 23 00:03:33.505773 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 00:03:33.506869 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 23 00:03:33.506995 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 23 00:03:33.509486 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 23 00:03:33.509613 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 00:03:33.511375 systemd[1]: ignition-files.service: Deactivated successfully. Jan 23 00:03:33.511474 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 23 00:03:33.513840 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 23 00:03:33.514600 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 23 00:03:33.514743 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 00:03:33.517343 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 23 00:03:33.518769 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 23 00:03:33.518898 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 00:03:33.520743 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 23 00:03:33.520842 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 00:03:33.527021 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Jan 23 00:03:33.527127 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 23 00:03:33.532977 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 23 00:03:33.537214 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 23 00:03:33.537338 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 23 00:03:33.540308 ignition[1136]: INFO : Ignition 2.22.0 Jan 23 00:03:33.540308 ignition[1136]: INFO : Stage: umount Jan 23 00:03:33.543023 ignition[1136]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 00:03:33.543023 ignition[1136]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 00:03:33.543023 ignition[1136]: INFO : umount: umount passed Jan 23 00:03:33.543023 ignition[1136]: INFO : Ignition finished successfully Jan 23 00:03:33.544031 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 23 00:03:33.544143 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 23 00:03:33.545926 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 23 00:03:33.545979 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 23 00:03:33.547251 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 23 00:03:33.547294 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 23 00:03:33.548632 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 23 00:03:33.548672 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 23 00:03:33.550277 systemd[1]: Stopped target network.target - Network. Jan 23 00:03:33.551612 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 23 00:03:33.551666 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 00:03:33.553391 systemd[1]: Stopped target paths.target - Path Units. Jan 23 00:03:33.554778 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
Jan 23 00:03:33.558759 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 00:03:33.560640 systemd[1]: Stopped target slices.target - Slice Units. Jan 23 00:03:33.562351 systemd[1]: Stopped target sockets.target - Socket Units. Jan 23 00:03:33.563704 systemd[1]: iscsid.socket: Deactivated successfully. Jan 23 00:03:33.563764 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 00:03:33.565341 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 23 00:03:33.565369 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 00:03:33.567327 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 23 00:03:33.567378 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 23 00:03:33.568704 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 23 00:03:33.568765 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 23 00:03:33.570303 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 23 00:03:33.570348 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 23 00:03:33.571845 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 23 00:03:33.573436 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 23 00:03:33.583405 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 23 00:03:33.583497 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 23 00:03:33.587314 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jan 23 00:03:33.587560 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 23 00:03:33.587595 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 00:03:33.590528 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. 
Jan 23 00:03:33.592034 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 23 00:03:33.593794 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 23 00:03:33.597021 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jan 23 00:03:33.597154 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 23 00:03:33.599034 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 23 00:03:33.599077 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 23 00:03:33.601425 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 23 00:03:33.602314 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 23 00:03:33.602373 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 00:03:33.604289 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 23 00:03:33.604336 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 23 00:03:33.606638 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 23 00:03:33.606678 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 23 00:03:33.608700 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 00:03:33.613092 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jan 23 00:03:33.622893 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 23 00:03:33.623027 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 23 00:03:33.625085 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 23 00:03:33.625213 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 00:03:33.627073 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Jan 23 00:03:33.627133 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 23 00:03:33.628213 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 23 00:03:33.628243 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 00:03:33.630050 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 23 00:03:33.630094 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 23 00:03:33.632684 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 23 00:03:33.632744 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 23 00:03:33.635324 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 23 00:03:33.635377 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 00:03:33.638585 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 23 00:03:33.639603 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 23 00:03:33.639668 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 00:03:33.642378 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 23 00:03:33.642419 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 00:03:33.645487 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 00:03:33.645531 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 00:03:33.661816 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 23 00:03:33.661929 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 23 00:03:33.664019 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 23 00:03:33.666389 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 23 00:03:33.694774 systemd[1]: Switching root. 
Jan 23 00:03:33.740703 systemd-journald[310]: Journal stopped Jan 23 00:03:34.556543 systemd-journald[310]: Received SIGTERM from PID 1 (systemd). Jan 23 00:03:34.556609 kernel: SELinux: policy capability network_peer_controls=1 Jan 23 00:03:34.556626 kernel: SELinux: policy capability open_perms=1 Jan 23 00:03:34.556641 kernel: SELinux: policy capability extended_socket_class=1 Jan 23 00:03:34.556651 kernel: SELinux: policy capability always_check_network=0 Jan 23 00:03:34.556660 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 23 00:03:34.556672 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 23 00:03:34.556685 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 23 00:03:34.556697 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 23 00:03:34.556706 kernel: SELinux: policy capability userspace_initial_context=0 Jan 23 00:03:34.556716 kernel: audit: type=1403 audit(1769126613.859:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 23 00:03:34.556741 systemd[1]: Successfully loaded SELinux policy in 66.459ms. Jan 23 00:03:34.556759 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.758ms. Jan 23 00:03:34.556773 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 00:03:34.556784 systemd[1]: Detected virtualization kvm. Jan 23 00:03:34.556795 systemd[1]: Detected architecture arm64. Jan 23 00:03:34.556805 systemd[1]: Detected first boot. Jan 23 00:03:34.556818 systemd[1]: Hostname set to . Jan 23 00:03:34.556828 systemd[1]: Initializing machine ID from VM UUID. Jan 23 00:03:34.556841 zram_generator::config[1180]: No configuration found. 
Jan 23 00:03:34.556852 kernel: NET: Registered PF_VSOCK protocol family Jan 23 00:03:34.556862 systemd[1]: Populated /etc with preset unit settings. Jan 23 00:03:34.556872 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jan 23 00:03:34.556886 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 23 00:03:34.556899 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 23 00:03:34.556909 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 23 00:03:34.556919 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 23 00:03:34.556987 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 23 00:03:34.556996 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 23 00:03:34.557006 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 23 00:03:34.557016 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 23 00:03:34.557026 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 23 00:03:34.557038 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 23 00:03:34.557047 systemd[1]: Created slice user.slice - User and Session Slice. Jan 23 00:03:34.557057 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 00:03:34.557068 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 00:03:34.557078 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 23 00:03:34.557088 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 23 00:03:34.557098 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. 
Jan 23 00:03:34.557109 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 00:03:34.557121 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 23 00:03:34.557131 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 00:03:34.557141 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 00:03:34.557151 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 23 00:03:34.557161 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 23 00:03:34.557171 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 23 00:03:34.557181 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 23 00:03:34.557193 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 00:03:34.557206 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 00:03:34.557216 systemd[1]: Reached target slices.target - Slice Units. Jan 23 00:03:34.557228 systemd[1]: Reached target swap.target - Swaps. Jan 23 00:03:34.557238 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 23 00:03:34.557248 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 23 00:03:34.557257 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 23 00:03:34.557267 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 00:03:34.557277 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 00:03:34.557289 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 00:03:34.557299 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 23 00:03:34.557309 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... 
Jan 23 00:03:34.557318 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 23 00:03:34.557328 systemd[1]: Mounting media.mount - External Media Directory... Jan 23 00:03:34.557338 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 23 00:03:34.557347 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 23 00:03:34.557357 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 23 00:03:34.557368 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 23 00:03:34.557380 systemd[1]: Reached target machines.target - Containers. Jan 23 00:03:34.557390 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 23 00:03:34.557400 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 00:03:34.557410 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 00:03:34.557420 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 23 00:03:34.557430 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 00:03:34.557440 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 00:03:34.557450 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 00:03:34.557476 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 23 00:03:34.557488 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 00:03:34.557498 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 23 00:03:34.557511 systemd[1]: systemd-fsck-root.service: Deactivated successfully. 
Jan 23 00:03:34.557521 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 23 00:03:34.557535 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 23 00:03:34.557545 systemd[1]: Stopped systemd-fsck-usr.service. Jan 23 00:03:34.557555 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 00:03:34.557565 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 00:03:34.557575 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 00:03:34.557585 kernel: loop: module loaded Jan 23 00:03:34.557594 kernel: fuse: init (API version 7.41) Jan 23 00:03:34.557603 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 00:03:34.557614 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 23 00:03:34.557625 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 23 00:03:34.557635 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 00:03:34.557645 systemd[1]: verity-setup.service: Deactivated successfully. Jan 23 00:03:34.557655 kernel: ACPI: bus type drm_connector registered Jan 23 00:03:34.557664 systemd[1]: Stopped verity-setup.service. Jan 23 00:03:34.557674 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 23 00:03:34.557684 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 23 00:03:34.557728 systemd-journald[1251]: Collecting audit messages is disabled. Jan 23 00:03:34.557757 systemd[1]: Mounted media.mount - External Media Directory. Jan 23 00:03:34.557768 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jan 23 00:03:34.557778 systemd-journald[1251]: Journal started Jan 23 00:03:34.557801 systemd-journald[1251]: Runtime Journal (/run/log/journal/97f71726ea904e05a1fb285fb0d0118e) is 8M, max 319.5M, 311.5M free. Jan 23 00:03:34.339417 systemd[1]: Queued start job for default target multi-user.target. Jan 23 00:03:34.365059 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 23 00:03:34.365515 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 23 00:03:34.560238 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 00:03:34.560928 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 23 00:03:34.562095 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 23 00:03:34.564784 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 23 00:03:34.566352 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 00:03:34.568114 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 23 00:03:34.568281 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 23 00:03:34.569583 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 00:03:34.569765 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 00:03:34.570999 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 00:03:34.571168 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 00:03:34.572471 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 00:03:34.572625 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 00:03:34.574055 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 23 00:03:34.574214 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 23 00:03:34.575414 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Jan 23 00:03:34.575569 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 00:03:34.576911 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 00:03:34.580083 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 00:03:34.581482 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 23 00:03:34.582996 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 23 00:03:34.595713 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 00:03:34.598032 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 23 00:03:34.599916 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 23 00:03:34.600928 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 23 00:03:34.600964 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 00:03:34.602672 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 23 00:03:34.613829 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 23 00:03:34.614897 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 00:03:34.617174 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 23 00:03:34.619115 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 23 00:03:34.620229 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 00:03:34.623883 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Jan 23 00:03:34.624883 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 00:03:34.626244 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 00:03:34.628578 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 23 00:03:34.633507 systemd-journald[1251]: Time spent on flushing to /var/log/journal/97f71726ea904e05a1fb285fb0d0118e is 30.465ms for 1686 entries. Jan 23 00:03:34.633507 systemd-journald[1251]: System Journal (/var/log/journal/97f71726ea904e05a1fb285fb0d0118e) is 8M, max 584.8M, 576.8M free. Jan 23 00:03:34.677824 systemd-journald[1251]: Received client request to flush runtime journal. Jan 23 00:03:34.677858 kernel: loop0: detected capacity change from 0 to 119840 Jan 23 00:03:34.633942 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 23 00:03:34.637476 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 00:03:34.639799 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 23 00:03:34.641373 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 23 00:03:34.643086 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 23 00:03:34.648907 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 23 00:03:34.651386 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 23 00:03:34.671094 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 00:03:34.677116 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 23 00:03:34.680290 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 23 00:03:34.685523 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jan 23 00:03:34.696765 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 23 00:03:34.697399 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 23 00:03:34.712967 systemd-tmpfiles[1315]: ACLs are not supported, ignoring. Jan 23 00:03:34.712987 systemd-tmpfiles[1315]: ACLs are not supported, ignoring. Jan 23 00:03:34.716295 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 00:03:34.738755 kernel: loop1: detected capacity change from 0 to 100632 Jan 23 00:03:34.794747 kernel: loop2: detected capacity change from 0 to 200800 Jan 23 00:03:34.835771 kernel: loop3: detected capacity change from 0 to 1632 Jan 23 00:03:34.862770 kernel: loop4: detected capacity change from 0 to 119840 Jan 23 00:03:34.872762 kernel: loop5: detected capacity change from 0 to 100632 Jan 23 00:03:34.884762 kernel: loop6: detected capacity change from 0 to 200800 Jan 23 00:03:34.897787 kernel: loop7: detected capacity change from 0 to 1632 Jan 23 00:03:34.901287 (sd-merge)[1326]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Jan 23 00:03:34.901791 (sd-merge)[1326]: Merged extensions into '/usr'. Jan 23 00:03:34.905523 systemd[1]: Reload requested from client PID 1299 ('systemd-sysext') (unit systemd-sysext.service)... Jan 23 00:03:34.905544 systemd[1]: Reloading... Jan 23 00:03:34.951789 zram_generator::config[1352]: No configuration found. Jan 23 00:03:35.076214 ldconfig[1294]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 23 00:03:35.098000 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 23 00:03:35.098192 systemd[1]: Reloading finished in 192 ms. Jan 23 00:03:35.127868 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 23 00:03:35.129138 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
Jan 23 00:03:35.130480 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 23 00:03:35.147063 systemd[1]: Starting ensure-sysext.service... Jan 23 00:03:35.148626 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 00:03:35.150876 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 00:03:35.160463 systemd[1]: Reload requested from client PID 1390 ('systemctl') (unit ensure-sysext.service)... Jan 23 00:03:35.160483 systemd[1]: Reloading... Jan 23 00:03:35.163664 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 23 00:03:35.164081 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 23 00:03:35.164348 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 23 00:03:35.164547 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 23 00:03:35.165173 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 23 00:03:35.165378 systemd-tmpfiles[1391]: ACLs are not supported, ignoring. Jan 23 00:03:35.165424 systemd-tmpfiles[1391]: ACLs are not supported, ignoring. Jan 23 00:03:35.168304 systemd-tmpfiles[1391]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 00:03:35.168322 systemd-tmpfiles[1391]: Skipping /boot Jan 23 00:03:35.174107 systemd-tmpfiles[1391]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 00:03:35.174119 systemd-tmpfiles[1391]: Skipping /boot Jan 23 00:03:35.174669 systemd-udevd[1392]: Using default interface naming scheme 'v255'. Jan 23 00:03:35.208767 zram_generator::config[1419]: No configuration found. 
Jan 23 00:03:35.362759 kernel: mousedev: PS/2 mouse device common for all mice Jan 23 00:03:35.390278 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 23 00:03:35.390835 systemd[1]: Reloading finished in 230 ms. Jan 23 00:03:35.403492 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 00:03:35.413282 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 00:03:35.433889 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 23 00:03:35.433980 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 23 00:03:35.433999 kernel: [drm] features: -context_init Jan 23 00:03:35.437198 kernel: [drm] number of scanouts: 1 Jan 23 00:03:35.437266 kernel: [drm] number of cap sets: 0 Jan 23 00:03:35.437879 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 00:03:35.440023 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 23 00:03:35.442954 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 00:03:35.448902 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 23 00:03:35.449955 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 00:03:35.452752 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 23 00:03:35.481659 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 23 00:03:35.486864 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 23 00:03:35.491083 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 00:03:35.495564 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 00:03:35.504937 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
Jan 23 00:03:35.517120 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 00:03:35.519871 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 00:03:35.523478 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 00:03:35.528335 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 00:03:35.529712 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 00:03:35.529899 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 00:03:35.532843 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 00:03:35.533045 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 00:03:35.545232 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 23 00:03:35.547545 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 23 00:03:35.549245 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 00:03:35.549609 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 00:03:35.552515 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 00:03:35.552703 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 00:03:35.560959 augenrules[1548]: No rules Jan 23 00:03:35.562200 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 00:03:35.563790 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 00:03:35.579221 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
Jan 23 00:03:35.582755 systemd[1]: Finished ensure-sysext.service. Jan 23 00:03:35.590691 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 00:03:35.591676 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 00:03:35.592636 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 00:03:35.602243 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 00:03:35.604136 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 00:03:35.606253 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 00:03:35.608682 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 23 00:03:35.609946 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 00:03:35.609996 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 00:03:35.610045 systemd[1]: Reached target time-set.target - System Time Set. Jan 23 00:03:35.612332 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 23 00:03:35.615199 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 23 00:03:35.617654 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 00:03:35.620764 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 23 00:03:35.622191 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 00:03:35.622352 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 00:03:35.623864 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Jan 23 00:03:35.624052 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 00:03:35.626775 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 23 00:03:35.626835 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 23 00:03:35.625979 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 00:03:35.626129 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 00:03:35.626932 augenrules[1559]: /sbin/augenrules: No change Jan 23 00:03:35.628352 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 00:03:35.628548 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 00:03:35.629962 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 23 00:03:35.632746 kernel: PTP clock support registered Jan 23 00:03:35.634989 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 23 00:03:35.636993 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 23 00:03:35.638806 augenrules[1588]: No rules Jan 23 00:03:35.640847 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 00:03:35.641810 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 00:03:35.646637 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 00:03:35.646896 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 00:03:35.646941 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 23 00:03:35.667887 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Jan 23 00:03:35.704522 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 00:03:35.719865 systemd-resolved[1521]: Positive Trust Anchors: Jan 23 00:03:35.719884 systemd-resolved[1521]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 00:03:35.719915 systemd-resolved[1521]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 00:03:35.724954 systemd-resolved[1521]: Using system hostname 'ci-4459-2-2-n-22c0b85714'. Jan 23 00:03:35.726170 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 00:03:35.727427 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 00:03:35.728672 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 00:03:35.729707 systemd-networkd[1519]: lo: Link UP Jan 23 00:03:35.729738 systemd-networkd[1519]: lo: Gained carrier Jan 23 00:03:35.729862 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 23 00:03:35.730847 systemd-networkd[1519]: Enumeration completed Jan 23 00:03:35.730962 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 23 00:03:35.731339 systemd-networkd[1519]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Jan 23 00:03:35.731349 systemd-networkd[1519]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 00:03:35.732034 systemd-networkd[1519]: eth0: Link UP Jan 23 00:03:35.732147 systemd-networkd[1519]: eth0: Gained carrier Jan 23 00:03:35.732167 systemd-networkd[1519]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 23 00:03:35.732298 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 23 00:03:35.733448 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 23 00:03:35.734645 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 23 00:03:35.735851 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 23 00:03:35.735887 systemd[1]: Reached target paths.target - Path Units. Jan 23 00:03:35.736643 systemd[1]: Reached target timers.target - Timer Units. Jan 23 00:03:35.738812 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 23 00:03:35.741023 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 23 00:03:35.743829 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 23 00:03:35.745112 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 23 00:03:35.746257 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 23 00:03:35.752942 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 23 00:03:35.754183 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 23 00:03:35.755788 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Jan 23 00:03:35.755834 systemd-networkd[1519]: eth0: DHCPv4 address 10.0.0.231/25, gateway 10.0.0.129 acquired from 10.0.0.129 Jan 23 00:03:35.756888 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 23 00:03:35.757916 systemd[1]: Reached target network.target - Network. Jan 23 00:03:35.758775 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 00:03:35.759607 systemd[1]: Reached target basic.target - Basic System. Jan 23 00:03:35.760522 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 23 00:03:35.760554 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 23 00:03:35.763035 systemd[1]: Starting chronyd.service - NTP client/server... Jan 23 00:03:35.764787 systemd[1]: Starting containerd.service - containerd container runtime... Jan 23 00:03:35.766765 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 23 00:03:35.775886 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 23 00:03:35.776749 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:35.777695 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 23 00:03:35.780800 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 23 00:03:35.783020 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 23 00:03:35.783965 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 23 00:03:35.784913 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 23 00:03:35.786878 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 23 00:03:35.802916 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Jan 23 00:03:35.804992 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 23 00:03:35.807218 jq[1619]: false Jan 23 00:03:35.815542 extend-filesystems[1620]: Found /dev/vda6 Jan 23 00:03:35.819830 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 23 00:03:35.823003 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 23 00:03:35.824921 chronyd[1612]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 23 00:03:35.825103 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 23 00:03:35.826260 chronyd[1612]: Loaded seccomp filter (level 2) Jan 23 00:03:35.827192 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 23 00:03:35.827627 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 23 00:03:35.828657 extend-filesystems[1620]: Found /dev/vda9 Jan 23 00:03:35.828432 systemd[1]: Starting update-engine.service - Update Engine... Jan 23 00:03:35.832300 extend-filesystems[1620]: Checking size of /dev/vda9 Jan 23 00:03:35.835342 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 23 00:03:35.837238 systemd[1]: Started chronyd.service - NTP client/server. Jan 23 00:03:35.840637 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 23 00:03:35.842424 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 23 00:03:35.842609 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 23 00:03:35.842863 systemd[1]: motdgen.service: Deactivated successfully. 
Jan 23 00:03:35.843026 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 23 00:03:35.847746 extend-filesystems[1620]: Resized partition /dev/vda9 Jan 23 00:03:35.848570 extend-filesystems[1650]: resize2fs 1.47.3 (8-Jul-2025) Jan 23 00:03:35.850963 jq[1644]: true Jan 23 00:03:35.850302 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 23 00:03:35.850493 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 23 00:03:35.862825 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Jan 23 00:03:35.875166 update_engine[1641]: I20260123 00:03:35.874632 1641 main.cc:92] Flatcar Update Engine starting Jan 23 00:03:35.875848 jq[1651]: true Jan 23 00:03:35.882525 (ntainerd)[1664]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 23 00:03:35.882994 tar[1649]: linux-arm64/LICENSE Jan 23 00:03:35.882994 tar[1649]: linux-arm64/helm Jan 23 00:03:35.889306 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 23 00:03:35.908652 dbus-daemon[1615]: [system] SELinux support is enabled Jan 23 00:03:35.909818 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 23 00:03:35.918117 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 23 00:03:35.918152 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 23 00:03:35.919364 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Jan 23 00:03:35.919391 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 23 00:03:35.920899 systemd[1]: Started update-engine.service - Update Engine. Jan 23 00:03:35.922067 update_engine[1641]: I20260123 00:03:35.922010 1641 update_check_scheduler.cc:74] Next update check in 6m31s Jan 23 00:03:35.927896 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 23 00:03:35.945668 systemd-logind[1635]: New seat seat0. Jan 23 00:03:35.982022 locksmithd[1680]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 23 00:03:35.988475 systemd-logind[1635]: Watching system buttons on /dev/input/event0 (Power Button) Jan 23 00:03:35.988492 systemd-logind[1635]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 23 00:03:35.988749 systemd[1]: Started systemd-logind.service - User Login Management. Jan 23 00:03:36.025699 bash[1683]: Updated "/home/core/.ssh/authorized_keys" Jan 23 00:03:36.028808 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 23 00:03:36.033044 systemd[1]: Starting sshkeys.service... Jan 23 00:03:36.065777 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 23 00:03:36.068312 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 23 00:03:36.094742 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:36.097544 containerd[1664]: time="2026-01-23T00:03:36Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 23 00:03:36.098494 containerd[1664]: time="2026-01-23T00:03:36.098454720Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116332200Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.76µs" Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116384440Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116405920Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116682520Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116706720Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116753160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116813480Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.116825440Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.117120880Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.117137240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.117148760Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 00:03:36.117735 containerd[1664]: time="2026-01-23T00:03:36.117156480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117220360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117387960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117413600Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117422760Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117481000Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117758520Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 23 00:03:36.118034 containerd[1664]: time="2026-01-23T00:03:36.117824840Z" level=info msg="metadata content store policy set" policy=shared Jan 23 00:03:36.146913 containerd[1664]: time="2026-01-23T00:03:36.146696600Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 23 00:03:36.147006 containerd[1664]: time="2026-01-23T00:03:36.146966560Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 23 00:03:36.147025 containerd[1664]: time="2026-01-23T00:03:36.147006800Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 23 00:03:36.147062 containerd[1664]: time="2026-01-23T00:03:36.147022280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 23 00:03:36.147062 containerd[1664]: time="2026-01-23T00:03:36.147036960Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 23 00:03:36.147062 containerd[1664]: time="2026-01-23T00:03:36.147047920Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 23 00:03:36.147062 containerd[1664]: time="2026-01-23T00:03:36.147060000Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 23 00:03:36.147122 containerd[1664]: time="2026-01-23T00:03:36.147072760Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 23 00:03:36.147122 containerd[1664]: time="2026-01-23T00:03:36.147085000Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 23 00:03:36.147122 containerd[1664]: time="2026-01-23T00:03:36.147095640Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 23 00:03:36.147122 containerd[1664]: time="2026-01-23T00:03:36.147106000Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 23 00:03:36.147122 containerd[1664]: time="2026-01-23T00:03:36.147119000Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 23 00:03:36.147293 containerd[1664]: time="2026-01-23T00:03:36.147264800Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 23 00:03:36.147325 containerd[1664]: time="2026-01-23T00:03:36.147296600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 23 00:03:36.147325 containerd[1664]: time="2026-01-23T00:03:36.147312840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 23 00:03:36.147357 containerd[1664]: time="2026-01-23T00:03:36.147323560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 23 00:03:36.147357 containerd[1664]: time="2026-01-23T00:03:36.147335120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 23 00:03:36.147357 containerd[1664]: time="2026-01-23T00:03:36.147346160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 23 00:03:36.147404 containerd[1664]: time="2026-01-23T00:03:36.147358080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 23 00:03:36.147404 containerd[1664]: time="2026-01-23T00:03:36.147369360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 23 
00:03:36.147404 containerd[1664]: time="2026-01-23T00:03:36.147379880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 23 00:03:36.147404 containerd[1664]: time="2026-01-23T00:03:36.147391120Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 23 00:03:36.147404 containerd[1664]: time="2026-01-23T00:03:36.147401680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 23 00:03:36.147629 containerd[1664]: time="2026-01-23T00:03:36.147609280Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 23 00:03:36.147629 containerd[1664]: time="2026-01-23T00:03:36.147628720Z" level=info msg="Start snapshots syncer" Jan 23 00:03:36.147715 containerd[1664]: time="2026-01-23T00:03:36.147654760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 23 00:03:36.148019 containerd[1664]: time="2026-01-23T00:03:36.147958160Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 23 00:03:36.148126 containerd[1664]: time="2026-01-23T00:03:36.148066200Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 23 00:03:36.148147 containerd[1664]: time="2026-01-23T00:03:36.148132040Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 23 00:03:36.148259 containerd[1664]: time="2026-01-23T00:03:36.148234920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 23 00:03:36.148322 containerd[1664]: time="2026-01-23T00:03:36.148263200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 23 00:03:36.148351 containerd[1664]: time="2026-01-23T00:03:36.148324200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 23 00:03:36.148351 containerd[1664]: time="2026-01-23T00:03:36.148336760Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 23 00:03:36.148351 containerd[1664]: time="2026-01-23T00:03:36.148348600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 23 00:03:36.148398 containerd[1664]: time="2026-01-23T00:03:36.148359640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 23 00:03:36.148398 containerd[1664]: time="2026-01-23T00:03:36.148370400Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 23 00:03:36.148437 containerd[1664]: time="2026-01-23T00:03:36.148396080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 23 00:03:36.148437 containerd[1664]: time="2026-01-23T00:03:36.148417600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 23 00:03:36.148437 containerd[1664]: time="2026-01-23T00:03:36.148428520Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 23 00:03:36.148483 containerd[1664]: time="2026-01-23T00:03:36.148457880Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 00:03:36.148521 containerd[1664]: time="2026-01-23T00:03:36.148473840Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 00:03:36.148521 containerd[1664]: time="2026-01-23T00:03:36.148518280Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 00:03:36.148572 containerd[1664]: time="2026-01-23T00:03:36.148530760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 00:03:36.148572 containerd[1664]: time="2026-01-23T00:03:36.148538760Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 23 00:03:36.148572 containerd[1664]: time="2026-01-23T00:03:36.148548800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 23 00:03:36.148572 containerd[1664]: time="2026-01-23T00:03:36.148561040Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 23 00:03:36.148695 containerd[1664]: time="2026-01-23T00:03:36.148656960Z" level=info msg="runtime interface created" Jan 23 00:03:36.148695 containerd[1664]: time="2026-01-23T00:03:36.148691680Z" level=info msg="created NRI interface" Jan 23 00:03:36.148752 containerd[1664]: time="2026-01-23T00:03:36.148709360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 23 00:03:36.148752 containerd[1664]: time="2026-01-23T00:03:36.148735040Z" level=info msg="Connect containerd service" Jan 23 00:03:36.148793 containerd[1664]: time="2026-01-23T00:03:36.148761640Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 23 00:03:36.149700 
containerd[1664]: time="2026-01-23T00:03:36.149666840Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 00:03:36.200745 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Jan 23 00:03:36.240762 extend-filesystems[1650]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 23 00:03:36.240762 extend-filesystems[1650]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 23 00:03:36.240762 extend-filesystems[1650]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Jan 23 00:03:36.246200 extend-filesystems[1620]: Resized filesystem in /dev/vda9 Jan 23 00:03:36.241868 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.240819520Z" level=info msg="Start subscribing containerd event" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.240870680Z" level=info msg="Start recovering state" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.240971920Z" level=info msg="Start event monitor" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.242830120Z" level=info msg="Start cni network conf syncer for default" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.242854160Z" level=info msg="Start streaming server" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.242864960Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.241213880Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.243073440Z" level=info msg="runtime interface starting up..." 
Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.243083200Z" level=info msg="starting plugins..." Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.243102640Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.243324720Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 23 00:03:36.250630 containerd[1664]: time="2026-01-23T00:03:36.243467360Z" level=info msg="containerd successfully booted in 0.146330s" Jan 23 00:03:36.242077 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 23 00:03:36.246044 systemd[1]: Started containerd.service - containerd container runtime. Jan 23 00:03:36.335330 tar[1649]: linux-arm64/README.md Jan 23 00:03:36.352401 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 23 00:03:36.785761 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:37.118744 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:37.190030 systemd-networkd[1519]: eth0: Gained IPv6LL Jan 23 00:03:37.192447 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 23 00:03:37.194845 systemd[1]: Reached target network-online.target - Network is Online. Jan 23 00:03:37.198031 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 00:03:37.200392 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 23 00:03:37.236579 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 23 00:03:37.474377 sshd_keygen[1640]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 23 00:03:37.492673 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 23 00:03:37.496004 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 23 00:03:37.520490 systemd[1]: issuegen.service: Deactivated successfully. 
Jan 23 00:03:37.520740 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 23 00:03:37.525944 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 23 00:03:37.550790 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 23 00:03:37.553619 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 23 00:03:37.556123 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 23 00:03:37.557404 systemd[1]: Reached target getty.target - Login Prompts. Jan 23 00:03:38.060412 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 23 00:03:38.062840 systemd[1]: Started sshd@0-10.0.0.231:22-20.161.92.111:52082.service - OpenSSH per-connection server daemon (20.161.92.111:52082). Jan 23 00:03:38.066346 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 00:03:38.070267 (kubelet)[1757]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 00:03:38.511412 kubelet[1757]: E0123 00:03:38.511302 1757 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 00:03:38.513997 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 00:03:38.514124 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 00:03:38.514418 systemd[1]: kubelet.service: Consumed 709ms CPU time, 248.8M memory peak. 
Jan 23 00:03:38.683020 sshd[1756]: Accepted publickey for core from 20.161.92.111 port 52082 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:03:38.684655 sshd-session[1756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:03:38.691536 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 23 00:03:38.693629 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 23 00:03:38.700569 systemd-logind[1635]: New session 1 of user core. Jan 23 00:03:38.719775 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 23 00:03:38.723489 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 23 00:03:38.738340 (systemd)[1770]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 23 00:03:38.740653 systemd-logind[1635]: New session c1 of user core. Jan 23 00:03:38.795776 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:38.866052 systemd[1770]: Queued start job for default target default.target. Jan 23 00:03:38.885149 systemd[1770]: Created slice app.slice - User Application Slice. Jan 23 00:03:38.885183 systemd[1770]: Reached target paths.target - Paths. Jan 23 00:03:38.885221 systemd[1770]: Reached target timers.target - Timers. Jan 23 00:03:38.886434 systemd[1770]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 23 00:03:38.896095 systemd[1770]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 23 00:03:38.896164 systemd[1770]: Reached target sockets.target - Sockets. Jan 23 00:03:38.896201 systemd[1770]: Reached target basic.target - Basic System. Jan 23 00:03:38.896225 systemd[1770]: Reached target default.target - Main User Target. Jan 23 00:03:38.896249 systemd[1770]: Startup finished in 149ms. Jan 23 00:03:38.896801 systemd[1]: Started user@500.service - User Manager for UID 500. 
Jan 23 00:03:38.899459 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 23 00:03:39.128787 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:39.338465 systemd[1]: Started sshd@1-10.0.0.231:22-20.161.92.111:52096.service - OpenSSH per-connection server daemon (20.161.92.111:52096). Jan 23 00:03:39.946762 sshd[1783]: Accepted publickey for core from 20.161.92.111 port 52096 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:03:39.947825 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:03:39.951578 systemd-logind[1635]: New session 2 of user core. Jan 23 00:03:39.961178 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 23 00:03:40.375431 sshd[1786]: Connection closed by 20.161.92.111 port 52096 Jan 23 00:03:40.375653 sshd-session[1783]: pam_unix(sshd:session): session closed for user core Jan 23 00:03:40.379366 systemd[1]: sshd@1-10.0.0.231:22-20.161.92.111:52096.service: Deactivated successfully. Jan 23 00:03:40.380994 systemd[1]: session-2.scope: Deactivated successfully. Jan 23 00:03:40.383826 systemd-logind[1635]: Session 2 logged out. Waiting for processes to exit. Jan 23 00:03:40.385743 systemd-logind[1635]: Removed session 2. Jan 23 00:03:40.489616 systemd[1]: Started sshd@2-10.0.0.231:22-20.161.92.111:52106.service - OpenSSH per-connection server daemon (20.161.92.111:52106). Jan 23 00:03:41.106786 sshd[1792]: Accepted publickey for core from 20.161.92.111 port 52106 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:03:41.108179 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:03:41.113011 systemd-logind[1635]: New session 3 of user core. Jan 23 00:03:41.119974 systemd[1]: Started session-3.scope - Session 3 of User core. 
Jan 23 00:03:41.535330 sshd[1795]: Connection closed by 20.161.92.111 port 52106 Jan 23 00:03:41.535573 sshd-session[1792]: pam_unix(sshd:session): session closed for user core Jan 23 00:03:41.539578 systemd[1]: sshd@2-10.0.0.231:22-20.161.92.111:52106.service: Deactivated successfully. Jan 23 00:03:41.542264 systemd[1]: session-3.scope: Deactivated successfully. Jan 23 00:03:41.542984 systemd-logind[1635]: Session 3 logged out. Waiting for processes to exit. Jan 23 00:03:41.544373 systemd-logind[1635]: Removed session 3. Jan 23 00:03:42.803760 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:42.809346 coreos-metadata[1614]: Jan 23 00:03:42.809 WARN failed to locate config-drive, using the metadata service API instead Jan 23 00:03:42.824744 coreos-metadata[1614]: Jan 23 00:03:42.824 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 23 00:03:43.145767 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 00:03:43.151427 coreos-metadata[1696]: Jan 23 00:03:43.151 WARN failed to locate config-drive, using the metadata service API instead Jan 23 00:03:43.163826 coreos-metadata[1696]: Jan 23 00:03:43.163 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 23 00:03:44.084412 coreos-metadata[1614]: Jan 23 00:03:44.084 INFO Fetch successful Jan 23 00:03:44.084768 coreos-metadata[1614]: Jan 23 00:03:44.084 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 23 00:03:44.763841 coreos-metadata[1696]: Jan 23 00:03:44.763 INFO Fetch successful Jan 23 00:03:44.763841 coreos-metadata[1696]: Jan 23 00:03:44.763 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 23 00:03:45.369899 coreos-metadata[1614]: Jan 23 00:03:45.369 INFO Fetch successful Jan 23 00:03:45.369899 coreos-metadata[1614]: Jan 23 00:03:45.369 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 23 00:03:48.322238 
coreos-metadata[1696]: Jan 23 00:03:48.322 INFO Fetch successful Jan 23 00:03:48.323880 unknown[1696]: wrote ssh authorized keys file for user: core Jan 23 00:03:48.350026 update-ssh-keys[1809]: Updated "/home/core/.ssh/authorized_keys" Jan 23 00:03:48.351821 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 23 00:03:48.354033 systemd[1]: Finished sshkeys.service. Jan 23 00:03:48.764896 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 23 00:03:48.766533 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 00:03:48.911373 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 00:03:48.915277 (kubelet)[1820]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 00:03:48.949695 kubelet[1820]: E0123 00:03:48.949628 1820 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 00:03:48.952690 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 00:03:48.952844 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 00:03:48.953844 systemd[1]: kubelet.service: Consumed 144ms CPU time, 107.2M memory peak. Jan 23 00:03:49.577133 coreos-metadata[1614]: Jan 23 00:03:49.577 INFO Fetch successful Jan 23 00:03:49.577133 coreos-metadata[1614]: Jan 23 00:03:49.577 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 23 00:03:51.656975 systemd[1]: Started sshd@3-10.0.0.231:22-20.161.92.111:58482.service - OpenSSH per-connection server daemon (20.161.92.111:58482). 
Jan 23 00:03:52.175079 coreos-metadata[1614]: Jan 23 00:03:52.175 INFO Fetch successful Jan 23 00:03:52.175079 coreos-metadata[1614]: Jan 23 00:03:52.175 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 23 00:03:52.282272 sshd[1830]: Accepted publickey for core from 20.161.92.111 port 58482 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:03:52.283865 sshd-session[1830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:03:52.287553 systemd-logind[1635]: New session 4 of user core. Jan 23 00:03:52.300087 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 23 00:03:52.724803 sshd[1833]: Connection closed by 20.161.92.111 port 58482 Jan 23 00:03:52.724788 sshd-session[1830]: pam_unix(sshd:session): session closed for user core Jan 23 00:03:52.728218 systemd[1]: sshd@3-10.0.0.231:22-20.161.92.111:58482.service: Deactivated successfully. Jan 23 00:03:52.729799 systemd[1]: session-4.scope: Deactivated successfully. Jan 23 00:03:52.730961 systemd-logind[1635]: Session 4 logged out. Waiting for processes to exit. Jan 23 00:03:52.732110 systemd-logind[1635]: Removed session 4. Jan 23 00:03:52.834113 systemd[1]: Started sshd@4-10.0.0.231:22-20.161.92.111:40540.service - OpenSSH per-connection server daemon (20.161.92.111:40540). Jan 23 00:03:53.477002 sshd[1839]: Accepted publickey for core from 20.161.92.111 port 40540 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:03:53.477502 sshd-session[1839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:03:53.481240 systemd-logind[1635]: New session 5 of user core. Jan 23 00:03:53.497513 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jan 23 00:03:53.916930 sshd[1842]: Connection closed by 20.161.92.111 port 40540 Jan 23 00:03:53.917459 sshd-session[1839]: pam_unix(sshd:session): session closed for user core Jan 23 00:03:53.920763 systemd[1]: sshd@4-10.0.0.231:22-20.161.92.111:40540.service: Deactivated successfully. Jan 23 00:03:53.924128 systemd[1]: session-5.scope: Deactivated successfully. Jan 23 00:03:53.924690 systemd-logind[1635]: Session 5 logged out. Waiting for processes to exit. Jan 23 00:03:53.925692 systemd-logind[1635]: Removed session 5. Jan 23 00:03:54.115717 coreos-metadata[1614]: Jan 23 00:03:54.115 INFO Fetch successful Jan 23 00:03:54.115717 coreos-metadata[1614]: Jan 23 00:03:54.115 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 23 00:03:56.600376 coreos-metadata[1614]: Jan 23 00:03:56.600 INFO Fetch successful Jan 23 00:03:56.645941 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 23 00:03:56.647628 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 23 00:03:56.647904 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 23 00:03:56.648131 systemd[1]: Startup finished in 2.944s (kernel) + 17.241s (initrd) + 22.855s (userspace) = 43.041s. Jan 23 00:03:59.203748 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 23 00:03:59.205125 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 00:03:59.340836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 23 00:03:59.344555 (kubelet)[1860]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 00:03:59.376605 kubelet[1860]: E0123 00:03:59.376547 1860 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 00:03:59.379009 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 00:03:59.379142 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 00:03:59.380806 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.1M memory peak. Jan 23 00:03:59.610049 chronyd[1612]: Selected source PHC0 Jan 23 00:04:04.276417 systemd[1]: Started sshd@5-10.0.0.231:22-20.161.92.111:57016.service - OpenSSH per-connection server daemon (20.161.92.111:57016). Jan 23 00:04:04.913619 sshd[1870]: Accepted publickey for core from 20.161.92.111 port 57016 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:04:04.914846 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:04:04.918592 systemd-logind[1635]: New session 6 of user core. Jan 23 00:04:04.928010 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 23 00:04:05.343598 sshd[1873]: Connection closed by 20.161.92.111 port 57016 Jan 23 00:04:05.343933 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Jan 23 00:04:05.347701 systemd[1]: sshd@5-10.0.0.231:22-20.161.92.111:57016.service: Deactivated successfully. Jan 23 00:04:05.349401 systemd[1]: session-6.scope: Deactivated successfully. Jan 23 00:04:05.350165 systemd-logind[1635]: Session 6 logged out. Waiting for processes to exit. 
Jan 23 00:04:05.351474 systemd-logind[1635]: Removed session 6. Jan 23 00:04:05.462418 systemd[1]: Started sshd@6-10.0.0.231:22-20.161.92.111:57022.service - OpenSSH per-connection server daemon (20.161.92.111:57022). Jan 23 00:04:06.073519 sshd[1879]: Accepted publickey for core from 20.161.92.111 port 57022 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:04:06.075181 sshd-session[1879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:04:06.080071 systemd-logind[1635]: New session 7 of user core. Jan 23 00:04:06.101110 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 23 00:04:06.500263 sshd[1882]: Connection closed by 20.161.92.111 port 57022 Jan 23 00:04:06.500838 sshd-session[1879]: pam_unix(sshd:session): session closed for user core Jan 23 00:04:06.504806 systemd[1]: sshd@6-10.0.0.231:22-20.161.92.111:57022.service: Deactivated successfully. Jan 23 00:04:06.506399 systemd[1]: session-7.scope: Deactivated successfully. Jan 23 00:04:06.508354 systemd-logind[1635]: Session 7 logged out. Waiting for processes to exit. Jan 23 00:04:06.509216 systemd-logind[1635]: Removed session 7. Jan 23 00:04:06.611139 systemd[1]: Started sshd@7-10.0.0.231:22-20.161.92.111:57028.service - OpenSSH per-connection server daemon (20.161.92.111:57028). Jan 23 00:04:07.237290 sshd[1888]: Accepted publickey for core from 20.161.92.111 port 57028 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:04:07.238529 sshd-session[1888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:04:07.242227 systemd-logind[1635]: New session 8 of user core. Jan 23 00:04:07.251890 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 23 00:04:07.669188 sshd[1891]: Connection closed by 20.161.92.111 port 57028 Jan 23 00:04:07.669594 sshd-session[1888]: pam_unix(sshd:session): session closed for user core Jan 23 00:04:07.673541 systemd[1]: sshd@7-10.0.0.231:22-20.161.92.111:57028.service: Deactivated successfully. Jan 23 00:04:07.675076 systemd[1]: session-8.scope: Deactivated successfully. Jan 23 00:04:07.675802 systemd-logind[1635]: Session 8 logged out. Waiting for processes to exit. Jan 23 00:04:07.676919 systemd-logind[1635]: Removed session 8. Jan 23 00:04:07.779927 systemd[1]: Started sshd@8-10.0.0.231:22-20.161.92.111:57036.service - OpenSSH per-connection server daemon (20.161.92.111:57036). Jan 23 00:04:08.393067 sshd[1897]: Accepted publickey for core from 20.161.92.111 port 57036 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:04:08.394633 sshd-session[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:04:08.398328 systemd-logind[1635]: New session 9 of user core. Jan 23 00:04:08.404971 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 23 00:04:08.739049 sudo[1901]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 23 00:04:08.739305 sudo[1901]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 00:04:08.752209 sudo[1901]: pam_unix(sudo:session): session closed for user root Jan 23 00:04:08.849408 sshd[1900]: Connection closed by 20.161.92.111 port 57036 Jan 23 00:04:08.849897 sshd-session[1897]: pam_unix(sshd:session): session closed for user core Jan 23 00:04:08.854358 systemd[1]: sshd@8-10.0.0.231:22-20.161.92.111:57036.service: Deactivated successfully. Jan 23 00:04:08.857081 systemd[1]: session-9.scope: Deactivated successfully. Jan 23 00:04:08.858546 systemd-logind[1635]: Session 9 logged out. Waiting for processes to exit. Jan 23 00:04:08.859849 systemd-logind[1635]: Removed session 9. 
Jan 23 00:04:08.961304 systemd[1]: Started sshd@9-10.0.0.231:22-20.161.92.111:57044.service - OpenSSH per-connection server daemon (20.161.92.111:57044). Jan 23 00:04:09.464132 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 23 00:04:09.466381 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 00:04:09.573160 sshd[1907]: Accepted publickey for core from 20.161.92.111 port 57044 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:04:09.574564 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:04:09.580965 systemd-logind[1635]: New session 10 of user core. Jan 23 00:04:09.599160 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 23 00:04:09.657544 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 00:04:09.661331 (kubelet)[1919]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 00:04:09.693558 kubelet[1919]: E0123 00:04:09.693515 1919 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 00:04:09.695848 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 00:04:09.695975 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 00:04:09.696354 systemd[1]: kubelet.service: Consumed 141ms CPU time, 106.9M memory peak. 
Jan 23 00:04:09.907009 sudo[1929]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 23 00:04:09.907267 sudo[1929]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 00:04:09.911600 sudo[1929]: pam_unix(sudo:session): session closed for user root Jan 23 00:04:09.916069 sudo[1928]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 23 00:04:09.916305 sudo[1928]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 00:04:09.924747 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 00:04:09.965670 augenrules[1951]: No rules Jan 23 00:04:09.966777 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 00:04:09.968795 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 00:04:09.970282 sudo[1928]: pam_unix(sudo:session): session closed for user root Jan 23 00:04:10.066903 sshd[1913]: Connection closed by 20.161.92.111 port 57044 Jan 23 00:04:10.067346 sshd-session[1907]: pam_unix(sshd:session): session closed for user core Jan 23 00:04:10.070092 systemd[1]: sshd@9-10.0.0.231:22-20.161.92.111:57044.service: Deactivated successfully. Jan 23 00:04:10.072044 systemd[1]: session-10.scope: Deactivated successfully. Jan 23 00:04:10.075546 systemd-logind[1635]: Session 10 logged out. Waiting for processes to exit. Jan 23 00:04:10.076862 systemd-logind[1635]: Removed session 10. Jan 23 00:04:10.178144 systemd[1]: Started sshd@10-10.0.0.231:22-20.161.92.111:57054.service - OpenSSH per-connection server daemon (20.161.92.111:57054). 
Jan 23 00:04:10.785462 sshd[1960]: Accepted publickey for core from 20.161.92.111 port 57054 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:04:10.786672 sshd-session[1960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:04:10.790555 systemd-logind[1635]: New session 11 of user core. Jan 23 00:04:10.811118 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 23 00:04:11.117080 sudo[1964]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 23 00:04:11.117328 sudo[1964]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 00:04:11.456192 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 23 00:04:11.468078 (dockerd)[1984]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 23 00:04:11.709567 dockerd[1984]: time="2026-01-23T00:04:11.709502057Z" level=info msg="Starting up" Jan 23 00:04:11.710325 dockerd[1984]: time="2026-01-23T00:04:11.710303899Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 23 00:04:11.720789 dockerd[1984]: time="2026-01-23T00:04:11.720751409Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 23 00:04:11.794005 dockerd[1984]: time="2026-01-23T00:04:11.793881699Z" level=info msg="Loading containers: start." Jan 23 00:04:11.808736 kernel: Initializing XFRM netlink socket Jan 23 00:04:12.061261 systemd-networkd[1519]: docker0: Link UP Jan 23 00:04:12.070326 dockerd[1984]: time="2026-01-23T00:04:12.070278692Z" level=info msg="Loading containers: done." Jan 23 00:04:12.083174 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck429151724-merged.mount: Deactivated successfully. 
Jan 23 00:04:12.084802 dockerd[1984]: time="2026-01-23T00:04:12.084460933Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 23 00:04:12.084802 dockerd[1984]: time="2026-01-23T00:04:12.084547373Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 23 00:04:12.084802 dockerd[1984]: time="2026-01-23T00:04:12.084653693Z" level=info msg="Initializing buildkit" Jan 23 00:04:12.114322 dockerd[1984]: time="2026-01-23T00:04:12.114267658Z" level=info msg="Completed buildkit initialization" Jan 23 00:04:12.121298 dockerd[1984]: time="2026-01-23T00:04:12.121249998Z" level=info msg="Daemon has completed initialization" Jan 23 00:04:12.121354 dockerd[1984]: time="2026-01-23T00:04:12.121314999Z" level=info msg="API listen on /run/docker.sock" Jan 23 00:04:12.121522 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 23 00:04:13.907471 containerd[1664]: time="2026-01-23T00:04:13.907431004Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 23 00:04:14.641786 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1855897519.mount: Deactivated successfully. 
Jan 23 00:04:15.576189 containerd[1664]: time="2026-01-23T00:04:15.576112152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:15.577513 containerd[1664]: time="2026-01-23T00:04:15.577464116Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=24571138"
Jan 23 00:04:15.578572 containerd[1664]: time="2026-01-23T00:04:15.578537119Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:15.582819 containerd[1664]: time="2026-01-23T00:04:15.582784092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:15.583658 containerd[1664]: time="2026-01-23T00:04:15.583623174Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.67615081s"
Jan 23 00:04:15.583691 containerd[1664]: time="2026-01-23T00:04:15.583657974Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\""
Jan 23 00:04:15.584470 containerd[1664]: time="2026-01-23T00:04:15.584136015Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\""
Jan 23 00:04:16.860150 containerd[1664]: time="2026-01-23T00:04:16.860102122Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:16.861260 containerd[1664]: time="2026-01-23T00:04:16.861230925Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19135497"
Jan 23 00:04:16.862567 containerd[1664]: time="2026-01-23T00:04:16.862517249Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:16.865334 containerd[1664]: time="2026-01-23T00:04:16.865291097Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:16.866740 containerd[1664]: time="2026-01-23T00:04:16.866161740Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.281963204s"
Jan 23 00:04:16.866848 containerd[1664]: time="2026-01-23T00:04:16.866828221Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\""
Jan 23 00:04:16.867275 containerd[1664]: time="2026-01-23T00:04:16.867254103Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\""
Jan 23 00:04:17.869242 containerd[1664]: time="2026-01-23T00:04:17.869195222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:17.870100 containerd[1664]: time="2026-01-23T00:04:17.870071865Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14191736"
Jan 23 00:04:17.871745 containerd[1664]: time="2026-01-23T00:04:17.871217988Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:17.874050 containerd[1664]: time="2026-01-23T00:04:17.874014076Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:17.875026 containerd[1664]: time="2026-01-23T00:04:17.875006039Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.007724536s"
Jan 23 00:04:17.875094 containerd[1664]: time="2026-01-23T00:04:17.875031759Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\""
Jan 23 00:04:17.875768 containerd[1664]: time="2026-01-23T00:04:17.875421040Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\""
Jan 23 00:04:18.769415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1683642928.mount: Deactivated successfully.
Jan 23 00:04:18.931836 containerd[1664]: time="2026-01-23T00:04:18.931360755Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:18.932363 containerd[1664]: time="2026-01-23T00:04:18.932307037Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=22805279"
Jan 23 00:04:18.933120 containerd[1664]: time="2026-01-23T00:04:18.933087000Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:18.935908 containerd[1664]: time="2026-01-23T00:04:18.935855928Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:18.936768 containerd[1664]: time="2026-01-23T00:04:18.936672210Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.060893169s"
Jan 23 00:04:18.936768 containerd[1664]: time="2026-01-23T00:04:18.936702810Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\""
Jan 23 00:04:18.937265 containerd[1664]: time="2026-01-23T00:04:18.937238972Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Jan 23 00:04:19.559407 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4213575445.mount: Deactivated successfully.
Jan 23 00:04:19.710287 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Jan 23 00:04:19.712906 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 00:04:19.859825 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 00:04:19.865839 (kubelet)[2321]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 23 00:04:19.911211 kubelet[2321]: E0123 00:04:19.911161 2321 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 23 00:04:19.913443 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 00:04:19.913579 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 23 00:04:19.913986 systemd[1]: kubelet.service: Consumed 142ms CPU time, 107.5M memory peak.
Jan 23 00:04:20.495354 containerd[1664]: time="2026-01-23T00:04:20.495280083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:20.496025 containerd[1664]: time="2026-01-23T00:04:20.496003365Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498"
Jan 23 00:04:20.497148 containerd[1664]: time="2026-01-23T00:04:20.497115048Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:20.499922 containerd[1664]: time="2026-01-23T00:04:20.499890176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:20.500872 containerd[1664]: time="2026-01-23T00:04:20.500841339Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.563491727s"
Jan 23 00:04:20.500918 containerd[1664]: time="2026-01-23T00:04:20.500878579Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\""
Jan 23 00:04:20.501283 containerd[1664]: time="2026-01-23T00:04:20.501264340Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Jan 23 00:04:21.039450 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount611225024.mount: Deactivated successfully.
Jan 23 00:04:21.046375 containerd[1664]: time="2026-01-23T00:04:21.046337144Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:21.047199 containerd[1664]: time="2026-01-23T00:04:21.047171467Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729"
Jan 23 00:04:21.048303 containerd[1664]: time="2026-01-23T00:04:21.048259750Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:21.050829 containerd[1664]: time="2026-01-23T00:04:21.050783477Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:21.051779 containerd[1664]: time="2026-01-23T00:04:21.051486839Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 550.196259ms"
Jan 23 00:04:21.051779 containerd[1664]: time="2026-01-23T00:04:21.051516119Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Jan 23 00:04:21.052312 containerd[1664]: time="2026-01-23T00:04:21.052081761Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\""
Jan 23 00:04:21.606668 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount495008842.mount: Deactivated successfully.
Jan 23 00:04:21.615843 update_engine[1641]: I20260123 00:04:21.615756 1641 update_attempter.cc:509] Updating boot flags...
Jan 23 00:04:24.015317 containerd[1664]: time="2026-01-23T00:04:24.015272224Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:24.016684 containerd[1664]: time="2026-01-23T00:04:24.016643988Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=98063043"
Jan 23 00:04:24.017987 containerd[1664]: time="2026-01-23T00:04:24.017935111Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:24.021163 containerd[1664]: time="2026-01-23T00:04:24.020746000Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:04:24.021758 containerd[1664]: time="2026-01-23T00:04:24.021716722Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 2.969607681s"
Jan 23 00:04:24.021815 containerd[1664]: time="2026-01-23T00:04:24.021759762Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\""
Jan 23 00:04:28.416279 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 00:04:28.416410 systemd[1]: kubelet.service: Consumed 142ms CPU time, 107.5M memory peak.
Jan 23 00:04:28.418241 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 00:04:28.442019 systemd[1]: Reload requested from client PID 2448 ('systemctl') (unit session-11.scope)...
Jan 23 00:04:28.442036 systemd[1]: Reloading...
Jan 23 00:04:28.516906 zram_generator::config[2490]: No configuration found.
Jan 23 00:04:28.693068 systemd[1]: Reloading finished in 250 ms.
Jan 23 00:04:28.753595 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Jan 23 00:04:28.753680 systemd[1]: kubelet.service: Failed with result 'signal'.
Jan 23 00:04:28.753984 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 00:04:28.754033 systemd[1]: kubelet.service: Consumed 92ms CPU time, 95M memory peak.
Jan 23 00:04:28.755562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 23 00:04:28.873657 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 23 00:04:28.877613 (kubelet)[2538]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 23 00:04:28.911082 kubelet[2538]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Jan 23 00:04:28.911082 kubelet[2538]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 23 00:04:28.912140 kubelet[2538]: I0123 00:04:28.912085 2538 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 23 00:04:29.683950 kubelet[2538]: I0123 00:04:29.683900 2538 server.go:529] "Kubelet version" kubeletVersion="v1.34.1"
Jan 23 00:04:29.683950 kubelet[2538]: I0123 00:04:29.683933 2538 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 23 00:04:29.685756 kubelet[2538]: I0123 00:04:29.685735 2538 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Jan 23 00:04:29.685794 kubelet[2538]: I0123 00:04:29.685755 2538 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Jan 23 00:04:29.686037 kubelet[2538]: I0123 00:04:29.685978 2538 server.go:956] "Client rotation is on, will bootstrap in background"
Jan 23 00:04:29.696758 kubelet[2538]: E0123 00:04:29.696706 2538 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.231:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Jan 23 00:04:29.697753 kubelet[2538]: I0123 00:04:29.697303 2538 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 23 00:04:29.701203 kubelet[2538]: I0123 00:04:29.701185 2538 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 23 00:04:29.703705 kubelet[2538]: I0123 00:04:29.703679 2538 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Jan 23 00:04:29.703901 kubelet[2538]: I0123 00:04:29.703878 2538 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 23 00:04:29.704038 kubelet[2538]: I0123 00:04:29.703900 2538 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-n-22c0b85714","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 23 00:04:29.704134 kubelet[2538]: I0123 00:04:29.704043 2538 topology_manager.go:138] "Creating topology manager with none policy"
Jan 23 00:04:29.704134 kubelet[2538]: I0123 00:04:29.704052 2538 container_manager_linux.go:306] "Creating device plugin manager"
Jan 23 00:04:29.704174 kubelet[2538]: I0123 00:04:29.704148 2538 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Jan 23 00:04:29.706678 kubelet[2538]: I0123 00:04:29.706646 2538 state_mem.go:36] "Initialized new in-memory state store"
Jan 23 00:04:29.709112 kubelet[2538]: I0123 00:04:29.709060 2538 kubelet.go:475] "Attempting to sync node with API server"
Jan 23 00:04:29.709112 kubelet[2538]: I0123 00:04:29.709086 2538 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 23 00:04:29.709686 kubelet[2538]: E0123 00:04:29.709657 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.231:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-n-22c0b85714&limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Jan 23 00:04:29.710155 kubelet[2538]: I0123 00:04:29.710123 2538 kubelet.go:387] "Adding apiserver pod source"
Jan 23 00:04:29.710183 kubelet[2538]: I0123 00:04:29.710158 2538 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 23 00:04:29.710815 kubelet[2538]: E0123 00:04:29.710752 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.231:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Jan 23 00:04:29.711747 kubelet[2538]: I0123 00:04:29.711379 2538 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Jan 23 00:04:29.712083 kubelet[2538]: I0123 00:04:29.712043 2538 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Jan 23 00:04:29.712083 kubelet[2538]: I0123 00:04:29.712077 2538 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Jan 23 00:04:29.712143 kubelet[2538]: W0123 00:04:29.712114 2538 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jan 23 00:04:29.715248 kubelet[2538]: I0123 00:04:29.715058 2538 server.go:1262] "Started kubelet"
Jan 23 00:04:29.715248 kubelet[2538]: I0123 00:04:29.715218 2538 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Jan 23 00:04:29.715773 kubelet[2538]: I0123 00:04:29.715748 2538 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 23 00:04:29.716027 kubelet[2538]: I0123 00:04:29.715998 2538 server.go:310] "Adding debug handlers to kubelet server"
Jan 23 00:04:29.717619 kubelet[2538]: I0123 00:04:29.717597 2538 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jan 23 00:04:29.720802 kubelet[2538]: I0123 00:04:29.720770 2538 volume_manager.go:313] "Starting Kubelet Volume Manager"
Jan 23 00:04:29.720925 kubelet[2538]: I0123 00:04:29.720866 2538 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 23 00:04:29.720965 kubelet[2538]: I0123 00:04:29.720946 2538 server_v1.go:49] "podresources" method="list" useActivePods=true
Jan 23 00:04:29.721118 kubelet[2538]: E0123 00:04:29.721086 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found"
Jan 23 00:04:29.721184 kubelet[2538]: I0123 00:04:29.721163 2538 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 23 00:04:29.722067 kubelet[2538]: I0123 00:04:29.722012 2538 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Jan 23 00:04:29.722171 kubelet[2538]: I0123 00:04:29.722114 2538 reconciler.go:29] "Reconciler: start to sync state"
Jan 23 00:04:29.722999 kubelet[2538]: E0123 00:04:29.722961 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.231:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Jan 23 00:04:29.723094 kubelet[2538]: E0123 00:04:29.723067 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-22c0b85714?timeout=10s\": dial tcp 10.0.0.231:6443: connect: connection refused" interval="200ms"
Jan 23 00:04:29.723594 kubelet[2538]: I0123 00:04:29.723569 2538 factory.go:223] Registration of the systemd container factory successfully
Jan 23 00:04:29.723689 kubelet[2538]: I0123 00:04:29.723670 2538 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 23 00:04:29.725987 kubelet[2538]: I0123 00:04:29.725689 2538 factory.go:223] Registration of the containerd container factory successfully
Jan 23 00:04:29.731838 kubelet[2538]: E0123 00:04:29.730319 2538 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.231:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.231:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-n-22c0b85714.188d33553708b384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-n-22c0b85714,UID:ci-4459-2-2-n-22c0b85714,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-n-22c0b85714,},FirstTimestamp:2026-01-23 00:04:29.71502682 +0000 UTC m=+0.834750517,LastTimestamp:2026-01-23 00:04:29.71502682 +0000 UTC m=+0.834750517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-n-22c0b85714,}"
Jan 23 00:04:29.733002 kubelet[2538]: E0123 00:04:29.732979 2538 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 23 00:04:29.737836 kubelet[2538]: I0123 00:04:29.737799 2538 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Jan 23 00:04:29.738937 kubelet[2538]: I0123 00:04:29.738919 2538 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Jan 23 00:04:29.738937 kubelet[2538]: I0123 00:04:29.738938 2538 status_manager.go:244] "Starting to sync pod status with apiserver"
Jan 23 00:04:29.739017 kubelet[2538]: I0123 00:04:29.738969 2538 kubelet.go:2427] "Starting kubelet main sync loop"
Jan 23 00:04:29.739038 kubelet[2538]: E0123 00:04:29.739011 2538 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 23 00:04:29.740201 kubelet[2538]: E0123 00:04:29.740165 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.231:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Jan 23 00:04:29.740934 kubelet[2538]: I0123 00:04:29.740911 2538 cpu_manager.go:221] "Starting CPU manager" policy="none"
Jan 23 00:04:29.740934 kubelet[2538]: I0123 00:04:29.740931 2538 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Jan 23 00:04:29.741019 kubelet[2538]: I0123 00:04:29.740949 2538 state_mem.go:36] "Initialized new in-memory state store"
Jan 23 00:04:29.743366 kubelet[2538]: I0123 00:04:29.743343 2538 policy_none.go:49] "None policy: Start"
Jan 23 00:04:29.743366 kubelet[2538]: I0123 00:04:29.743366 2538 memory_manager.go:187] "Starting memorymanager" policy="None"
Jan 23 00:04:29.743451 kubelet[2538]: I0123 00:04:29.743377 2538 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Jan 23 00:04:29.744908 kubelet[2538]: I0123 00:04:29.744885 2538 policy_none.go:47] "Start"
Jan 23 00:04:29.748296 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Jan 23 00:04:29.763077 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Jan 23 00:04:29.765754 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Jan 23 00:04:29.781183 kubelet[2538]: E0123 00:04:29.780981 2538 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Jan 23 00:04:29.781183 kubelet[2538]: I0123 00:04:29.781186 2538 eviction_manager.go:189] "Eviction manager: starting control loop"
Jan 23 00:04:29.781290 kubelet[2538]: I0123 00:04:29.781196 2538 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 23 00:04:29.781421 kubelet[2538]: I0123 00:04:29.781400 2538 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 23 00:04:29.782778 kubelet[2538]: E0123 00:04:29.782752 2538 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Jan 23 00:04:29.782854 kubelet[2538]: E0123 00:04:29.782796 2538 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-n-22c0b85714\" not found"
Jan 23 00:04:29.849016 systemd[1]: Created slice kubepods-burstable-pod3a10472d13f2951e88c6a61d231674e2.slice - libcontainer container kubepods-burstable-pod3a10472d13f2951e88c6a61d231674e2.slice.
Jan 23 00:04:29.862564 kubelet[2538]: E0123 00:04:29.862532 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.866030 systemd[1]: Created slice kubepods-burstable-podfebd4c2f08265b8ed6daeee1c1e41be2.slice - libcontainer container kubepods-burstable-podfebd4c2f08265b8ed6daeee1c1e41be2.slice.
Jan 23 00:04:29.868094 kubelet[2538]: E0123 00:04:29.868073 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.870017 systemd[1]: Created slice kubepods-burstable-pod66724ef1eb38b95b6410ddd291cb47c9.slice - libcontainer container kubepods-burstable-pod66724ef1eb38b95b6410ddd291cb47c9.slice.
Jan 23 00:04:29.871757 kubelet[2538]: E0123 00:04:29.871585 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.883183 kubelet[2538]: I0123 00:04:29.883161 2538 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.883745 kubelet[2538]: E0123 00:04:29.883690 2538 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.231:6443/api/v1/nodes\": dial tcp 10.0.0.231:6443: connect: connection refused" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923133 kubelet[2538]: I0123 00:04:29.923073 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3a10472d13f2951e88c6a61d231674e2-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" (UID: \"3a10472d13f2951e88c6a61d231674e2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923133 kubelet[2538]: I0123 00:04:29.923120 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3a10472d13f2951e88c6a61d231674e2-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" (UID: \"3a10472d13f2951e88c6a61d231674e2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923620 kubelet[2538]: I0123 00:04:29.923248 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923620 kubelet[2538]: I0123 00:04:29.923335 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923620 kubelet[2538]: I0123 00:04:29.923388 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/66724ef1eb38b95b6410ddd291cb47c9-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-n-22c0b85714\" (UID: \"66724ef1eb38b95b6410ddd291cb47c9\") " pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923620 kubelet[2538]: I0123 00:04:29.923420 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3a10472d13f2951e88c6a61d231674e2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" (UID: \"3a10472d13f2951e88c6a61d231674e2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923620 kubelet[2538]: I0123 00:04:29.923436 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923757 kubelet[2538]: I0123 00:04:29.923451 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923757 kubelet[2538]: I0123 00:04:29.923475 2538 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:29.923757 kubelet[2538]: E0123 00:04:29.923500 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-22c0b85714?timeout=10s\": dial tcp 10.0.0.231:6443: connect: connection refused" interval="400ms"
Jan 23 00:04:30.086058 kubelet[2538]: I0123 00:04:30.086018 2538 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:30.086355 kubelet[2538]: E0123 00:04:30.086329 2538 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.231:6443/api/v1/nodes\": dial tcp 10.0.0.231:6443: connect: connection refused" node="ci-4459-2-2-n-22c0b85714"
Jan 23 00:04:30.166207 containerd[1664]: time="2026-01-23T00:04:30.166168034Z" level=info msg="RunPodSandbox for
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-n-22c0b85714,Uid:3a10472d13f2951e88c6a61d231674e2,Namespace:kube-system,Attempt:0,}" Jan 23 00:04:30.171950 containerd[1664]: time="2026-01-23T00:04:30.171774251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-n-22c0b85714,Uid:febd4c2f08265b8ed6daeee1c1e41be2,Namespace:kube-system,Attempt:0,}" Jan 23 00:04:30.173847 containerd[1664]: time="2026-01-23T00:04:30.173817776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-n-22c0b85714,Uid:66724ef1eb38b95b6410ddd291cb47c9,Namespace:kube-system,Attempt:0,}" Jan 23 00:04:30.324405 kubelet[2538]: E0123 00:04:30.324351 2538 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-22c0b85714?timeout=10s\": dial tcp 10.0.0.231:6443: connect: connection refused" interval="800ms" Jan 23 00:04:30.488884 kubelet[2538]: I0123 00:04:30.488829 2538 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:30.489229 kubelet[2538]: E0123 00:04:30.489186 2538 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.231:6443/api/v1/nodes\": dial tcp 10.0.0.231:6443: connect: connection refused" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:30.765898 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount107585667.mount: Deactivated successfully. 
Jan 23 00:04:30.776695 containerd[1664]: time="2026-01-23T00:04:30.776049065Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 00:04:30.776881 kubelet[2538]: E0123 00:04:30.776545 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.231:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 23 00:04:30.778592 containerd[1664]: time="2026-01-23T00:04:30.778544752Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Jan 23 00:04:30.780376 containerd[1664]: time="2026-01-23T00:04:30.780341957Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 00:04:30.781409 containerd[1664]: time="2026-01-23T00:04:30.781378640Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 00:04:30.783588 containerd[1664]: time="2026-01-23T00:04:30.783025405Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 00:04:30.783983 containerd[1664]: time="2026-01-23T00:04:30.783950007Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 00:04:30.784820 containerd[1664]: time="2026-01-23T00:04:30.784786610Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: 
active requests=0, bytes read=0" Jan 23 00:04:30.786497 containerd[1664]: time="2026-01-23T00:04:30.786459174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 00:04:30.787425 containerd[1664]: time="2026-01-23T00:04:30.787379377Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 613.860241ms" Jan 23 00:04:30.790532 containerd[1664]: time="2026-01-23T00:04:30.790205545Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 614.461763ms" Jan 23 00:04:30.791146 containerd[1664]: time="2026-01-23T00:04:30.791105148Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 623.231509ms" Jan 23 00:04:30.816108 containerd[1664]: time="2026-01-23T00:04:30.815969219Z" level=info msg="connecting to shim 5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192" address="unix:///run/containerd/s/0a774f6096ca08883c42fb3f9cb746eda10bb0b4739fd2252c053885cc7f235e" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:30.819683 containerd[1664]: 
time="2026-01-23T00:04:30.819639470Z" level=info msg="connecting to shim d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720" address="unix:///run/containerd/s/2a5abcca5ef7165ede6bcec3da5d7d66968b82b20e1c55ecb7980c3e0265f0aa" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:30.829959 containerd[1664]: time="2026-01-23T00:04:30.829915339Z" level=info msg="connecting to shim 6065c6292a8bc8f44ec74f2ee2f056d5c3dd0217226db07ca237ce04601116b1" address="unix:///run/containerd/s/38bc52e642c1b6589672153412b2455c60f78e98009566fcc00c37ec8c760143" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:30.841899 systemd[1]: Started cri-containerd-d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720.scope - libcontainer container d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720. Jan 23 00:04:30.845852 systemd[1]: Started cri-containerd-5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192.scope - libcontainer container 5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192. Jan 23 00:04:30.851636 systemd[1]: Started cri-containerd-6065c6292a8bc8f44ec74f2ee2f056d5c3dd0217226db07ca237ce04601116b1.scope - libcontainer container 6065c6292a8bc8f44ec74f2ee2f056d5c3dd0217226db07ca237ce04601116b1. 
Jan 23 00:04:30.886279 containerd[1664]: time="2026-01-23T00:04:30.886236621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-n-22c0b85714,Uid:66724ef1eb38b95b6410ddd291cb47c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720\"" Jan 23 00:04:30.892013 containerd[1664]: time="2026-01-23T00:04:30.891545356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-n-22c0b85714,Uid:febd4c2f08265b8ed6daeee1c1e41be2,Namespace:kube-system,Attempt:0,} returns sandbox id \"5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192\"" Jan 23 00:04:30.894774 containerd[1664]: time="2026-01-23T00:04:30.894622525Z" level=info msg="CreateContainer within sandbox \"d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 23 00:04:30.897305 kubelet[2538]: E0123 00:04:30.897275 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.231:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 23 00:04:30.898656 containerd[1664]: time="2026-01-23T00:04:30.898624176Z" level=info msg="CreateContainer within sandbox \"5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 23 00:04:30.901188 containerd[1664]: time="2026-01-23T00:04:30.901131624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-n-22c0b85714,Uid:3a10472d13f2951e88c6a61d231674e2,Namespace:kube-system,Attempt:0,} returns sandbox id \"6065c6292a8bc8f44ec74f2ee2f056d5c3dd0217226db07ca237ce04601116b1\"" Jan 23 00:04:30.906742 containerd[1664]: 
time="2026-01-23T00:04:30.906682919Z" level=info msg="CreateContainer within sandbox \"6065c6292a8bc8f44ec74f2ee2f056d5c3dd0217226db07ca237ce04601116b1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 23 00:04:30.908755 containerd[1664]: time="2026-01-23T00:04:30.908368324Z" level=info msg="Container 9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:04:30.909483 containerd[1664]: time="2026-01-23T00:04:30.909333287Z" level=info msg="Container c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:04:30.917851 containerd[1664]: time="2026-01-23T00:04:30.917588911Z" level=info msg="CreateContainer within sandbox \"d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0\"" Jan 23 00:04:30.918716 containerd[1664]: time="2026-01-23T00:04:30.918642394Z" level=info msg="StartContainer for \"9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0\"" Jan 23 00:04:30.920570 containerd[1664]: time="2026-01-23T00:04:30.920514199Z" level=info msg="connecting to shim 9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0" address="unix:///run/containerd/s/2a5abcca5ef7165ede6bcec3da5d7d66968b82b20e1c55ecb7980c3e0265f0aa" protocol=ttrpc version=3 Jan 23 00:04:30.921553 containerd[1664]: time="2026-01-23T00:04:30.920879040Z" level=info msg="Container f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:04:30.924688 containerd[1664]: time="2026-01-23T00:04:30.924655411Z" level=info msg="CreateContainer within sandbox \"5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id 
\"c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280\"" Jan 23 00:04:30.925450 containerd[1664]: time="2026-01-23T00:04:30.925402173Z" level=info msg="StartContainer for \"c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280\"" Jan 23 00:04:30.927079 containerd[1664]: time="2026-01-23T00:04:30.927050258Z" level=info msg="connecting to shim c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280" address="unix:///run/containerd/s/0a774f6096ca08883c42fb3f9cb746eda10bb0b4739fd2252c053885cc7f235e" protocol=ttrpc version=3 Jan 23 00:04:30.930418 containerd[1664]: time="2026-01-23T00:04:30.930314187Z" level=info msg="CreateContainer within sandbox \"6065c6292a8bc8f44ec74f2ee2f056d5c3dd0217226db07ca237ce04601116b1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349\"" Jan 23 00:04:30.932656 containerd[1664]: time="2026-01-23T00:04:30.932593394Z" level=info msg="StartContainer for \"f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349\"" Jan 23 00:04:30.934912 containerd[1664]: time="2026-01-23T00:04:30.934878200Z" level=info msg="connecting to shim f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349" address="unix:///run/containerd/s/38bc52e642c1b6589672153412b2455c60f78e98009566fcc00c37ec8c760143" protocol=ttrpc version=3 Jan 23 00:04:30.939942 systemd[1]: Started cri-containerd-9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0.scope - libcontainer container 9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0. Jan 23 00:04:30.942524 systemd[1]: Started cri-containerd-c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280.scope - libcontainer container c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280. 
Jan 23 00:04:30.949366 systemd[1]: Started cri-containerd-f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349.scope - libcontainer container f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349. Jan 23 00:04:30.977089 kubelet[2538]: E0123 00:04:30.977041 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.231:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-n-22c0b85714&limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 23 00:04:30.988549 containerd[1664]: time="2026-01-23T00:04:30.988452714Z" level=info msg="StartContainer for \"9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0\" returns successfully" Jan 23 00:04:30.996328 containerd[1664]: time="2026-01-23T00:04:30.996250937Z" level=info msg="StartContainer for \"c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280\" returns successfully" Jan 23 00:04:31.004910 containerd[1664]: time="2026-01-23T00:04:31.004855121Z" level=info msg="StartContainer for \"f2cc479429e6dbc26388668adeeca0b5c53f71a50f57c2c1c95572825c7eb349\" returns successfully" Jan 23 00:04:31.057942 kubelet[2538]: E0123 00:04:31.057832 2538 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.231:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 23 00:04:31.293194 kubelet[2538]: I0123 00:04:31.293155 2538 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:31.747053 kubelet[2538]: E0123 00:04:31.746989 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:31.751105 kubelet[2538]: E0123 00:04:31.751077 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:31.753139 kubelet[2538]: E0123 00:04:31.753112 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:32.699671 kubelet[2538]: I0123 00:04:32.699622 2538 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:32.699671 kubelet[2538]: E0123 00:04:32.699664 2538 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-2-n-22c0b85714\": node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:32.737310 kubelet[2538]: E0123 00:04:32.737266 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:32.755768 kubelet[2538]: E0123 00:04:32.755083 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:32.756413 kubelet[2538]: E0123 00:04:32.756184 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:32.838064 kubelet[2538]: E0123 00:04:32.838016 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:32.938471 kubelet[2538]: E0123 00:04:32.938436 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not 
found" Jan 23 00:04:33.039158 kubelet[2538]: E0123 00:04:33.039116 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.140040 kubelet[2538]: E0123 00:04:33.139963 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.240138 kubelet[2538]: E0123 00:04:33.240092 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.340966 kubelet[2538]: E0123 00:04:33.340806 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.441538 kubelet[2538]: E0123 00:04:33.441498 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.542356 kubelet[2538]: E0123 00:04:33.542314 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.643494 kubelet[2538]: E0123 00:04:33.643388 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.744471 kubelet[2538]: E0123 00:04:33.744355 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.757384 kubelet[2538]: E0123 00:04:33.757343 2538 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-n-22c0b85714\" not found" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:33.845385 kubelet[2538]: E0123 00:04:33.845226 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.945713 kubelet[2538]: E0123 
00:04:33.945606 2538 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:33.984064 kubelet[2538]: I0123 00:04:33.984004 2538 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:34.022853 kubelet[2538]: I0123 00:04:34.022816 2538 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:34.027778 kubelet[2538]: I0123 00:04:34.027752 2538 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:34.032676 kubelet[2538]: I0123 00:04:34.032646 2538 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:34.037329 kubelet[2538]: E0123 00:04:34.037291 2538 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-n-22c0b85714\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:34.713198 kubelet[2538]: I0123 00:04:34.712950 2538 apiserver.go:52] "Watching apiserver" Jan 23 00:04:34.722841 kubelet[2538]: I0123 00:04:34.722820 2538 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 23 00:04:34.829670 systemd[1]: Reload requested from client PID 2828 ('systemctl') (unit session-11.scope)... Jan 23 00:04:34.829684 systemd[1]: Reloading... Jan 23 00:04:34.901766 zram_generator::config[2871]: No configuration found. Jan 23 00:04:35.071707 systemd[1]: Reloading finished in 241 ms. Jan 23 00:04:35.101081 kubelet[2538]: I0123 00:04:35.100737 2538 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 00:04:35.100990 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 23 00:04:35.115043 systemd[1]: kubelet.service: Deactivated successfully. Jan 23 00:04:35.115273 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 00:04:35.115329 systemd[1]: kubelet.service: Consumed 1.169s CPU time, 124.1M memory peak. Jan 23 00:04:35.117136 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 00:04:35.287230 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 00:04:35.291911 (kubelet)[2916]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 00:04:35.333794 kubelet[2916]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 00:04:35.333794 kubelet[2916]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 00:04:35.333794 kubelet[2916]: I0123 00:04:35.333741 2916 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 00:04:35.339467 kubelet[2916]: I0123 00:04:35.339415 2916 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 23 00:04:35.339467 kubelet[2916]: I0123 00:04:35.339450 2916 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 00:04:35.339467 kubelet[2916]: I0123 00:04:35.339480 2916 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 23 00:04:35.339617 kubelet[2916]: I0123 00:04:35.339486 2916 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 23 00:04:35.339715 kubelet[2916]: I0123 00:04:35.339697 2916 server.go:956] "Client rotation is on, will bootstrap in background" Jan 23 00:04:35.342632 kubelet[2916]: I0123 00:04:35.342595 2916 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 23 00:04:35.345827 kubelet[2916]: I0123 00:04:35.345790 2916 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 00:04:35.349358 kubelet[2916]: I0123 00:04:35.349335 2916 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 00:04:35.352766 kubelet[2916]: I0123 00:04:35.352002 2916 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Jan 23 00:04:35.352766 kubelet[2916]: I0123 00:04:35.352222 2916 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 00:04:35.352766 kubelet[2916]: I0123 00:04:35.352250 2916 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459-2-2-n-22c0b85714","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 00:04:35.352766 kubelet[2916]: I0123 00:04:35.352399 2916 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 00:04:35.352976 kubelet[2916]: I0123 00:04:35.352406 2916 container_manager_linux.go:306] "Creating device plugin manager" Jan 23 00:04:35.352976 kubelet[2916]: I0123 00:04:35.352432 2916 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 23 00:04:35.353335 kubelet[2916]: I0123 00:04:35.353312 2916 
state_mem.go:36] "Initialized new in-memory state store" Jan 23 00:04:35.353490 kubelet[2916]: I0123 00:04:35.353480 2916 kubelet.go:475] "Attempting to sync node with API server" Jan 23 00:04:35.353519 kubelet[2916]: I0123 00:04:35.353496 2916 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 00:04:35.353591 kubelet[2916]: I0123 00:04:35.353521 2916 kubelet.go:387] "Adding apiserver pod source" Jan 23 00:04:35.353591 kubelet[2916]: I0123 00:04:35.353535 2916 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 00:04:35.354779 kubelet[2916]: I0123 00:04:35.354423 2916 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Jan 23 00:04:35.355057 kubelet[2916]: I0123 00:04:35.355019 2916 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 00:04:35.355057 kubelet[2916]: I0123 00:04:35.355055 2916 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 23 00:04:35.357981 kubelet[2916]: I0123 00:04:35.357951 2916 server.go:1262] "Started kubelet" Jan 23 00:04:35.358087 kubelet[2916]: I0123 00:04:35.358055 2916 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 00:04:35.359429 kubelet[2916]: I0123 00:04:35.358582 2916 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 00:04:35.359592 kubelet[2916]: I0123 00:04:35.359564 2916 server.go:310] "Adding debug handlers to kubelet server" Jan 23 00:04:35.361669 kubelet[2916]: I0123 00:04:35.361617 2916 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 00:04:35.367861 kubelet[2916]: I0123 00:04:35.367106 2916 volume_manager.go:313] "Starting Kubelet 
Volume Manager" Jan 23 00:04:35.368008 kubelet[2916]: I0123 00:04:35.367882 2916 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 23 00:04:35.368082 kubelet[2916]: I0123 00:04:35.368067 2916 reconciler.go:29] "Reconciler: start to sync state" Jan 23 00:04:35.370980 kubelet[2916]: E0123 00:04:35.370953 2916 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 00:04:35.372468 kubelet[2916]: I0123 00:04:35.372438 2916 factory.go:223] Registration of the systemd container factory successfully Jan 23 00:04:35.372608 kubelet[2916]: I0123 00:04:35.372586 2916 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 00:04:35.377296 kubelet[2916]: E0123 00:04:35.377221 2916 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-n-22c0b85714\" not found" Jan 23 00:04:35.385752 kubelet[2916]: I0123 00:04:35.384665 2916 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 00:04:35.385752 kubelet[2916]: I0123 00:04:35.385182 2916 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 23 00:04:35.386054 kubelet[2916]: I0123 00:04:35.386034 2916 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 00:04:35.392165 kubelet[2916]: I0123 00:04:35.391449 2916 factory.go:223] Registration of the containerd container factory successfully Jan 23 00:04:35.398017 kubelet[2916]: I0123 00:04:35.397970 2916 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 23 00:04:35.398961 kubelet[2916]: I0123 00:04:35.398929 2916 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 23 00:04:35.398961 kubelet[2916]: I0123 00:04:35.398956 2916 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 23 00:04:35.398961 kubelet[2916]: I0123 00:04:35.398983 2916 kubelet.go:2427] "Starting kubelet main sync loop" Jan 23 00:04:35.399094 kubelet[2916]: E0123 00:04:35.399025 2916 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 00:04:35.436895 kubelet[2916]: I0123 00:04:35.436855 2916 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 00:04:35.436895 kubelet[2916]: I0123 00:04:35.436896 2916 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 00:04:35.437022 kubelet[2916]: I0123 00:04:35.436920 2916 state_mem.go:36] "Initialized new in-memory state store" Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437060 2916 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437078 2916 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437096 2916 policy_none.go:49] "None policy: Start" Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437105 2916 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437113 2916 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437233 2916 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 23 00:04:35.437322 kubelet[2916]: I0123 00:04:35.437241 2916 policy_none.go:47] "Start" Jan 23 00:04:35.445817 kubelet[2916]: E0123 00:04:35.445781 2916 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 00:04:35.446368 kubelet[2916]: I0123 00:04:35.446007 
2916 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 00:04:35.446368 kubelet[2916]: I0123 00:04:35.446020 2916 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 00:04:35.446686 kubelet[2916]: I0123 00:04:35.446440 2916 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 00:04:35.448220 kubelet[2916]: E0123 00:04:35.448190 2916 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 00:04:35.500408 kubelet[2916]: I0123 00:04:35.500168 2916 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.500408 kubelet[2916]: I0123 00:04:35.500207 2916 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.500408 kubelet[2916]: I0123 00:04:35.500237 2916 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.506708 kubelet[2916]: E0123 00:04:35.506657 2916 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-n-22c0b85714\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.507410 kubelet[2916]: E0123 00:04:35.507379 2916 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.507489 kubelet[2916]: E0123 00:04:35.507473 2916 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.548974 kubelet[2916]: I0123 00:04:35.548707 2916 kubelet_node_status.go:75] "Attempting to register node" 
node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.556147 kubelet[2916]: I0123 00:04:35.556103 2916 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.556256 kubelet[2916]: I0123 00:04:35.556187 2916 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569157 kubelet[2916]: I0123 00:04:35.569100 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3a10472d13f2951e88c6a61d231674e2-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" (UID: \"3a10472d13f2951e88c6a61d231674e2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569157 kubelet[2916]: I0123 00:04:35.569137 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3a10472d13f2951e88c6a61d231674e2-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" (UID: \"3a10472d13f2951e88c6a61d231674e2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569157 kubelet[2916]: I0123 00:04:35.569155 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569311 kubelet[2916]: I0123 00:04:35.569172 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " 
pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569311 kubelet[2916]: I0123 00:04:35.569188 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/66724ef1eb38b95b6410ddd291cb47c9-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-n-22c0b85714\" (UID: \"66724ef1eb38b95b6410ddd291cb47c9\") " pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569311 kubelet[2916]: I0123 00:04:35.569249 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3a10472d13f2951e88c6a61d231674e2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" (UID: \"3a10472d13f2951e88c6a61d231674e2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569311 kubelet[2916]: I0123 00:04:35.569289 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569397 kubelet[2916]: I0123 00:04:35.569339 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:35.569397 kubelet[2916]: I0123 00:04:35.569367 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/febd4c2f08265b8ed6daeee1c1e41be2-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-n-22c0b85714\" (UID: \"febd4c2f08265b8ed6daeee1c1e41be2\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:36.354515 kubelet[2916]: I0123 00:04:36.354405 2916 apiserver.go:52] "Watching apiserver" Jan 23 00:04:36.368704 kubelet[2916]: I0123 00:04:36.368658 2916 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 23 00:04:36.418559 kubelet[2916]: I0123 00:04:36.418535 2916 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:36.425927 kubelet[2916]: E0123 00:04:36.425896 2916 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-n-22c0b85714\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" Jan 23 00:04:36.438331 kubelet[2916]: I0123 00:04:36.438274 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-n-22c0b85714" podStartSLOduration=2.438260153 podStartE2EDuration="2.438260153s" podCreationTimestamp="2026-01-23 00:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 00:04:36.438119433 +0000 UTC m=+1.142971801" watchObservedRunningTime="2026-01-23 00:04:36.438260153 +0000 UTC m=+1.143112561" Jan 23 00:04:36.455676 kubelet[2916]: I0123 00:04:36.455600 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-n-22c0b85714" podStartSLOduration=2.455582803 podStartE2EDuration="2.455582803s" podCreationTimestamp="2026-01-23 00:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 00:04:36.455420523 +0000 UTC m=+1.160272851" 
watchObservedRunningTime="2026-01-23 00:04:36.455582803 +0000 UTC m=+1.160435171" Jan 23 00:04:36.455676 kubelet[2916]: I0123 00:04:36.455680 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-n-22c0b85714" podStartSLOduration=3.455675083 podStartE2EDuration="3.455675083s" podCreationTimestamp="2026-01-23 00:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 00:04:36.446445417 +0000 UTC m=+1.151297785" watchObservedRunningTime="2026-01-23 00:04:36.455675083 +0000 UTC m=+1.160527451" Jan 23 00:04:40.399443 kubelet[2916]: I0123 00:04:40.399408 2916 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 23 00:04:40.400226 kubelet[2916]: I0123 00:04:40.399940 2916 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 23 00:04:40.400262 containerd[1664]: time="2026-01-23T00:04:40.399687721Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 23 00:04:41.510431 systemd[1]: Created slice kubepods-besteffort-podfde066f9_4c42_40d7_b19c_8aa95a694991.slice - libcontainer container kubepods-besteffort-podfde066f9_4c42_40d7_b19c_8aa95a694991.slice. 
Jan 23 00:04:41.606131 kubelet[2916]: I0123 00:04:41.605962 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/fde066f9-4c42-40d7-b19c-8aa95a694991-kube-proxy\") pod \"kube-proxy-cnwzm\" (UID: \"fde066f9-4c42-40d7-b19c-8aa95a694991\") " pod="kube-system/kube-proxy-cnwzm" Jan 23 00:04:41.606131 kubelet[2916]: I0123 00:04:41.606011 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fde066f9-4c42-40d7-b19c-8aa95a694991-xtables-lock\") pod \"kube-proxy-cnwzm\" (UID: \"fde066f9-4c42-40d7-b19c-8aa95a694991\") " pod="kube-system/kube-proxy-cnwzm" Jan 23 00:04:41.606131 kubelet[2916]: I0123 00:04:41.606038 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fde066f9-4c42-40d7-b19c-8aa95a694991-lib-modules\") pod \"kube-proxy-cnwzm\" (UID: \"fde066f9-4c42-40d7-b19c-8aa95a694991\") " pod="kube-system/kube-proxy-cnwzm" Jan 23 00:04:41.606131 kubelet[2916]: I0123 00:04:41.606054 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrfk8\" (UniqueName: \"kubernetes.io/projected/fde066f9-4c42-40d7-b19c-8aa95a694991-kube-api-access-zrfk8\") pod \"kube-proxy-cnwzm\" (UID: \"fde066f9-4c42-40d7-b19c-8aa95a694991\") " pod="kube-system/kube-proxy-cnwzm" Jan 23 00:04:41.675671 systemd[1]: Created slice kubepods-besteffort-pod2e74f8fa_55eb_479d_adc6_a716ad811336.slice - libcontainer container kubepods-besteffort-pod2e74f8fa_55eb_479d_adc6_a716ad811336.slice. 
Jan 23 00:04:41.706669 kubelet[2916]: I0123 00:04:41.706570 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lkk6\" (UniqueName: \"kubernetes.io/projected/2e74f8fa-55eb-479d-adc6-a716ad811336-kube-api-access-4lkk6\") pod \"tigera-operator-65cdcdfd6d-mpqkr\" (UID: \"2e74f8fa-55eb-479d-adc6-a716ad811336\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-mpqkr" Jan 23 00:04:41.706669 kubelet[2916]: I0123 00:04:41.706625 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2e74f8fa-55eb-479d-adc6-a716ad811336-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-mpqkr\" (UID: \"2e74f8fa-55eb-479d-adc6-a716ad811336\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-mpqkr" Jan 23 00:04:41.824790 containerd[1664]: time="2026-01-23T00:04:41.824695291Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cnwzm,Uid:fde066f9-4c42-40d7-b19c-8aa95a694991,Namespace:kube-system,Attempt:0,}" Jan 23 00:04:41.840731 containerd[1664]: time="2026-01-23T00:04:41.840350576Z" level=info msg="connecting to shim 5e0c3f729534ec0c0f17e935e6ec0beffdfcfa4a558df8e88e57a6231ed1df0b" address="unix:///run/containerd/s/51c80143059992c4b41ffcbbbb4620d0efcd5d06968d1fc0753a6ca836e055bc" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:41.869998 systemd[1]: Started cri-containerd-5e0c3f729534ec0c0f17e935e6ec0beffdfcfa4a558df8e88e57a6231ed1df0b.scope - libcontainer container 5e0c3f729534ec0c0f17e935e6ec0beffdfcfa4a558df8e88e57a6231ed1df0b. 
Jan 23 00:04:41.892483 containerd[1664]: time="2026-01-23T00:04:41.892440525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cnwzm,Uid:fde066f9-4c42-40d7-b19c-8aa95a694991,Namespace:kube-system,Attempt:0,} returns sandbox id \"5e0c3f729534ec0c0f17e935e6ec0beffdfcfa4a558df8e88e57a6231ed1df0b\"" Jan 23 00:04:41.897973 containerd[1664]: time="2026-01-23T00:04:41.897935181Z" level=info msg="CreateContainer within sandbox \"5e0c3f729534ec0c0f17e935e6ec0beffdfcfa4a558df8e88e57a6231ed1df0b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 23 00:04:41.905813 containerd[1664]: time="2026-01-23T00:04:41.905705963Z" level=info msg="Container 0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:04:41.918851 containerd[1664]: time="2026-01-23T00:04:41.918805881Z" level=info msg="CreateContainer within sandbox \"5e0c3f729534ec0c0f17e935e6ec0beffdfcfa4a558df8e88e57a6231ed1df0b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2\"" Jan 23 00:04:41.919656 containerd[1664]: time="2026-01-23T00:04:41.919374762Z" level=info msg="StartContainer for \"0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2\"" Jan 23 00:04:41.921284 containerd[1664]: time="2026-01-23T00:04:41.921236928Z" level=info msg="connecting to shim 0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2" address="unix:///run/containerd/s/51c80143059992c4b41ffcbbbb4620d0efcd5d06968d1fc0753a6ca836e055bc" protocol=ttrpc version=3 Jan 23 00:04:41.945940 systemd[1]: Started cri-containerd-0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2.scope - libcontainer container 0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2. 
Jan 23 00:04:41.981470 containerd[1664]: time="2026-01-23T00:04:41.981430780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-mpqkr,Uid:2e74f8fa-55eb-479d-adc6-a716ad811336,Namespace:tigera-operator,Attempt:0,}" Jan 23 00:04:41.997192 containerd[1664]: time="2026-01-23T00:04:41.997148906Z" level=info msg="connecting to shim 83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c" address="unix:///run/containerd/s/229a700e0f1566c9b92a9bc93c10e4c05db231d7da447938cfa7e0fc2dc1b4a6" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:42.019950 systemd[1]: Started cri-containerd-83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c.scope - libcontainer container 83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c. Jan 23 00:04:42.032519 containerd[1664]: time="2026-01-23T00:04:42.032476327Z" level=info msg="StartContainer for \"0c2db734ff143d5d24c82037baaf5282b689c233c6d7854ae4d2a1e69d8aefc2\" returns successfully" Jan 23 00:04:42.058791 containerd[1664]: time="2026-01-23T00:04:42.058744842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-mpqkr,Uid:2e74f8fa-55eb-479d-adc6-a716ad811336,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c\"" Jan 23 00:04:42.061949 containerd[1664]: time="2026-01-23T00:04:42.061909411Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 23 00:04:42.720250 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount988221176.mount: Deactivated successfully. Jan 23 00:04:43.780569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3335955511.mount: Deactivated successfully. 
Jan 23 00:04:44.049215 containerd[1664]: time="2026-01-23T00:04:44.049087674Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:04:44.050231 containerd[1664]: time="2026-01-23T00:04:44.050200597Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Jan 23 00:04:44.051279 containerd[1664]: time="2026-01-23T00:04:44.051250600Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:04:44.053394 containerd[1664]: time="2026-01-23T00:04:44.053350286Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:04:44.054065 containerd[1664]: time="2026-01-23T00:04:44.053891248Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.991944917s" Jan 23 00:04:44.054065 containerd[1664]: time="2026-01-23T00:04:44.053925568Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 23 00:04:44.057940 containerd[1664]: time="2026-01-23T00:04:44.057913419Z" level=info msg="CreateContainer within sandbox \"83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 23 00:04:44.066316 containerd[1664]: time="2026-01-23T00:04:44.066290523Z" level=info msg="Container 
fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:04:44.073312 containerd[1664]: time="2026-01-23T00:04:44.073196543Z" level=info msg="CreateContainer within sandbox \"83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441\"" Jan 23 00:04:44.073779 containerd[1664]: time="2026-01-23T00:04:44.073689305Z" level=info msg="StartContainer for \"fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441\"" Jan 23 00:04:44.074821 containerd[1664]: time="2026-01-23T00:04:44.074776468Z" level=info msg="connecting to shim fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441" address="unix:///run/containerd/s/229a700e0f1566c9b92a9bc93c10e4c05db231d7da447938cfa7e0fc2dc1b4a6" protocol=ttrpc version=3 Jan 23 00:04:44.090897 systemd[1]: Started cri-containerd-fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441.scope - libcontainer container fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441. 
Jan 23 00:04:44.116588 containerd[1664]: time="2026-01-23T00:04:44.116529507Z" level=info msg="StartContainer for \"fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441\" returns successfully" Jan 23 00:04:44.452246 kubelet[2916]: I0123 00:04:44.452073 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cnwzm" podStartSLOduration=3.45205795 podStartE2EDuration="3.45205795s" podCreationTimestamp="2026-01-23 00:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 00:04:42.445123951 +0000 UTC m=+7.149976319" watchObservedRunningTime="2026-01-23 00:04:44.45205795 +0000 UTC m=+9.156910278" Jan 23 00:04:44.452246 kubelet[2916]: I0123 00:04:44.452173 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-mpqkr" podStartSLOduration=1.458944391 podStartE2EDuration="3.452169231s" podCreationTimestamp="2026-01-23 00:04:41 +0000 UTC" firstStartedPulling="2026-01-23 00:04:42.06148733 +0000 UTC m=+6.766339698" lastFinishedPulling="2026-01-23 00:04:44.05471217 +0000 UTC m=+8.759564538" observedRunningTime="2026-01-23 00:04:44.452134991 +0000 UTC m=+9.156987359" watchObservedRunningTime="2026-01-23 00:04:44.452169231 +0000 UTC m=+9.157021599" Jan 23 00:04:49.121432 sudo[1964]: pam_unix(sudo:session): session closed for user root Jan 23 00:04:49.218417 sshd[1963]: Connection closed by 20.161.92.111 port 57054 Jan 23 00:04:49.218788 sshd-session[1960]: pam_unix(sshd:session): session closed for user core Jan 23 00:04:49.222031 systemd[1]: sshd@10-10.0.0.231:22-20.161.92.111:57054.service: Deactivated successfully. Jan 23 00:04:49.223799 systemd[1]: session-11.scope: Deactivated successfully. Jan 23 00:04:49.224044 systemd[1]: session-11.scope: Consumed 5.973s CPU time, 224.6M memory peak. Jan 23 00:04:49.225906 systemd-logind[1635]: Session 11 logged out. 
Waiting for processes to exit. Jan 23 00:04:49.227517 systemd-logind[1635]: Removed session 11. Jan 23 00:04:56.628201 systemd[1]: Created slice kubepods-besteffort-pod70fc89db_9f9a_4cfa_8eab_cd415d773db3.slice - libcontainer container kubepods-besteffort-pod70fc89db_9f9a_4cfa_8eab_cd415d773db3.slice. Jan 23 00:04:56.704489 kubelet[2916]: I0123 00:04:56.704310 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70fc89db-9f9a-4cfa-8eab-cd415d773db3-tigera-ca-bundle\") pod \"calico-typha-64b54b6cf5-wdzbv\" (UID: \"70fc89db-9f9a-4cfa-8eab-cd415d773db3\") " pod="calico-system/calico-typha-64b54b6cf5-wdzbv" Jan 23 00:04:56.704916 kubelet[2916]: I0123 00:04:56.704543 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/70fc89db-9f9a-4cfa-8eab-cd415d773db3-typha-certs\") pod \"calico-typha-64b54b6cf5-wdzbv\" (UID: \"70fc89db-9f9a-4cfa-8eab-cd415d773db3\") " pod="calico-system/calico-typha-64b54b6cf5-wdzbv" Jan 23 00:04:56.704916 kubelet[2916]: I0123 00:04:56.704608 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtdw9\" (UniqueName: \"kubernetes.io/projected/70fc89db-9f9a-4cfa-8eab-cd415d773db3-kube-api-access-dtdw9\") pod \"calico-typha-64b54b6cf5-wdzbv\" (UID: \"70fc89db-9f9a-4cfa-8eab-cd415d773db3\") " pod="calico-system/calico-typha-64b54b6cf5-wdzbv" Jan 23 00:04:56.830547 systemd[1]: Created slice kubepods-besteffort-podf184b3a1_e153_40ba_849a_011e8c602eac.slice - libcontainer container kubepods-besteffort-podf184b3a1_e153_40ba_849a_011e8c602eac.slice. 
Jan 23 00:04:56.905736 kubelet[2916]: I0123 00:04:56.905613 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-flexvol-driver-host\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905736 kubelet[2916]: I0123 00:04:56.905659 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f184b3a1-e153-40ba-849a-011e8c602eac-tigera-ca-bundle\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905736 kubelet[2916]: I0123 00:04:56.905679 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-var-lib-calico\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905736 kubelet[2916]: I0123 00:04:56.905695 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-xtables-lock\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905736 kubelet[2916]: I0123 00:04:56.905710 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-policysync\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905925 kubelet[2916]: I0123 00:04:56.905743 
2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-cni-log-dir\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905925 kubelet[2916]: I0123 00:04:56.905759 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-cni-net-dir\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905925 kubelet[2916]: I0123 00:04:56.905772 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f184b3a1-e153-40ba-849a-011e8c602eac-node-certs\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905925 kubelet[2916]: I0123 00:04:56.905785 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-var-run-calico\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.905925 kubelet[2916]: I0123 00:04:56.905799 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-cni-bin-dir\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.906024 kubelet[2916]: I0123 00:04:56.905815 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f184b3a1-e153-40ba-849a-011e8c602eac-lib-modules\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.906024 kubelet[2916]: I0123 00:04:56.905831 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97zwl\" (UniqueName: \"kubernetes.io/projected/f184b3a1-e153-40ba-849a-011e8c602eac-kube-api-access-97zwl\") pod \"calico-node-q7mnj\" (UID: \"f184b3a1-e153-40ba-849a-011e8c602eac\") " pod="calico-system/calico-node-q7mnj" Jan 23 00:04:56.935029 containerd[1664]: time="2026-01-23T00:04:56.934989052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64b54b6cf5-wdzbv,Uid:70fc89db-9f9a-4cfa-8eab-cd415d773db3,Namespace:calico-system,Attempt:0,}" Jan 23 00:04:56.954463 containerd[1664]: time="2026-01-23T00:04:56.954400147Z" level=info msg="connecting to shim f71346a2f39f2ec38c24868bb3702babf5d0e8475e977ced47ecaec603b502d5" address="unix:///run/containerd/s/d54889301c01a50c732f00ff367d2e1f60a8e7aaba8742cdcb6fd35be9532801" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:56.975927 systemd[1]: Started cri-containerd-f71346a2f39f2ec38c24868bb3702babf5d0e8475e977ced47ecaec603b502d5.scope - libcontainer container f71346a2f39f2ec38c24868bb3702babf5d0e8475e977ced47ecaec603b502d5. 
Jan 23 00:04:57.004763 kubelet[2916]: E0123 00:04:57.004575 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:04:57.015466 kubelet[2916]: E0123 00:04:57.015410 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.015466 kubelet[2916]: W0123 00:04:57.015452 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.015466 kubelet[2916]: E0123 00:04:57.015474 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.023942 kubelet[2916]: E0123 00:04:57.023905 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.023942 kubelet[2916]: W0123 00:04:57.023929 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.024090 kubelet[2916]: E0123 00:04:57.023953 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.032948 containerd[1664]: time="2026-01-23T00:04:57.032907293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64b54b6cf5-wdzbv,Uid:70fc89db-9f9a-4cfa-8eab-cd415d773db3,Namespace:calico-system,Attempt:0,} returns sandbox id \"f71346a2f39f2ec38c24868bb3702babf5d0e8475e977ced47ecaec603b502d5\"" Jan 23 00:04:57.034414 containerd[1664]: time="2026-01-23T00:04:57.034389097Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 23 00:04:57.091622 kubelet[2916]: E0123 00:04:57.091409 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.091622 kubelet[2916]: W0123 00:04:57.091608 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.091622 kubelet[2916]: E0123 00:04:57.091632 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.092746 kubelet[2916]: E0123 00:04:57.092174 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.092746 kubelet[2916]: W0123 00:04:57.092189 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.092746 kubelet[2916]: E0123 00:04:57.092423 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.092956 kubelet[2916]: E0123 00:04:57.092892 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.093000 kubelet[2916]: W0123 00:04:57.092970 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.093000 kubelet[2916]: E0123 00:04:57.092984 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.093214 kubelet[2916]: E0123 00:04:57.093185 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.093214 kubelet[2916]: W0123 00:04:57.093204 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.093214 kubelet[2916]: E0123 00:04:57.093213 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.093363 kubelet[2916]: E0123 00:04:57.093345 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.093363 kubelet[2916]: W0123 00:04:57.093357 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.093419 kubelet[2916]: E0123 00:04:57.093383 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.094305 kubelet[2916]: E0123 00:04:57.093493 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.094305 kubelet[2916]: W0123 00:04:57.093505 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.094305 kubelet[2916]: E0123 00:04:57.093513 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.094305 kubelet[2916]: E0123 00:04:57.093634 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.094305 kubelet[2916]: W0123 00:04:57.093642 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.094305 kubelet[2916]: E0123 00:04:57.093652 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.094305 kubelet[2916]: E0123 00:04:57.093865 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.094305 kubelet[2916]: W0123 00:04:57.093875 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.094305 kubelet[2916]: E0123 00:04:57.093884 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.094526 kubelet[2916]: E0123 00:04:57.094405 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.094526 kubelet[2916]: W0123 00:04:57.094417 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.094526 kubelet[2916]: E0123 00:04:57.094434 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.094791 kubelet[2916]: E0123 00:04:57.094775 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.094834 kubelet[2916]: W0123 00:04:57.094807 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.094834 kubelet[2916]: E0123 00:04:57.094819 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.095047 kubelet[2916]: E0123 00:04:57.095028 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.095047 kubelet[2916]: W0123 00:04:57.095045 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.095103 kubelet[2916]: E0123 00:04:57.095054 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.095239 kubelet[2916]: E0123 00:04:57.095225 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.095239 kubelet[2916]: W0123 00:04:57.095237 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.095283 kubelet[2916]: E0123 00:04:57.095245 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.095459 kubelet[2916]: E0123 00:04:57.095445 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.095459 kubelet[2916]: W0123 00:04:57.095457 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.095511 kubelet[2916]: E0123 00:04:57.095485 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.095876 kubelet[2916]: E0123 00:04:57.095754 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.095876 kubelet[2916]: W0123 00:04:57.095827 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.095957 kubelet[2916]: E0123 00:04:57.095908 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.096202 kubelet[2916]: E0123 00:04:57.096174 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.096251 kubelet[2916]: W0123 00:04:57.096219 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.096371 kubelet[2916]: E0123 00:04:57.096231 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.096650 kubelet[2916]: E0123 00:04:57.096632 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.096776 kubelet[2916]: W0123 00:04:57.096748 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.097160 kubelet[2916]: E0123 00:04:57.096832 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.097717 kubelet[2916]: E0123 00:04:57.097622 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.097717 kubelet[2916]: W0123 00:04:57.097646 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.098218 kubelet[2916]: E0123 00:04:57.097658 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.098896 kubelet[2916]: E0123 00:04:57.098872 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.098896 kubelet[2916]: W0123 00:04:57.098892 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.099142 kubelet[2916]: E0123 00:04:57.098905 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.100952 kubelet[2916]: E0123 00:04:57.100025 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.101023 kubelet[2916]: W0123 00:04:57.100960 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.101023 kubelet[2916]: E0123 00:04:57.100977 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.101353 kubelet[2916]: E0123 00:04:57.101338 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.101390 kubelet[2916]: W0123 00:04:57.101354 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.101390 kubelet[2916]: E0123 00:04:57.101366 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.107505 kubelet[2916]: E0123 00:04:57.107458 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.107505 kubelet[2916]: W0123 00:04:57.107494 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.107599 kubelet[2916]: E0123 00:04:57.107511 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.107599 kubelet[2916]: I0123 00:04:57.107539 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxzqg\" (UniqueName: \"kubernetes.io/projected/e4fbecc4-8903-42d8-8af9-1aa47331d5be-kube-api-access-vxzqg\") pod \"csi-node-driver-hdvg2\" (UID: \"e4fbecc4-8903-42d8-8af9-1aa47331d5be\") " pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:04:57.107767 kubelet[2916]: E0123 00:04:57.107748 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.107767 kubelet[2916]: W0123 00:04:57.107762 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.107815 kubelet[2916]: E0123 00:04:57.107771 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.107839 kubelet[2916]: I0123 00:04:57.107789 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4fbecc4-8903-42d8-8af9-1aa47331d5be-kubelet-dir\") pod \"csi-node-driver-hdvg2\" (UID: \"e4fbecc4-8903-42d8-8af9-1aa47331d5be\") " pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:04:57.108018 kubelet[2916]: E0123 00:04:57.108002 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.108044 kubelet[2916]: W0123 00:04:57.108019 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.108044 kubelet[2916]: E0123 00:04:57.108032 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.108204 kubelet[2916]: E0123 00:04:57.108191 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.108204 kubelet[2916]: W0123 00:04:57.108202 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.108253 kubelet[2916]: E0123 00:04:57.108212 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.109400 kubelet[2916]: E0123 00:04:57.109266 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.109400 kubelet[2916]: W0123 00:04:57.109281 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.109400 kubelet[2916]: E0123 00:04:57.109294 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.109590 kubelet[2916]: E0123 00:04:57.109578 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.110057 kubelet[2916]: W0123 00:04:57.109965 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.110234 kubelet[2916]: E0123 00:04:57.110218 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.110659 kubelet[2916]: E0123 00:04:57.110643 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.110889 kubelet[2916]: W0123 00:04:57.110688 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.110889 kubelet[2916]: E0123 00:04:57.110702 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.110889 kubelet[2916]: I0123 00:04:57.110831 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4fbecc4-8903-42d8-8af9-1aa47331d5be-socket-dir\") pod \"csi-node-driver-hdvg2\" (UID: \"e4fbecc4-8903-42d8-8af9-1aa47331d5be\") " pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:04:57.111213 kubelet[2916]: E0123 00:04:57.111173 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.111298 kubelet[2916]: W0123 00:04:57.111280 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.111420 kubelet[2916]: E0123 00:04:57.111407 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.111512 kubelet[2916]: I0123 00:04:57.111500 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4fbecc4-8903-42d8-8af9-1aa47331d5be-registration-dir\") pod \"csi-node-driver-hdvg2\" (UID: \"e4fbecc4-8903-42d8-8af9-1aa47331d5be\") " pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:04:57.111748 kubelet[2916]: E0123 00:04:57.111701 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.111748 kubelet[2916]: W0123 00:04:57.111734 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.111748 kubelet[2916]: E0123 00:04:57.111748 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.111945 kubelet[2916]: E0123 00:04:57.111928 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.111945 kubelet[2916]: W0123 00:04:57.111941 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.112003 kubelet[2916]: E0123 00:04:57.111951 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.112159 kubelet[2916]: E0123 00:04:57.112144 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.112159 kubelet[2916]: W0123 00:04:57.112156 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.112214 kubelet[2916]: E0123 00:04:57.112165 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.112214 kubelet[2916]: I0123 00:04:57.112187 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e4fbecc4-8903-42d8-8af9-1aa47331d5be-varrun\") pod \"csi-node-driver-hdvg2\" (UID: \"e4fbecc4-8903-42d8-8af9-1aa47331d5be\") " pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:04:57.112408 kubelet[2916]: E0123 00:04:57.112393 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.112408 kubelet[2916]: W0123 00:04:57.112407 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.112469 kubelet[2916]: E0123 00:04:57.112416 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.112587 kubelet[2916]: E0123 00:04:57.112574 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.112640 kubelet[2916]: W0123 00:04:57.112586 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.112640 kubelet[2916]: E0123 00:04:57.112610 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.112847 kubelet[2916]: E0123 00:04:57.112822 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.112887 kubelet[2916]: W0123 00:04:57.112847 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.112887 kubelet[2916]: E0123 00:04:57.112859 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.113044 kubelet[2916]: E0123 00:04:57.113031 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.113044 kubelet[2916]: W0123 00:04:57.113042 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.113143 kubelet[2916]: E0123 00:04:57.113051 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.135623 containerd[1664]: time="2026-01-23T00:04:57.135516787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-q7mnj,Uid:f184b3a1-e153-40ba-849a-011e8c602eac,Namespace:calico-system,Attempt:0,}" Jan 23 00:04:57.152913 containerd[1664]: time="2026-01-23T00:04:57.152868757Z" level=info msg="connecting to shim b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90" address="unix:///run/containerd/s/c30fccc952b601dbc93cb7f9a493fe7d459d9d4cb7a0a6e7d05ddb87ee357e42" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:04:57.174924 systemd[1]: Started cri-containerd-b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90.scope - libcontainer container b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90. 
Jan 23 00:04:57.200004 containerd[1664]: time="2026-01-23T00:04:57.199961972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-q7mnj,Uid:f184b3a1-e153-40ba-849a-011e8c602eac,Namespace:calico-system,Attempt:0,} returns sandbox id \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\"" Jan 23 00:04:57.212917 kubelet[2916]: E0123 00:04:57.212796 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.212917 kubelet[2916]: W0123 00:04:57.212866 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.213252 kubelet[2916]: E0123 00:04:57.213117 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.213493 kubelet[2916]: E0123 00:04:57.213477 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.213570 kubelet[2916]: W0123 00:04:57.213552 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.213633 kubelet[2916]: E0123 00:04:57.213623 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.213909 kubelet[2916]: E0123 00:04:57.213887 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.214054 kubelet[2916]: W0123 00:04:57.213900 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.214054 kubelet[2916]: E0123 00:04:57.213987 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.214278 kubelet[2916]: E0123 00:04:57.214265 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.214418 kubelet[2916]: W0123 00:04:57.214301 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.214418 kubelet[2916]: E0123 00:04:57.214313 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:57.214651 kubelet[2916]: E0123 00:04:57.214614 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.214651 kubelet[2916]: W0123 00:04:57.214626 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.214778 kubelet[2916]: E0123 00:04:57.214639 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:04:57.215132 kubelet[2916]: E0123 00:04:57.215089 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:04:57.215132 kubelet[2916]: W0123 00:04:57.215102 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:04:57.215132 kubelet[2916]: E0123 00:04:57.215112 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:04:58.495617 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount361255139.mount: Deactivated successfully. 
Jan 23 00:04:59.400032 kubelet[2916]: E0123 00:04:59.399984 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:01.399892 kubelet[2916]: E0123 00:05:01.399837 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:01.843282 containerd[1664]: time="2026-01-23T00:05:01.843237456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:01.844017 containerd[1664]: time="2026-01-23T00:05:01.843992899Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Jan 23 00:05:01.845210 containerd[1664]: time="2026-01-23T00:05:01.845181702Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:01.851138 containerd[1664]: time="2026-01-23T00:05:01.851002679Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:01.851765 containerd[1664]: time="2026-01-23T00:05:01.851704961Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag 
\"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 4.817284264s" Jan 23 00:05:01.851799 containerd[1664]: time="2026-01-23T00:05:01.851767841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 23 00:05:01.853281 containerd[1664]: time="2026-01-23T00:05:01.853244485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 23 00:05:01.862530 containerd[1664]: time="2026-01-23T00:05:01.862489552Z" level=info msg="CreateContainer within sandbox \"f71346a2f39f2ec38c24868bb3702babf5d0e8475e977ced47ecaec603b502d5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 23 00:05:01.869765 containerd[1664]: time="2026-01-23T00:05:01.869317451Z" level=info msg="Container 3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:05:01.880229 containerd[1664]: time="2026-01-23T00:05:01.880157682Z" level=info msg="CreateContainer within sandbox \"f71346a2f39f2ec38c24868bb3702babf5d0e8475e977ced47ecaec603b502d5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e\"" Jan 23 00:05:01.882225 containerd[1664]: time="2026-01-23T00:05:01.880833764Z" level=info msg="StartContainer for \"3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e\"" Jan 23 00:05:01.882785 containerd[1664]: time="2026-01-23T00:05:01.882752890Z" level=info msg="connecting to shim 3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e" address="unix:///run/containerd/s/d54889301c01a50c732f00ff367d2e1f60a8e7aaba8742cdcb6fd35be9532801" protocol=ttrpc version=3 Jan 23 00:05:01.902909 systemd[1]: Started 
cri-containerd-3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e.scope - libcontainer container 3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e. Jan 23 00:05:01.938820 containerd[1664]: time="2026-01-23T00:05:01.938780411Z" level=info msg="StartContainer for \"3f4e65290f6ba1c0171551280258f9cb6b8b24ff69aa986f4625813a5fedaf5e\" returns successfully" Jan 23 00:05:02.492694 kubelet[2916]: I0123 00:05:02.492575 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-64b54b6cf5-wdzbv" podStartSLOduration=1.674188813 podStartE2EDuration="6.49255864s" podCreationTimestamp="2026-01-23 00:04:56 +0000 UTC" firstStartedPulling="2026-01-23 00:04:57.034126376 +0000 UTC m=+21.738978744" lastFinishedPulling="2026-01-23 00:05:01.852496203 +0000 UTC m=+26.557348571" observedRunningTime="2026-01-23 00:05:02.491905998 +0000 UTC m=+27.196758406" watchObservedRunningTime="2026-01-23 00:05:02.49255864 +0000 UTC m=+27.197411008" Jan 23 00:05:02.535035 kubelet[2916]: E0123 00:05:02.534990 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.535035 kubelet[2916]: W0123 00:05:02.535015 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.535035 kubelet[2916]: E0123 00:05:02.535035 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.535381 kubelet[2916]: E0123 00:05:02.535342 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.535402 kubelet[2916]: W0123 00:05:02.535357 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.535422 kubelet[2916]: E0123 00:05:02.535402 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.535554 kubelet[2916]: E0123 00:05:02.535542 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.535554 kubelet[2916]: W0123 00:05:02.535552 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.535606 kubelet[2916]: E0123 00:05:02.535560 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.535745 kubelet[2916]: E0123 00:05:02.535733 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.535745 kubelet[2916]: W0123 00:05:02.535743 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.535796 kubelet[2916]: E0123 00:05:02.535751 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.535891 kubelet[2916]: E0123 00:05:02.535881 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.535912 kubelet[2916]: W0123 00:05:02.535902 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.535933 kubelet[2916]: E0123 00:05:02.535911 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.536052 kubelet[2916]: E0123 00:05:02.536041 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536081 kubelet[2916]: W0123 00:05:02.536053 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536081 kubelet[2916]: E0123 00:05:02.536062 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.536187 kubelet[2916]: E0123 00:05:02.536175 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536187 kubelet[2916]: W0123 00:05:02.536184 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536236 kubelet[2916]: E0123 00:05:02.536200 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.536323 kubelet[2916]: E0123 00:05:02.536313 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536346 kubelet[2916]: W0123 00:05:02.536322 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536346 kubelet[2916]: E0123 00:05:02.536330 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.536471 kubelet[2916]: E0123 00:05:02.536460 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536471 kubelet[2916]: W0123 00:05:02.536469 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536515 kubelet[2916]: E0123 00:05:02.536485 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.536602 kubelet[2916]: E0123 00:05:02.536592 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536640 kubelet[2916]: W0123 00:05:02.536600 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536640 kubelet[2916]: E0123 00:05:02.536610 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.536768 kubelet[2916]: E0123 00:05:02.536756 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536768 kubelet[2916]: W0123 00:05:02.536766 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536818 kubelet[2916]: E0123 00:05:02.536774 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.536924 kubelet[2916]: E0123 00:05:02.536914 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.536945 kubelet[2916]: W0123 00:05:02.536924 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.536945 kubelet[2916]: E0123 00:05:02.536933 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.537101 kubelet[2916]: E0123 00:05:02.537090 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.537101 kubelet[2916]: W0123 00:05:02.537099 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.537150 kubelet[2916]: E0123 00:05:02.537107 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.537234 kubelet[2916]: E0123 00:05:02.537224 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.537234 kubelet[2916]: W0123 00:05:02.537232 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.537281 kubelet[2916]: E0123 00:05:02.537240 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.537365 kubelet[2916]: E0123 00:05:02.537356 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.537427 kubelet[2916]: W0123 00:05:02.537367 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.537427 kubelet[2916]: E0123 00:05:02.537374 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.550102 kubelet[2916]: E0123 00:05:02.550080 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.550102 kubelet[2916]: W0123 00:05:02.550095 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.550273 kubelet[2916]: E0123 00:05:02.550110 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.550319 kubelet[2916]: E0123 00:05:02.550304 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.550319 kubelet[2916]: W0123 00:05:02.550316 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.550523 kubelet[2916]: E0123 00:05:02.550324 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.550605 kubelet[2916]: E0123 00:05:02.550589 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.550655 kubelet[2916]: W0123 00:05:02.550644 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.550705 kubelet[2916]: E0123 00:05:02.550695 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.550948 kubelet[2916]: E0123 00:05:02.550933 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.551262 kubelet[2916]: W0123 00:05:02.551059 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.551262 kubelet[2916]: E0123 00:05:02.551077 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 00:05:02.551528 kubelet[2916]: E0123 00:05:02.551514 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.551590 kubelet[2916]: W0123 00:05:02.551579 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.551641 kubelet[2916]: E0123 00:05:02.551631 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 00:05:02.551970 kubelet[2916]: E0123 00:05:02.551955 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 00:05:02.552050 kubelet[2916]: W0123 00:05:02.552038 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 00:05:02.552108 kubelet[2916]: E0123 00:05:02.552097 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 23 00:05:03.400115 kubelet[2916]: E0123 00:05:03.399970 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be"
Jan 23 00:05:03.487707 containerd[1664]: time="2026-01-23T00:05:03.487652495Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:05:03.489337 containerd[1664]: time="2026-01-23T00:05:03.489299780Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741"
Jan 23 00:05:03.490992 containerd[1664]: time="2026-01-23T00:05:03.490964945Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:05:03.494532 containerd[1664]: time="2026-01-23T00:05:03.494488995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 23 00:05:03.495247 containerd[1664]: time="2026-01-23T00:05:03.495225117Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.641936592s"
Jan 23 00:05:03.495288 containerd[1664]: time="2026-01-23T00:05:03.495251077Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\""
Jan 23 00:05:03.500794 containerd[1664]: time="2026-01-23T00:05:03.500766653Z" level=info msg="CreateContainer within sandbox \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 23 00:05:03.508582 containerd[1664]: time="2026-01-23T00:05:03.508532595Z" level=info msg="Container 81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda: CDI devices from CRI Config.CDIDevices: []"
Jan 23 00:05:03.519444 containerd[1664]: time="2026-01-23T00:05:03.519406666Z" level=info msg="CreateContainer within sandbox \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda\""
Jan 23 00:05:03.519931 containerd[1664]: time="2026-01-23T00:05:03.519889868Z" level=info msg="StartContainer for \"81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda\""
Jan 23 00:05:03.521442 containerd[1664]: time="2026-01-23T00:05:03.521403752Z" level=info msg="connecting to shim 81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda" address="unix:///run/containerd/s/c30fccc952b601dbc93cb7f9a493fe7d459d9d4cb7a0a6e7d05ddb87ee357e42" protocol=ttrpc version=3
Jan 23 00:05:03.540007 systemd[1]: Started cri-containerd-81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda.scope - libcontainer container 81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda.
Jan 23 00:05:03.542605 kubelet[2916]: E0123 00:05:03.542367 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 23 00:05:03.542605 kubelet[2916]: W0123 00:05:03.542597 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 23 00:05:03.542879 kubelet[2916]: E0123 00:05:03.542620 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 23 00:05:03.542879 kubelet[2916]: E0123 00:05:03.542844 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 23 00:05:03.542879 kubelet[2916]: W0123 00:05:03.542854 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 23 00:05:03.542957 kubelet[2916]: E0123 00:05:03.542863 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 23 00:05:03.543082 kubelet[2916]: E0123 00:05:03.543068 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 23 00:05:03.543082 kubelet[2916]: W0123 00:05:03.543081 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 23 00:05:03.543137 kubelet[2916]: E0123 00:05:03.543092 2916 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 23 00:05:03.595197 containerd[1664]: time="2026-01-23T00:05:03.595149164Z" level=info msg="StartContainer for \"81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda\" returns successfully"
Jan 23 00:05:03.606894 systemd[1]: cri-containerd-81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda.scope: Deactivated successfully.
Jan 23 00:05:03.611175 containerd[1664]: time="2026-01-23T00:05:03.611130210Z" level=info msg="received container exit event container_id:\"81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda\" id:\"81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda\" pid:3605 exited_at:{seconds:1769126703 nanos:610786729}" Jan 23 00:05:03.628558 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-81bd3be0294304920656e1d57257fb26e87de1dd95595e6f6a58e4735c4dcbda-rootfs.mount: Deactivated successfully. Jan 23 00:05:05.400696 kubelet[2916]: E0123 00:05:05.400588 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:07.400111 kubelet[2916]: E0123 00:05:07.399984 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:07.494666 containerd[1664]: time="2026-01-23T00:05:07.494625914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 23 00:05:09.399400 kubelet[2916]: E0123 00:05:09.399336 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:10.984447 containerd[1664]: time="2026-01-23T00:05:10.983797007Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:10.985351 containerd[1664]: time="2026-01-23T00:05:10.985306291Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Jan 23 00:05:10.986494 containerd[1664]: time="2026-01-23T00:05:10.986450614Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:10.988593 containerd[1664]: time="2026-01-23T00:05:10.988551140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:10.989161 containerd[1664]: time="2026-01-23T00:05:10.989136422Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.494467468s" Jan 23 00:05:10.989214 containerd[1664]: time="2026-01-23T00:05:10.989165382Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 23 00:05:10.994461 containerd[1664]: time="2026-01-23T00:05:10.994426157Z" level=info msg="CreateContainer within sandbox \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 23 00:05:11.002152 containerd[1664]: time="2026-01-23T00:05:11.002111579Z" level=info msg="Container 2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:05:11.010203 containerd[1664]: 
time="2026-01-23T00:05:11.010172722Z" level=info msg="CreateContainer within sandbox \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d\"" Jan 23 00:05:11.011249 containerd[1664]: time="2026-01-23T00:05:11.010557163Z" level=info msg="StartContainer for \"2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d\"" Jan 23 00:05:11.012003 containerd[1664]: time="2026-01-23T00:05:11.011970287Z" level=info msg="connecting to shim 2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d" address="unix:///run/containerd/s/c30fccc952b601dbc93cb7f9a493fe7d459d9d4cb7a0a6e7d05ddb87ee357e42" protocol=ttrpc version=3 Jan 23 00:05:11.040066 systemd[1]: Started cri-containerd-2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d.scope - libcontainer container 2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d. 
Jan 23 00:05:11.113909 containerd[1664]: time="2026-01-23T00:05:11.113868580Z" level=info msg="StartContainer for \"2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d\" returns successfully" Jan 23 00:05:11.401751 kubelet[2916]: E0123 00:05:11.400947 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:12.376731 containerd[1664]: time="2026-01-23T00:05:12.376646604Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 00:05:12.379063 systemd[1]: cri-containerd-2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d.scope: Deactivated successfully. Jan 23 00:05:12.379350 systemd[1]: cri-containerd-2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d.scope: Consumed 456ms CPU time, 188.8M memory peak, 165.9M written to disk. Jan 23 00:05:12.380341 containerd[1664]: time="2026-01-23T00:05:12.380307574Z" level=info msg="received container exit event container_id:\"2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d\" id:\"2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d\" pid:3698 exited_at:{seconds:1769126712 nanos:380057173}" Jan 23 00:05:12.399462 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2fe4289593c62a6e3513607017dfaefc6748521f4c8a688f1ba7eba6919a7e1d-rootfs.mount: Deactivated successfully. 
Jan 23 00:05:12.505275 kubelet[2916]: I0123 00:05:12.415364 2916 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 23 00:05:13.408871 systemd[1]: Created slice kubepods-burstable-pod43a640f6_edee_4332_9215_0d039b0402e6.slice - libcontainer container kubepods-burstable-pod43a640f6_edee_4332_9215_0d039b0402e6.slice. Jan 23 00:05:13.451941 kubelet[2916]: I0123 00:05:13.451896 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43a640f6-edee-4332-9215-0d039b0402e6-config-volume\") pod \"coredns-66bc5c9577-5xn2v\" (UID: \"43a640f6-edee-4332-9215-0d039b0402e6\") " pod="kube-system/coredns-66bc5c9577-5xn2v" Jan 23 00:05:13.452093 kubelet[2916]: I0123 00:05:13.451989 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbjt\" (UniqueName: \"kubernetes.io/projected/43a640f6-edee-4332-9215-0d039b0402e6-kube-api-access-vpbjt\") pod \"coredns-66bc5c9577-5xn2v\" (UID: \"43a640f6-edee-4332-9215-0d039b0402e6\") " pod="kube-system/coredns-66bc5c9577-5xn2v" Jan 23 00:05:13.578367 systemd[1]: Created slice kubepods-burstable-pod48cd25bc_c079_4c45_9a7e_c9867ffb9751.slice - libcontainer container kubepods-burstable-pod48cd25bc_c079_4c45_9a7e_c9867ffb9751.slice. Jan 23 00:05:13.585608 systemd[1]: Created slice kubepods-besteffort-pode4fbecc4_8903_42d8_8af9_1aa47331d5be.slice - libcontainer container kubepods-besteffort-pode4fbecc4_8903_42d8_8af9_1aa47331d5be.slice. Jan 23 00:05:13.590520 systemd[1]: Created slice kubepods-besteffort-pod2511aa5d_56b0_481d_8829_6daaa6eae613.slice - libcontainer container kubepods-besteffort-pod2511aa5d_56b0_481d_8829_6daaa6eae613.slice. 
Jan 23 00:05:13.656007 kubelet[2916]: I0123 00:05:13.653441 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7d6\" (UniqueName: \"kubernetes.io/projected/2511aa5d-56b0-481d-8829-6daaa6eae613-kube-api-access-nd7d6\") pod \"calico-apiserver-6b477b4fc8-6qcz2\" (UID: \"2511aa5d-56b0-481d-8829-6daaa6eae613\") " pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" Jan 23 00:05:13.656007 kubelet[2916]: I0123 00:05:13.653483 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48cd25bc-c079-4c45-9a7e-c9867ffb9751-config-volume\") pod \"coredns-66bc5c9577-rrn95\" (UID: \"48cd25bc-c079-4c45-9a7e-c9867ffb9751\") " pod="kube-system/coredns-66bc5c9577-rrn95" Jan 23 00:05:13.656007 kubelet[2916]: I0123 00:05:13.653507 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2511aa5d-56b0-481d-8829-6daaa6eae613-calico-apiserver-certs\") pod \"calico-apiserver-6b477b4fc8-6qcz2\" (UID: \"2511aa5d-56b0-481d-8829-6daaa6eae613\") " pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" Jan 23 00:05:13.656007 kubelet[2916]: I0123 00:05:13.653522 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pxzl\" (UniqueName: \"kubernetes.io/projected/48cd25bc-c079-4c45-9a7e-c9867ffb9751-kube-api-access-8pxzl\") pod \"coredns-66bc5c9577-rrn95\" (UID: \"48cd25bc-c079-4c45-9a7e-c9867ffb9751\") " pod="kube-system/coredns-66bc5c9577-rrn95" Jan 23 00:05:13.752762 systemd[1]: Created slice kubepods-besteffort-pod2c86da6b_b89e_4517_904a_9f7bcd6f830a.slice - libcontainer container kubepods-besteffort-pod2c86da6b_b89e_4517_904a_9f7bcd6f830a.slice. 
Jan 23 00:05:13.757481 containerd[1664]: time="2026-01-23T00:05:13.757433246Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdvg2,Uid:e4fbecc4-8903-42d8-8af9-1aa47331d5be,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:13.835569 containerd[1664]: time="2026-01-23T00:05:13.835300509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-5xn2v,Uid:43a640f6-edee-4332-9215-0d039b0402e6,Namespace:kube-system,Attempt:0,}" Jan 23 00:05:13.840332 systemd[1]: Created slice kubepods-besteffort-pod46de2902_bb4c_4eda_81ac_f00ac179b50d.slice - libcontainer container kubepods-besteffort-pod46de2902_bb4c_4eda_81ac_f00ac179b50d.slice. Jan 23 00:05:13.854621 systemd[1]: Created slice kubepods-besteffort-podfb016e95_3a65_4e3c_b265_30c14a6c50c6.slice - libcontainer container kubepods-besteffort-podfb016e95_3a65_4e3c_b265_30c14a6c50c6.slice. Jan 23 00:05:13.855923 kubelet[2916]: I0123 00:05:13.854919 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7tt\" (UniqueName: \"kubernetes.io/projected/2c86da6b-b89e-4517-904a-9f7bcd6f830a-kube-api-access-8j7tt\") pod \"calico-kube-controllers-557fb68f57-qftrq\" (UID: \"2c86da6b-b89e-4517-904a-9f7bcd6f830a\") " pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" Jan 23 00:05:13.855923 kubelet[2916]: I0123 00:05:13.854956 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c86da6b-b89e-4517-904a-9f7bcd6f830a-tigera-ca-bundle\") pod \"calico-kube-controllers-557fb68f57-qftrq\" (UID: \"2c86da6b-b89e-4517-904a-9f7bcd6f830a\") " pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" Jan 23 00:05:13.855923 kubelet[2916]: I0123 00:05:13.854975 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/46de2902-bb4c-4eda-81ac-f00ac179b50d-calico-apiserver-certs\") pod \"calico-apiserver-6b477b4fc8-45hrp\" (UID: \"46de2902-bb4c-4eda-81ac-f00ac179b50d\") " pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" Jan 23 00:05:13.855923 kubelet[2916]: I0123 00:05:13.855154 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkm9\" (UniqueName: \"kubernetes.io/projected/46de2902-bb4c-4eda-81ac-f00ac179b50d-kube-api-access-4pkm9\") pod \"calico-apiserver-6b477b4fc8-45hrp\" (UID: \"46de2902-bb4c-4eda-81ac-f00ac179b50d\") " pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" Jan 23 00:05:13.863463 systemd[1]: Created slice kubepods-besteffort-poda95e0f65_bd4d_4d04_8db9_17e0561272ca.slice - libcontainer container kubepods-besteffort-poda95e0f65_bd4d_4d04_8db9_17e0561272ca.slice. Jan 23 00:05:13.885086 containerd[1664]: time="2026-01-23T00:05:13.884775171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rrn95,Uid:48cd25bc-c079-4c45-9a7e-c9867ffb9751,Namespace:kube-system,Attempt:0,}" Jan 23 00:05:13.913916 containerd[1664]: time="2026-01-23T00:05:13.913863575Z" level=error msg="Failed to destroy network for sandbox \"5cd85d8198e98ac8ce8c703dc9f419122973eefb87c843cbad945b09c46f0345\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.915454 containerd[1664]: time="2026-01-23T00:05:13.915373299Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdvg2,Uid:e4fbecc4-8903-42d8-8af9-1aa47331d5be,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd85d8198e98ac8ce8c703dc9f419122973eefb87c843cbad945b09c46f0345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.915821 kubelet[2916]: E0123 00:05:13.915653 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd85d8198e98ac8ce8c703dc9f419122973eefb87c843cbad945b09c46f0345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.915821 kubelet[2916]: E0123 00:05:13.915728 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd85d8198e98ac8ce8c703dc9f419122973eefb87c843cbad945b09c46f0345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:05:13.915821 kubelet[2916]: E0123 00:05:13.915748 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd85d8198e98ac8ce8c703dc9f419122973eefb87c843cbad945b09c46f0345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hdvg2" Jan 23 00:05:13.915940 containerd[1664]: time="2026-01-23T00:05:13.915741980Z" level=error msg="Failed to destroy network for sandbox \"9201d592d19599195796fddb706982f200231a7e6b26ed3d4bb8b4dd9a7e19aa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.915966 kubelet[2916]: E0123 00:05:13.915806 2916 pod_workers.go:1324] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5cd85d8198e98ac8ce8c703dc9f419122973eefb87c843cbad945b09c46f0345\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:13.917813 containerd[1664]: time="2026-01-23T00:05:13.917668106Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-5xn2v,Uid:43a640f6-edee-4332-9215-0d039b0402e6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9201d592d19599195796fddb706982f200231a7e6b26ed3d4bb8b4dd9a7e19aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.918051 kubelet[2916]: E0123 00:05:13.918020 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9201d592d19599195796fddb706982f200231a7e6b26ed3d4bb8b4dd9a7e19aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.918367 kubelet[2916]: E0123 00:05:13.918201 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9201d592d19599195796fddb706982f200231a7e6b26ed3d4bb8b4dd9a7e19aa\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-5xn2v" Jan 23 00:05:13.918367 kubelet[2916]: E0123 00:05:13.918238 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9201d592d19599195796fddb706982f200231a7e6b26ed3d4bb8b4dd9a7e19aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-5xn2v" Jan 23 00:05:13.918367 kubelet[2916]: E0123 00:05:13.918322 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-5xn2v_kube-system(43a640f6-edee-4332-9215-0d039b0402e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-5xn2v_kube-system(43a640f6-edee-4332-9215-0d039b0402e6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9201d592d19599195796fddb706982f200231a7e6b26ed3d4bb8b4dd9a7e19aa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-5xn2v" podUID="43a640f6-edee-4332-9215-0d039b0402e6" Jan 23 00:05:13.936450 containerd[1664]: time="2026-01-23T00:05:13.936398239Z" level=error msg="Failed to destroy network for sandbox \"2d394ff47764ab9a389f9bd307d356a2a85f68c64a88e3b9f72a2659bde45938\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.937693 containerd[1664]: time="2026-01-23T00:05:13.937660923Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rrn95,Uid:48cd25bc-c079-4c45-9a7e-c9867ffb9751,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d394ff47764ab9a389f9bd307d356a2a85f68c64a88e3b9f72a2659bde45938\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.938025 kubelet[2916]: E0123 00:05:13.937972 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d394ff47764ab9a389f9bd307d356a2a85f68c64a88e3b9f72a2659bde45938\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:13.938077 kubelet[2916]: E0123 00:05:13.938027 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d394ff47764ab9a389f9bd307d356a2a85f68c64a88e3b9f72a2659bde45938\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rrn95" Jan 23 00:05:13.938077 kubelet[2916]: E0123 00:05:13.938046 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d394ff47764ab9a389f9bd307d356a2a85f68c64a88e3b9f72a2659bde45938\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rrn95" Jan 23 00:05:13.938131 kubelet[2916]: E0123 00:05:13.938087 2916 pod_workers.go:1324] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-rrn95_kube-system(48cd25bc-c079-4c45-9a7e-c9867ffb9751)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-rrn95_kube-system(48cd25bc-c079-4c45-9a7e-c9867ffb9751)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d394ff47764ab9a389f9bd307d356a2a85f68c64a88e3b9f72a2659bde45938\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-rrn95" podUID="48cd25bc-c079-4c45-9a7e-c9867ffb9751" Jan 23 00:05:13.955685 kubelet[2916]: I0123 00:05:13.955655 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb016e95-3a65-4e3c-b265-30c14a6c50c6-config\") pod \"goldmane-7c778bb748-dsv6d\" (UID: \"fb016e95-3a65-4e3c-b265-30c14a6c50c6\") " pod="calico-system/goldmane-7c778bb748-dsv6d" Jan 23 00:05:13.955685 kubelet[2916]: I0123 00:05:13.955692 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/fb016e95-3a65-4e3c-b265-30c14a6c50c6-goldmane-key-pair\") pod \"goldmane-7c778bb748-dsv6d\" (UID: \"fb016e95-3a65-4e3c-b265-30c14a6c50c6\") " pod="calico-system/goldmane-7c778bb748-dsv6d" Jan 23 00:05:13.955808 kubelet[2916]: I0123 00:05:13.955761 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbf6\" (UniqueName: \"kubernetes.io/projected/a95e0f65-bd4d-4d04-8db9-17e0561272ca-kube-api-access-6nbf6\") pod \"whisker-6ffdc7bfd5-5pm7d\" (UID: \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\") " pod="calico-system/whisker-6ffdc7bfd5-5pm7d" Jan 23 00:05:13.955808 kubelet[2916]: I0123 00:05:13.955778 2916 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-backend-key-pair\") pod \"whisker-6ffdc7bfd5-5pm7d\" (UID: \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\") " pod="calico-system/whisker-6ffdc7bfd5-5pm7d" Jan 23 00:05:13.955808 kubelet[2916]: I0123 00:05:13.955793 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-ca-bundle\") pod \"whisker-6ffdc7bfd5-5pm7d\" (UID: \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\") " pod="calico-system/whisker-6ffdc7bfd5-5pm7d" Jan 23 00:05:13.955808 kubelet[2916]: I0123 00:05:13.955807 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb016e95-3a65-4e3c-b265-30c14a6c50c6-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-dsv6d\" (UID: \"fb016e95-3a65-4e3c-b265-30c14a6c50c6\") " pod="calico-system/goldmane-7c778bb748-dsv6d" Jan 23 00:05:13.955900 kubelet[2916]: I0123 00:05:13.955823 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdhqj\" (UniqueName: \"kubernetes.io/projected/fb016e95-3a65-4e3c-b265-30c14a6c50c6-kube-api-access-hdhqj\") pod \"goldmane-7c778bb748-dsv6d\" (UID: \"fb016e95-3a65-4e3c-b265-30c14a6c50c6\") " pod="calico-system/goldmane-7c778bb748-dsv6d" Jan 23 00:05:13.960750 containerd[1664]: time="2026-01-23T00:05:13.960382068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-6qcz2,Uid:2511aa5d-56b0-481d-8829-6daaa6eae613,Namespace:calico-apiserver,Attempt:0,}" Jan 23 00:05:14.004445 containerd[1664]: time="2026-01-23T00:05:14.004303994Z" level=error msg="Failed to destroy network for sandbox 
\"8fdcb22920c439b68d36d63d5154e6b31994e40efcbd455a01931f1031b8942c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.005956 containerd[1664]: time="2026-01-23T00:05:14.005910919Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-6qcz2,Uid:2511aa5d-56b0-481d-8829-6daaa6eae613,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fdcb22920c439b68d36d63d5154e6b31994e40efcbd455a01931f1031b8942c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.006222 kubelet[2916]: E0123 00:05:14.006178 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fdcb22920c439b68d36d63d5154e6b31994e40efcbd455a01931f1031b8942c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.006292 kubelet[2916]: E0123 00:05:14.006237 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fdcb22920c439b68d36d63d5154e6b31994e40efcbd455a01931f1031b8942c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" Jan 23 00:05:14.006292 kubelet[2916]: E0123 00:05:14.006260 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8fdcb22920c439b68d36d63d5154e6b31994e40efcbd455a01931f1031b8942c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" Jan 23 00:05:14.006353 kubelet[2916]: E0123 00:05:14.006312 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8fdcb22920c439b68d36d63d5154e6b31994e40efcbd455a01931f1031b8942c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:05:14.063147 containerd[1664]: time="2026-01-23T00:05:14.063101403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-557fb68f57-qftrq,Uid:2c86da6b-b89e-4517-904a-9f7bcd6f830a,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:14.107117 containerd[1664]: time="2026-01-23T00:05:14.107072209Z" level=error msg="Failed to destroy network for sandbox \"e1c6546fb6ca483fe5e12825bb4811913ccafac940901f05efca1af5d3404f42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.109110 containerd[1664]: time="2026-01-23T00:05:14.109021775Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-557fb68f57-qftrq,Uid:2c86da6b-b89e-4517-904a-9f7bcd6f830a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1c6546fb6ca483fe5e12825bb4811913ccafac940901f05efca1af5d3404f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.109427 kubelet[2916]: E0123 00:05:14.109384 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1c6546fb6ca483fe5e12825bb4811913ccafac940901f05efca1af5d3404f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.109540 kubelet[2916]: E0123 00:05:14.109524 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1c6546fb6ca483fe5e12825bb4811913ccafac940901f05efca1af5d3404f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" Jan 23 00:05:14.109599 kubelet[2916]: E0123 00:05:14.109587 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1c6546fb6ca483fe5e12825bb4811913ccafac940901f05efca1af5d3404f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" Jan 23 00:05:14.109707 kubelet[2916]: E0123 
00:05:14.109685 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e1c6546fb6ca483fe5e12825bb4811913ccafac940901f05efca1af5d3404f42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:05:14.147118 containerd[1664]: time="2026-01-23T00:05:14.147079124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-45hrp,Uid:46de2902-bb4c-4eda-81ac-f00ac179b50d,Namespace:calico-apiserver,Attempt:0,}" Jan 23 00:05:14.164979 containerd[1664]: time="2026-01-23T00:05:14.164937495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-dsv6d,Uid:fb016e95-3a65-4e3c-b265-30c14a6c50c6,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:14.169375 containerd[1664]: time="2026-01-23T00:05:14.169234228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6ffdc7bfd5-5pm7d,Uid:a95e0f65-bd4d-4d04-8db9-17e0561272ca,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:14.195571 containerd[1664]: time="2026-01-23T00:05:14.195482503Z" level=error msg="Failed to destroy network for sandbox \"e597e1f891dabafe7de8fc0ff791d54cd9525d48be4428545354c55100879482\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.197831 containerd[1664]: 
time="2026-01-23T00:05:14.197788070Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-45hrp,Uid:46de2902-bb4c-4eda-81ac-f00ac179b50d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e597e1f891dabafe7de8fc0ff791d54cd9525d48be4428545354c55100879482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.198041 kubelet[2916]: E0123 00:05:14.198005 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e597e1f891dabafe7de8fc0ff791d54cd9525d48be4428545354c55100879482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.198099 kubelet[2916]: E0123 00:05:14.198068 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e597e1f891dabafe7de8fc0ff791d54cd9525d48be4428545354c55100879482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" Jan 23 00:05:14.198099 kubelet[2916]: E0123 00:05:14.198087 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e597e1f891dabafe7de8fc0ff791d54cd9525d48be4428545354c55100879482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" Jan 23 00:05:14.198166 kubelet[2916]: E0123 00:05:14.198142 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e597e1f891dabafe7de8fc0ff791d54cd9525d48be4428545354c55100879482\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:05:14.218240 containerd[1664]: time="2026-01-23T00:05:14.218185568Z" level=error msg="Failed to destroy network for sandbox \"1986a5323e6ae1fcdceaef502917497bea7eef3873eeca01f3a46e8ac6d9faa5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.222820 containerd[1664]: time="2026-01-23T00:05:14.222768541Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-dsv6d,Uid:fb016e95-3a65-4e3c-b265-30c14a6c50c6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1986a5323e6ae1fcdceaef502917497bea7eef3873eeca01f3a46e8ac6d9faa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.223455 kubelet[2916]: E0123 00:05:14.222988 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"1986a5323e6ae1fcdceaef502917497bea7eef3873eeca01f3a46e8ac6d9faa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.223455 kubelet[2916]: E0123 00:05:14.223037 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1986a5323e6ae1fcdceaef502917497bea7eef3873eeca01f3a46e8ac6d9faa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-dsv6d" Jan 23 00:05:14.223455 kubelet[2916]: E0123 00:05:14.223057 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1986a5323e6ae1fcdceaef502917497bea7eef3873eeca01f3a46e8ac6d9faa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-dsv6d" Jan 23 00:05:14.223582 kubelet[2916]: E0123 00:05:14.223112 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1986a5323e6ae1fcdceaef502917497bea7eef3873eeca01f3a46e8ac6d9faa5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:05:14.230071 containerd[1664]: time="2026-01-23T00:05:14.230035882Z" level=error msg="Failed to destroy network for sandbox \"87d73bc8405de6e8cf8dcffbbc214a2857e301f79818443a8af17985ba4c0332\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.231585 containerd[1664]: time="2026-01-23T00:05:14.231526686Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6ffdc7bfd5-5pm7d,Uid:a95e0f65-bd4d-4d04-8db9-17e0561272ca,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"87d73bc8405de6e8cf8dcffbbc214a2857e301f79818443a8af17985ba4c0332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.232210 kubelet[2916]: E0123 00:05:14.231749 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87d73bc8405de6e8cf8dcffbbc214a2857e301f79818443a8af17985ba4c0332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 00:05:14.232210 kubelet[2916]: E0123 00:05:14.231811 2916 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87d73bc8405de6e8cf8dcffbbc214a2857e301f79818443a8af17985ba4c0332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-6ffdc7bfd5-5pm7d" Jan 23 00:05:14.232210 kubelet[2916]: E0123 00:05:14.231829 2916 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87d73bc8405de6e8cf8dcffbbc214a2857e301f79818443a8af17985ba4c0332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6ffdc7bfd5-5pm7d" Jan 23 00:05:14.232342 kubelet[2916]: E0123 00:05:14.231879 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6ffdc7bfd5-5pm7d_calico-system(a95e0f65-bd4d-4d04-8db9-17e0561272ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6ffdc7bfd5-5pm7d_calico-system(a95e0f65-bd4d-4d04-8db9-17e0561272ca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87d73bc8405de6e8cf8dcffbbc214a2857e301f79818443a8af17985ba4c0332\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6ffdc7bfd5-5pm7d" podUID="a95e0f65-bd4d-4d04-8db9-17e0561272ca" Jan 23 00:05:14.518600 containerd[1664]: time="2026-01-23T00:05:14.517944148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 23 00:05:14.572455 systemd[1]: run-netns-cni\x2d69df080e\x2d8369\x2dfc5f\x2def03\x2d7171138f4b89.mount: Deactivated successfully. Jan 23 00:05:20.781991 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3786205574.mount: Deactivated successfully. 
Jan 23 00:05:20.808059 containerd[1664]: time="2026-01-23T00:05:20.808006158Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:20.809687 containerd[1664]: time="2026-01-23T00:05:20.809653843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Jan 23 00:05:20.810750 containerd[1664]: time="2026-01-23T00:05:20.810573606Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:20.812907 containerd[1664]: time="2026-01-23T00:05:20.812859692Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 00:05:20.813338 containerd[1664]: time="2026-01-23T00:05:20.813295533Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.294686543s" Jan 23 00:05:20.813338 containerd[1664]: time="2026-01-23T00:05:20.813331094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 23 00:05:20.826086 containerd[1664]: time="2026-01-23T00:05:20.826047770Z" level=info msg="CreateContainer within sandbox \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 23 00:05:20.836936 containerd[1664]: time="2026-01-23T00:05:20.836891121Z" level=info msg="Container 
1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:05:20.845481 containerd[1664]: time="2026-01-23T00:05:20.845430266Z" level=info msg="CreateContainer within sandbox \"b9e2f9993ce47e8c35ae5bd99614b275dbd2b9475fe3daf46903ffd0fcbd9e90\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547\"" Jan 23 00:05:20.846224 containerd[1664]: time="2026-01-23T00:05:20.845942267Z" level=info msg="StartContainer for \"1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547\"" Jan 23 00:05:20.847569 containerd[1664]: time="2026-01-23T00:05:20.847520592Z" level=info msg="connecting to shim 1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547" address="unix:///run/containerd/s/c30fccc952b601dbc93cb7f9a493fe7d459d9d4cb7a0a6e7d05ddb87ee357e42" protocol=ttrpc version=3 Jan 23 00:05:20.870042 systemd[1]: Started cri-containerd-1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547.scope - libcontainer container 1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547. Jan 23 00:05:20.955970 containerd[1664]: time="2026-01-23T00:05:20.955820822Z" level=info msg="StartContainer for \"1e25023faf5d16ae62f984040108117f2e5bcf93feae7b0dcd90ee9709a13547\" returns successfully" Jan 23 00:05:21.098969 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 23 00:05:21.099076 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 23 00:05:21.307980 kubelet[2916]: I0123 00:05:21.307933 2916 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-ca-bundle\") pod \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\" (UID: \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\") " Jan 23 00:05:21.308974 kubelet[2916]: I0123 00:05:21.308510 2916 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbf6\" (UniqueName: \"kubernetes.io/projected/a95e0f65-bd4d-4d04-8db9-17e0561272ca-kube-api-access-6nbf6\") pod \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\" (UID: \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\") " Jan 23 00:05:21.308974 kubelet[2916]: I0123 00:05:21.308466 2916 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "a95e0f65-bd4d-4d04-8db9-17e0561272ca" (UID: "a95e0f65-bd4d-4d04-8db9-17e0561272ca"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 23 00:05:21.308974 kubelet[2916]: I0123 00:05:21.308571 2916 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-backend-key-pair\") pod \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\" (UID: \"a95e0f65-bd4d-4d04-8db9-17e0561272ca\") " Jan 23 00:05:21.308974 kubelet[2916]: I0123 00:05:21.308670 2916 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-ca-bundle\") on node \"ci-4459-2-2-n-22c0b85714\" DevicePath \"\"" Jan 23 00:05:21.311662 kubelet[2916]: I0123 00:05:21.311630 2916 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95e0f65-bd4d-4d04-8db9-17e0561272ca-kube-api-access-6nbf6" (OuterVolumeSpecName: "kube-api-access-6nbf6") pod "a95e0f65-bd4d-4d04-8db9-17e0561272ca" (UID: "a95e0f65-bd4d-4d04-8db9-17e0561272ca"). InnerVolumeSpecName "kube-api-access-6nbf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 23 00:05:21.312496 kubelet[2916]: I0123 00:05:21.312467 2916 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "a95e0f65-bd4d-4d04-8db9-17e0561272ca" (UID: "a95e0f65-bd4d-4d04-8db9-17e0561272ca"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 23 00:05:21.406012 systemd[1]: Removed slice kubepods-besteffort-poda95e0f65_bd4d_4d04_8db9_17e0561272ca.slice - libcontainer container kubepods-besteffort-poda95e0f65_bd4d_4d04_8db9_17e0561272ca.slice. 
Jan 23 00:05:21.409037 kubelet[2916]: I0123 00:05:21.409009 2916 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6nbf6\" (UniqueName: \"kubernetes.io/projected/a95e0f65-bd4d-4d04-8db9-17e0561272ca-kube-api-access-6nbf6\") on node \"ci-4459-2-2-n-22c0b85714\" DevicePath \"\"" Jan 23 00:05:21.409150 kubelet[2916]: I0123 00:05:21.409138 2916 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a95e0f65-bd4d-4d04-8db9-17e0561272ca-whisker-backend-key-pair\") on node \"ci-4459-2-2-n-22c0b85714\" DevicePath \"\"" Jan 23 00:05:21.556753 kubelet[2916]: I0123 00:05:21.555570 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-q7mnj" podStartSLOduration=1.942496503 podStartE2EDuration="25.555555863s" podCreationTimestamp="2026-01-23 00:04:56 +0000 UTC" firstStartedPulling="2026-01-23 00:04:57.201244336 +0000 UTC m=+21.906096704" lastFinishedPulling="2026-01-23 00:05:20.814303696 +0000 UTC m=+45.519156064" observedRunningTime="2026-01-23 00:05:21.552562695 +0000 UTC m=+46.257415103" watchObservedRunningTime="2026-01-23 00:05:21.555555863 +0000 UTC m=+46.260408191" Jan 23 00:05:21.618021 systemd[1]: Created slice kubepods-besteffort-pod01440e69_1437_4f1f_8ae8_f18c381a9217.slice - libcontainer container kubepods-besteffort-pod01440e69_1437_4f1f_8ae8_f18c381a9217.slice. 
Jan 23 00:05:21.711907 kubelet[2916]: I0123 00:05:21.711866 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/01440e69-1437-4f1f-8ae8-f18c381a9217-whisker-backend-key-pair\") pod \"whisker-59b64767c-5cv6g\" (UID: \"01440e69-1437-4f1f-8ae8-f18c381a9217\") " pod="calico-system/whisker-59b64767c-5cv6g" Jan 23 00:05:21.711907 kubelet[2916]: I0123 00:05:21.711908 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01440e69-1437-4f1f-8ae8-f18c381a9217-whisker-ca-bundle\") pod \"whisker-59b64767c-5cv6g\" (UID: \"01440e69-1437-4f1f-8ae8-f18c381a9217\") " pod="calico-system/whisker-59b64767c-5cv6g" Jan 23 00:05:21.712074 kubelet[2916]: I0123 00:05:21.711928 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx29p\" (UniqueName: \"kubernetes.io/projected/01440e69-1437-4f1f-8ae8-f18c381a9217-kube-api-access-lx29p\") pod \"whisker-59b64767c-5cv6g\" (UID: \"01440e69-1437-4f1f-8ae8-f18c381a9217\") " pod="calico-system/whisker-59b64767c-5cv6g" Jan 23 00:05:21.783203 systemd[1]: var-lib-kubelet-pods-a95e0f65\x2dbd4d\x2d4d04\x2d8db9\x2d17e0561272ca-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6nbf6.mount: Deactivated successfully. Jan 23 00:05:21.783297 systemd[1]: var-lib-kubelet-pods-a95e0f65\x2dbd4d\x2d4d04\x2d8db9\x2d17e0561272ca-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jan 23 00:05:21.925026 containerd[1664]: time="2026-01-23T00:05:21.924908883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59b64767c-5cv6g,Uid:01440e69-1437-4f1f-8ae8-f18c381a9217,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:22.054903 systemd-networkd[1519]: calif19ff4f1536: Link UP Jan 23 00:05:22.056766 systemd-networkd[1519]: calif19ff4f1536: Gained carrier Jan 23 00:05:22.068819 containerd[1664]: 2026-01-23 00:05:21.945 [INFO][4072] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 00:05:22.068819 containerd[1664]: 2026-01-23 00:05:21.965 [INFO][4072] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0 whisker-59b64767c- calico-system 01440e69-1437-4f1f-8ae8-f18c381a9217 896 0 2026-01-23 00:05:21 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:59b64767c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 whisker-59b64767c-5cv6g eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif19ff4f1536 [] [] }} ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-" Jan 23 00:05:22.068819 containerd[1664]: 2026-01-23 00:05:21.965 [INFO][4072] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.068819 containerd[1664]: 2026-01-23 00:05:22.009 [INFO][4087] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" HandleID="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Workload="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.009 [INFO][4087] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" HandleID="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Workload="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000585f00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-22c0b85714", "pod":"whisker-59b64767c-5cv6g", "timestamp":"2026-01-23 00:05:22.009514246 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.009 [INFO][4087] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.009 [INFO][4087] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.009 [INFO][4087] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.020 [INFO][4087] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.025 [INFO][4087] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.029 [INFO][4087] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.030 [INFO][4087] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069064 containerd[1664]: 2026-01-23 00:05:22.032 [INFO][4087] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.033 [INFO][4087] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.034 [INFO][4087] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.038 [INFO][4087] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.044 [INFO][4087] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.129/26] block=192.168.118.128/26 handle="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.044 [INFO][4087] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.129/26] handle="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.044 [INFO][4087] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 00:05:22.069247 containerd[1664]: 2026-01-23 00:05:22.044 [INFO][4087] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.129/26] IPv6=[] ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" HandleID="k8s-pod-network.7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Workload="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.069371 containerd[1664]: 2026-01-23 00:05:22.046 [INFO][4072] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0", GenerateName:"whisker-59b64767c-", Namespace:"calico-system", SelfLink:"", UID:"01440e69-1437-4f1f-8ae8-f18c381a9217", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 5, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"59b64767c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"whisker-59b64767c-5cv6g", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.118.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif19ff4f1536", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:22.069371 containerd[1664]: 2026-01-23 00:05:22.046 [INFO][4072] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.129/32] ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.069478 containerd[1664]: 2026-01-23 00:05:22.046 [INFO][4072] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif19ff4f1536 ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.069478 containerd[1664]: 2026-01-23 00:05:22.055 [INFO][4072] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.069523 containerd[1664]: 2026-01-23 00:05:22.057 [INFO][4072] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0", GenerateName:"whisker-59b64767c-", Namespace:"calico-system", SelfLink:"", UID:"01440e69-1437-4f1f-8ae8-f18c381a9217", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 5, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"59b64767c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c", Pod:"whisker-59b64767c-5cv6g", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.118.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif19ff4f1536", MAC:"8e:1c:5f:8f:04:c4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:22.069570 containerd[1664]: 2026-01-23 00:05:22.067 [INFO][4072] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" Namespace="calico-system" Pod="whisker-59b64767c-5cv6g" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-whisker--59b64767c--5cv6g-eth0" Jan 23 00:05:22.091807 containerd[1664]: time="2026-01-23T00:05:22.091765402Z" level=info msg="connecting to shim 7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c" address="unix:///run/containerd/s/e533ed2edef1a73ff44fafbe8310d7f3f0e61a3b837f0effd43ed62fc343397c" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:05:22.122944 systemd[1]: Started cri-containerd-7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c.scope - libcontainer container 7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c. Jan 23 00:05:22.163121 containerd[1664]: time="2026-01-23T00:05:22.163072647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59b64767c-5cv6g,Uid:01440e69-1437-4f1f-8ae8-f18c381a9217,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a28a8d3e88b9fd5ad38db70f55f3109e3784507685ad3fdb4ceb6d502048e4c\"" Jan 23 00:05:22.165429 containerd[1664]: time="2026-01-23T00:05:22.165401293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 00:05:22.520048 containerd[1664]: time="2026-01-23T00:05:22.519957191Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:22.521577 containerd[1664]: time="2026-01-23T00:05:22.521444475Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 00:05:22.521645 containerd[1664]: time="2026-01-23T00:05:22.521616076Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 00:05:22.521997 
kubelet[2916]: E0123 00:05:22.521952 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 00:05:22.522363 kubelet[2916]: E0123 00:05:22.522008 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 00:05:22.522363 kubelet[2916]: E0123 00:05:22.522082 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:22.523430 containerd[1664]: time="2026-01-23T00:05:22.523397321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 00:05:22.740992 systemd-networkd[1519]: vxlan.calico: Link UP Jan 23 00:05:22.741000 systemd-networkd[1519]: vxlan.calico: Gained carrier Jan 23 00:05:22.879798 containerd[1664]: time="2026-01-23T00:05:22.879683783Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:22.881576 containerd[1664]: time="2026-01-23T00:05:22.881477828Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 00:05:22.881576 containerd[1664]: time="2026-01-23T00:05:22.881549829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 00:05:22.881795 kubelet[2916]: E0123 00:05:22.881755 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 00:05:22.881922 kubelet[2916]: E0123 00:05:22.881876 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 00:05:22.882218 kubelet[2916]: E0123 00:05:22.882196 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:22.882312 kubelet[2916]: E0123 00:05:22.882285 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:05:23.109952 systemd-networkd[1519]: calif19ff4f1536: Gained IPv6LL Jan 23 00:05:23.402778 kubelet[2916]: I0123 00:05:23.402676 2916 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95e0f65-bd4d-4d04-8db9-17e0561272ca" path="/var/lib/kubelet/pods/a95e0f65-bd4d-4d04-8db9-17e0561272ca/volumes" Jan 23 00:05:23.544743 kubelet[2916]: E0123 00:05:23.544672 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" 
podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:05:24.006030 systemd-networkd[1519]: vxlan.calico: Gained IPv6LL Jan 23 00:05:24.296276 kubelet[2916]: I0123 00:05:24.296029 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 00:05:24.404489 containerd[1664]: time="2026-01-23T00:05:24.404450559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdvg2,Uid:e4fbecc4-8903-42d8-8af9-1aa47331d5be,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:24.406136 containerd[1664]: time="2026-01-23T00:05:24.406106163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rrn95,Uid:48cd25bc-c079-4c45-9a7e-c9867ffb9751,Namespace:kube-system,Attempt:0,}" Jan 23 00:05:24.526008 systemd-networkd[1519]: calia3b2b9c13ae: Link UP Jan 23 00:05:24.526353 systemd-networkd[1519]: calia3b2b9c13ae: Gained carrier Jan 23 00:05:24.540691 containerd[1664]: 2026-01-23 00:05:24.455 [INFO][4410] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0 coredns-66bc5c9577- kube-system 48cd25bc-c079-4c45-9a7e-c9867ffb9751 826 0 2026-01-23 00:04:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 coredns-66bc5c9577-rrn95 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia3b2b9c13ae [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-" Jan 23 00:05:24.540691 containerd[1664]: 2026-01-23 00:05:24.455 [INFO][4410] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.540691 containerd[1664]: 2026-01-23 00:05:24.486 [INFO][4433] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" HandleID="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Workload="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.486 [INFO][4433] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" HandleID="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Workload="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000528950), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-n-22c0b85714", "pod":"coredns-66bc5c9577-rrn95", "timestamp":"2026-01-23 00:05:24.486150193 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.486 [INFO][4433] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.486 [INFO][4433] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.486 [INFO][4433] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.497 [INFO][4433] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.501 [INFO][4433] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.505 [INFO][4433] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.507 [INFO][4433] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.540907 containerd[1664]: 2026-01-23 00:05:24.510 [INFO][4433] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.510 [INFO][4433] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.511 [INFO][4433] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74 Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.515 [INFO][4433] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.521 [INFO][4433] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.130/26] block=192.168.118.128/26 handle="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.521 [INFO][4433] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.130/26] handle="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.521 [INFO][4433] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 00:05:24.541101 containerd[1664]: 2026-01-23 00:05:24.521 [INFO][4433] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.130/26] IPv6=[] ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" HandleID="k8s-pod-network.d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Workload="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.541262 containerd[1664]: 2026-01-23 00:05:24.524 [INFO][4410] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"48cd25bc-c079-4c45-9a7e-c9867ffb9751", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"coredns-66bc5c9577-rrn95", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia3b2b9c13ae", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:24.541262 containerd[1664]: 2026-01-23 00:05:24.524 [INFO][4410] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.130/32] ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.541262 containerd[1664]: 2026-01-23 00:05:24.524 [INFO][4410] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia3b2b9c13ae 
ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.541262 containerd[1664]: 2026-01-23 00:05:24.526 [INFO][4410] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.541262 containerd[1664]: 2026-01-23 00:05:24.528 [INFO][4410] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"48cd25bc-c079-4c45-9a7e-c9867ffb9751", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", 
ContainerID:"d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74", Pod:"coredns-66bc5c9577-rrn95", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia3b2b9c13ae", MAC:"e6:4e:15:94:54:ae", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:24.541448 containerd[1664]: 2026-01-23 00:05:24.538 [INFO][4410] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" Namespace="kube-system" Pod="coredns-66bc5c9577-rrn95" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--rrn95-eth0" Jan 23 00:05:24.560318 containerd[1664]: time="2026-01-23T00:05:24.560209606Z" level=info msg="connecting to shim d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74" address="unix:///run/containerd/s/6516630d309c9d4e817435710fd0916e8f36c8da0b7a1b6fe327df59448dcd45" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:05:24.588010 systemd[1]: Started 
cri-containerd-d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74.scope - libcontainer container d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74. Jan 23 00:05:24.630617 containerd[1664]: time="2026-01-23T00:05:24.630575648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rrn95,Uid:48cd25bc-c079-4c45-9a7e-c9867ffb9751,Namespace:kube-system,Attempt:0,} returns sandbox id \"d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74\"" Jan 23 00:05:24.637083 containerd[1664]: time="2026-01-23T00:05:24.637044146Z" level=info msg="CreateContainer within sandbox \"d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 00:05:24.639140 systemd-networkd[1519]: cali6dd1fb9d4ab: Link UP Jan 23 00:05:24.640438 systemd-networkd[1519]: cali6dd1fb9d4ab: Gained carrier Jan 23 00:05:24.657918 containerd[1664]: time="2026-01-23T00:05:24.657810726Z" level=info msg="Container 47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.468 [INFO][4409] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0 csi-node-driver- calico-system e4fbecc4-8903-42d8-8af9-1aa47331d5be 699 0 2026-01-23 00:04:56 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 csi-node-driver-hdvg2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6dd1fb9d4ab [] [] }} 
ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.469 [INFO][4409] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.495 [INFO][4440] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" HandleID="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Workload="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.495 [INFO][4440] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" HandleID="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Workload="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005973c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-22c0b85714", "pod":"csi-node-driver-hdvg2", "timestamp":"2026-01-23 00:05:24.49558234 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.495 [INFO][4440] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.521 [INFO][4440] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.522 [INFO][4440] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.598 [INFO][4440] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.607 [INFO][4440] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.611 [INFO][4440] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.613 [INFO][4440] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.617 [INFO][4440] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.617 [INFO][4440] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.619 [INFO][4440] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.625 [INFO][4440] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 
handle="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.633 [INFO][4440] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.118.131/26] block=192.168.118.128/26 handle="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.633 [INFO][4440] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.131/26] handle="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.633 [INFO][4440] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 00:05:24.659506 containerd[1664]: 2026-01-23 00:05:24.633 [INFO][4440] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.131/26] IPv6=[] ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" HandleID="k8s-pod-network.d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Workload="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.661553 containerd[1664]: 2026-01-23 00:05:24.635 [INFO][4409] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e4fbecc4-8903-42d8-8af9-1aa47331d5be", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 56, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"csi-node-driver-hdvg2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.118.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6dd1fb9d4ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:24.661553 containerd[1664]: 2026-01-23 00:05:24.636 [INFO][4409] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.131/32] ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.661553 containerd[1664]: 2026-01-23 00:05:24.636 [INFO][4409] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6dd1fb9d4ab ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.661553 containerd[1664]: 2026-01-23 00:05:24.640 [INFO][4409] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.661553 containerd[1664]: 2026-01-23 00:05:24.641 [INFO][4409] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e4fbecc4-8903-42d8-8af9-1aa47331d5be", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d", Pod:"csi-node-driver-hdvg2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.118.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6dd1fb9d4ab", MAC:"ea:7f:40:41:76:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:24.661553 containerd[1664]: 2026-01-23 00:05:24.655 [INFO][4409] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" Namespace="calico-system" Pod="csi-node-driver-hdvg2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-csi--node--driver--hdvg2-eth0" Jan 23 00:05:24.667577 containerd[1664]: time="2026-01-23T00:05:24.667541714Z" level=info msg="CreateContainer within sandbox \"d261b16e7394499152df2d1146755da434028a7b222bad89eb3ddc609f44bb74\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649\"" Jan 23 00:05:24.669540 containerd[1664]: time="2026-01-23T00:05:24.668740197Z" level=info msg="StartContainer for \"47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649\"" Jan 23 00:05:24.669793 containerd[1664]: time="2026-01-23T00:05:24.669762360Z" level=info msg="connecting to shim 47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649" address="unix:///run/containerd/s/6516630d309c9d4e817435710fd0916e8f36c8da0b7a1b6fe327df59448dcd45" protocol=ttrpc version=3 Jan 23 00:05:24.690199 containerd[1664]: time="2026-01-23T00:05:24.690086338Z" level=info msg="connecting to shim d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d" address="unix:///run/containerd/s/245b110b20e0d7ff116cb24b39671fce8e75a3a0c6cb5dae1d9c704e1fe9bd91" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:05:24.695367 systemd[1]: Started cri-containerd-47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649.scope - libcontainer container 47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649. 
Jan 23 00:05:24.718955 systemd[1]: Started cri-containerd-d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d.scope - libcontainer container d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d. Jan 23 00:05:24.733829 containerd[1664]: time="2026-01-23T00:05:24.733762024Z" level=info msg="StartContainer for \"47f514bbbc2a1bc6da31567def5266f4a97bc69317bc7cd1c23402c4c303b649\" returns successfully" Jan 23 00:05:24.759348 containerd[1664]: time="2026-01-23T00:05:24.759293297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdvg2,Uid:e4fbecc4-8903-42d8-8af9-1aa47331d5be,Namespace:calico-system,Attempt:0,} returns sandbox id \"d9ef760bfd1d3ac13e2404607ab55b52953aadca82d80fb0f81adfddb074845d\"" Jan 23 00:05:24.761961 containerd[1664]: time="2026-01-23T00:05:24.761920424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 00:05:25.095399 containerd[1664]: time="2026-01-23T00:05:25.095311061Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:25.097143 containerd[1664]: time="2026-01-23T00:05:25.097089786Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 00:05:25.097143 containerd[1664]: time="2026-01-23T00:05:25.097123306Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 00:05:25.097381 kubelet[2916]: E0123 00:05:25.097348 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 
23 00:05:25.097937 kubelet[2916]: E0123 00:05:25.097667 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:05:25.097937 kubelet[2916]: E0123 00:05:25.097781 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:25.099796 containerd[1664]: time="2026-01-23T00:05:25.099770074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 00:05:25.404509 containerd[1664]: time="2026-01-23T00:05:25.404347948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-dsv6d,Uid:fb016e95-3a65-4e3c-b265-30c14a6c50c6,Namespace:calico-system,Attempt:0,}" Jan 23 00:05:25.413176 containerd[1664]: time="2026-01-23T00:05:25.413139933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-5xn2v,Uid:43a640f6-edee-4332-9215-0d039b0402e6,Namespace:kube-system,Attempt:0,}" Jan 23 00:05:25.417357 containerd[1664]: time="2026-01-23T00:05:25.417296225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-6qcz2,Uid:2511aa5d-56b0-481d-8829-6daaa6eae613,Namespace:calico-apiserver,Attempt:0,}" Jan 23 00:05:25.447593 containerd[1664]: time="2026-01-23T00:05:25.447537912Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:25.457765 containerd[1664]: time="2026-01-23T00:05:25.457690461Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 00:05:25.457881 containerd[1664]: time="2026-01-23T00:05:25.457783581Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 00:05:25.458045 kubelet[2916]: E0123 00:05:25.458008 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:05:25.458461 kubelet[2916]: E0123 00:05:25.458124 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:05:25.458461 kubelet[2916]: E0123 00:05:25.458195 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:25.458461 kubelet[2916]: E0123 00:05:25.458235 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:25.549932 kubelet[2916]: E0123 00:05:25.549877 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" 
podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:25.571411 systemd-networkd[1519]: cali7c8661fa377: Link UP Jan 23 00:05:25.575018 systemd-networkd[1519]: cali7c8661fa377: Gained carrier Jan 23 00:05:25.588252 kubelet[2916]: I0123 00:05:25.588153 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-rrn95" podStartSLOduration=44.588134995 podStartE2EDuration="44.588134995s" podCreationTimestamp="2026-01-23 00:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 00:05:25.578978289 +0000 UTC m=+50.283830657" watchObservedRunningTime="2026-01-23 00:05:25.588134995 +0000 UTC m=+50.292987363" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.475 [INFO][4599] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0 goldmane-7c778bb748- calico-system fb016e95-3a65-4e3c-b265-30c14a6c50c6 831 0 2026-01-23 00:04:54 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 goldmane-7c778bb748-dsv6d eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7c8661fa377 [] [] }} ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.475 [INFO][4599] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" 
WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.509 [INFO][4641] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" HandleID="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Workload="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.510 [INFO][4641] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" HandleID="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Workload="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003b9d50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-22c0b85714", "pod":"goldmane-7c778bb748-dsv6d", "timestamp":"2026-01-23 00:05:25.50956501 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.510 [INFO][4641] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.510 [INFO][4641] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.510 [INFO][4641] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.521 [INFO][4641] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.529 [INFO][4641] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.535 [INFO][4641] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.537 [INFO][4641] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.540 [INFO][4641] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.540 [INFO][4641] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.543 [INFO][4641] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8 Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.547 [INFO][4641] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.556 [INFO][4641] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.132/26] block=192.168.118.128/26 handle="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.556 [INFO][4641] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.132/26] handle="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.556 [INFO][4641] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 00:05:25.590762 containerd[1664]: 2026-01-23 00:05:25.556 [INFO][4641] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.132/26] IPv6=[] ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" HandleID="k8s-pod-network.69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Workload="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.591232 containerd[1664]: 2026-01-23 00:05:25.565 [INFO][4599] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"fb016e95-3a65-4e3c-b265-30c14a6c50c6", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"goldmane-7c778bb748-dsv6d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.118.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7c8661fa377", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:25.591232 containerd[1664]: 2026-01-23 00:05:25.565 [INFO][4599] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.132/32] ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.591232 containerd[1664]: 2026-01-23 00:05:25.565 [INFO][4599] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7c8661fa377 ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.591232 containerd[1664]: 2026-01-23 00:05:25.575 [INFO][4599] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.591232 containerd[1664]: 2026-01-23 00:05:25.575 [INFO][4599] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"fb016e95-3a65-4e3c-b265-30c14a6c50c6", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8", Pod:"goldmane-7c778bb748-dsv6d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.118.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7c8661fa377", MAC:"76:ab:63:a7:d3:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:25.591232 containerd[1664]: 2026-01-23 00:05:25.587 [INFO][4599] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" Namespace="calico-system" Pod="goldmane-7c778bb748-dsv6d" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-goldmane--7c778bb748--dsv6d-eth0" Jan 23 00:05:25.641900 containerd[1664]: time="2026-01-23T00:05:25.641853950Z" level=info msg="connecting to shim 69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8" address="unix:///run/containerd/s/79846ecfb029f545a66dc1b23b0e5dcc1dccf94ab83b66bf928a40d0b63ad0b0" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:05:25.664743 systemd-networkd[1519]: calie34b59476f6: Link UP Jan 23 00:05:25.664934 systemd-networkd[1519]: calie34b59476f6: Gained carrier Jan 23 00:05:25.673048 systemd[1]: Started cri-containerd-69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8.scope - libcontainer container 69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8. Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.504 [INFO][4613] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0 coredns-66bc5c9577- kube-system 43a640f6-edee-4332-9215-0d039b0402e6 823 0 2026-01-23 00:04:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 coredns-66bc5c9577-5xn2v eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie34b59476f6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.504 
[INFO][4613] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.535 [INFO][4654] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" HandleID="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Workload="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.535 [INFO][4654] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" HandleID="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Workload="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001aee60), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-n-22c0b85714", "pod":"coredns-66bc5c9577-5xn2v", "timestamp":"2026-01-23 00:05:25.535816725 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.536 [INFO][4654] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.556 [INFO][4654] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.556 [INFO][4654] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.623 [INFO][4654] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.629 [INFO][4654] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.638 [INFO][4654] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.642 [INFO][4654] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.645 [INFO][4654] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.645 [INFO][4654] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.647 [INFO][4654] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.651 [INFO][4654] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.658 [INFO][4654] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.133/26] block=192.168.118.128/26 handle="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.659 [INFO][4654] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.133/26] handle="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.659 [INFO][4654] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 00:05:25.684044 containerd[1664]: 2026-01-23 00:05:25.659 [INFO][4654] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.133/26] IPv6=[] ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" HandleID="k8s-pod-network.e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Workload="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.684756 containerd[1664]: 2026-01-23 00:05:25.662 [INFO][4613] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"43a640f6-edee-4332-9215-0d039b0402e6", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"coredns-66bc5c9577-5xn2v", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie34b59476f6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:25.684756 containerd[1664]: 2026-01-23 00:05:25.662 [INFO][4613] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.133/32] ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.684756 containerd[1664]: 2026-01-23 00:05:25.662 [INFO][4613] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie34b59476f6 
ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.684756 containerd[1664]: 2026-01-23 00:05:25.664 [INFO][4613] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.684756 containerd[1664]: 2026-01-23 00:05:25.665 [INFO][4613] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"43a640f6-edee-4332-9215-0d039b0402e6", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", 
ContainerID:"e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f", Pod:"coredns-66bc5c9577-5xn2v", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.118.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie34b59476f6", MAC:"4e:31:1c:55:06:12", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:25.685109 containerd[1664]: 2026-01-23 00:05:25.680 [INFO][4613] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" Namespace="kube-system" Pod="coredns-66bc5c9577-5xn2v" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-coredns--66bc5c9577--5xn2v-eth0" Jan 23 00:05:25.716601 containerd[1664]: time="2026-01-23T00:05:25.716560324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-dsv6d,Uid:fb016e95-3a65-4e3c-b265-30c14a6c50c6,Namespace:calico-system,Attempt:0,} returns sandbox id \"69dfc738600c7dbd079b6f6b01e9aab5db75da6624841496302e1325930058b8\"" Jan 23 00:05:25.718244 containerd[1664]: time="2026-01-23T00:05:25.718169689Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 00:05:25.723740 containerd[1664]: time="2026-01-23T00:05:25.723602424Z" level=info msg="connecting to shim e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f" address="unix:///run/containerd/s/605ecc79e625d2238e55cc1e971deb3319f4b6b884d2db21b7eb06b5218c68f5" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:05:25.734005 systemd-networkd[1519]: cali6dd1fb9d4ab: Gained IPv6LL Jan 23 00:05:25.761893 systemd[1]: Started cri-containerd-e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f.scope - libcontainer container e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f. Jan 23 00:05:25.786185 systemd-networkd[1519]: cali2ec59a4f406: Link UP Jan 23 00:05:25.786965 systemd-networkd[1519]: cali2ec59a4f406: Gained carrier Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.508 [INFO][4625] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0 calico-apiserver-6b477b4fc8- calico-apiserver 2511aa5d-56b0-481d-8829-6daaa6eae613 827 0 2026-01-23 00:04:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b477b4fc8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 calico-apiserver-6b477b4fc8-6qcz2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2ec59a4f406 [] [] }} ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.509 [INFO][4625] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.540 [INFO][4661] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" HandleID="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.540 [INFO][4661] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" HandleID="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000353610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-n-22c0b85714", "pod":"calico-apiserver-6b477b4fc8-6qcz2", "timestamp":"2026-01-23 00:05:25.540670499 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.540 [INFO][4661] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.659 [INFO][4661] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.659 [INFO][4661] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.722 [INFO][4661] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.730 [INFO][4661] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.744 [INFO][4661] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.750 [INFO][4661] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.754 [INFO][4661] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.754 [INFO][4661] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.756 [INFO][4661] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29 Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.760 [INFO][4661] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.775 [INFO][4661] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.118.134/26] block=192.168.118.128/26 handle="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.775 [INFO][4661] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.134/26] handle="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.775 [INFO][4661] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 00:05:25.803763 containerd[1664]: 2026-01-23 00:05:25.775 [INFO][4661] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.134/26] IPv6=[] ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" HandleID="k8s-pod-network.ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.804260 containerd[1664]: 2026-01-23 00:05:25.781 [INFO][4625] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0", GenerateName:"calico-apiserver-6b477b4fc8-", Namespace:"calico-apiserver", SelfLink:"", UID:"2511aa5d-56b0-481d-8829-6daaa6eae613", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b477b4fc8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"calico-apiserver-6b477b4fc8-6qcz2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2ec59a4f406", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:25.804260 containerd[1664]: 2026-01-23 00:05:25.781 [INFO][4625] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.134/32] ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.804260 containerd[1664]: 2026-01-23 00:05:25.781 [INFO][4625] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2ec59a4f406 ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.804260 containerd[1664]: 2026-01-23 00:05:25.786 [INFO][4625] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" 
Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.804260 containerd[1664]: 2026-01-23 00:05:25.787 [INFO][4625] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0", GenerateName:"calico-apiserver-6b477b4fc8-", Namespace:"calico-apiserver", SelfLink:"", UID:"2511aa5d-56b0-481d-8829-6daaa6eae613", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b477b4fc8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29", Pod:"calico-apiserver-6b477b4fc8-6qcz2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali2ec59a4f406", MAC:"7a:df:d1:7c:1d:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 00:05:25.804260 containerd[1664]: 2026-01-23 00:05:25.799 [INFO][4625] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-6qcz2" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--6qcz2-eth0" Jan 23 00:05:25.832874 containerd[1664]: time="2026-01-23T00:05:25.832809537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-5xn2v,Uid:43a640f6-edee-4332-9215-0d039b0402e6,Namespace:kube-system,Attempt:0,} returns sandbox id \"e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f\"" Jan 23 00:05:25.841786 containerd[1664]: time="2026-01-23T00:05:25.841702363Z" level=info msg="CreateContainer within sandbox \"e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 00:05:25.863139 systemd-networkd[1519]: calia3b2b9c13ae: Gained IPv6LL Jan 23 00:05:25.873354 containerd[1664]: time="2026-01-23T00:05:25.872948133Z" level=info msg="connecting to shim ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29" address="unix:///run/containerd/s/9ff94b173e8c74813da0e0b03e7b2fd82eab376fadc0365d2f92f5e9e03a3585" namespace=k8s.io protocol=ttrpc version=3 Jan 23 00:05:25.880933 containerd[1664]: time="2026-01-23T00:05:25.880877315Z" level=info msg="Container f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:05:25.892562 containerd[1664]: time="2026-01-23T00:05:25.892521789Z" level=info msg="CreateContainer within sandbox \"e7d7cbe352fc66589a8a41bf9b4797342bb3e0e1b9e847c73f981a76a7d3ce0f\" for &ContainerMetadata{Name:coredns,Attempt:0,} 
returns container id \"f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b\"" Jan 23 00:05:25.893204 containerd[1664]: time="2026-01-23T00:05:25.893180591Z" level=info msg="StartContainer for \"f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b\"" Jan 23 00:05:25.894045 containerd[1664]: time="2026-01-23T00:05:25.894017273Z" level=info msg="connecting to shim f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b" address="unix:///run/containerd/s/605ecc79e625d2238e55cc1e971deb3319f4b6b884d2db21b7eb06b5218c68f5" protocol=ttrpc version=3 Jan 23 00:05:25.898907 systemd[1]: Started cri-containerd-ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29.scope - libcontainer container ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29. Jan 23 00:05:25.907368 systemd[1]: Started cri-containerd-f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b.scope - libcontainer container f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b. 
Jan 23 00:05:25.943132 containerd[1664]: time="2026-01-23T00:05:25.942592533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-6qcz2,Uid:2511aa5d-56b0-481d-8829-6daaa6eae613,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ee0e87a1fb6b2cf71aa86d681e96328fbc20ff006e4e6bbc69e4d13619891f29\"" Jan 23 00:05:25.943132 containerd[1664]: time="2026-01-23T00:05:25.942822053Z" level=info msg="StartContainer for \"f35ca09f1254bd943a53f5ce793195fc61d73286f7973372ebd86869ff5ddd0b\" returns successfully" Jan 23 00:05:26.059899 containerd[1664]: time="2026-01-23T00:05:26.059850749Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:26.065185 containerd[1664]: time="2026-01-23T00:05:26.065137644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 00:05:26.065430 containerd[1664]: time="2026-01-23T00:05:26.065238524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 00:05:26.065687 kubelet[2916]: E0123 00:05:26.065394 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 00:05:26.065687 kubelet[2916]: E0123 00:05:26.065521 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 00:05:26.065795 kubelet[2916]: E0123 00:05:26.065717 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:26.065837 kubelet[2916]: E0123 00:05:26.065795 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:05:26.065871 containerd[1664]: time="2026-01-23T00:05:26.065832846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:05:26.399794 containerd[1664]: time="2026-01-23T00:05:26.399693804Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:26.401102 containerd[1664]: time="2026-01-23T00:05:26.401062168Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:05:26.401177 containerd[1664]: time="2026-01-23T00:05:26.401141088Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: 
active requests=0, bytes read=77" Jan 23 00:05:26.401331 kubelet[2916]: E0123 00:05:26.401284 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:05:26.401598 kubelet[2916]: E0123 00:05:26.401348 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:05:26.401598 kubelet[2916]: E0123 00:05:26.401422 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:26.401598 kubelet[2916]: E0123 00:05:26.401451 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:05:26.558645 kubelet[2916]: E0123 00:05:26.558541 2916 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:05:26.560777 kubelet[2916]: E0123 00:05:26.560740 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:05:26.561790 kubelet[2916]: E0123 00:05:26.561009 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:26.584081 kubelet[2916]: I0123 00:05:26.583961 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-5xn2v" podStartSLOduration=45.583945453 podStartE2EDuration="45.583945453s" podCreationTimestamp="2026-01-23 00:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 00:05:26.570550775 +0000 UTC m=+51.275403143" watchObservedRunningTime="2026-01-23 00:05:26.583945453 +0000 UTC m=+51.288797821" Jan 23 00:05:26.886894 systemd-networkd[1519]: calie34b59476f6: Gained IPv6LL Jan 23 00:05:27.408988 containerd[1664]: time="2026-01-23T00:05:27.408870780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-45hrp,Uid:46de2902-bb4c-4eda-81ac-f00ac179b50d,Namespace:calico-apiserver,Attempt:0,}" Jan 23 00:05:27.463082 systemd-networkd[1519]: cali7c8661fa377: Gained IPv6LL Jan 23 00:05:27.542875 systemd-networkd[1519]: cali9960065e9c4: Link UP Jan 23 00:05:27.543597 systemd-networkd[1519]: cali9960065e9c4: Gained carrier Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.455 [INFO][4880] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0 calico-apiserver-6b477b4fc8- calico-apiserver 46de2902-bb4c-4eda-81ac-f00ac179b50d 830 0 2026-01-23 00:04:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b477b4fc8 projectcalico.org/namespace:calico-apiserver 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 calico-apiserver-6b477b4fc8-45hrp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9960065e9c4 [] [] }} ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.455 [INFO][4880] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.493 [INFO][4894] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" HandleID="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.493 [INFO][4894] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" HandleID="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c6a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-n-22c0b85714", "pod":"calico-apiserver-6b477b4fc8-45hrp", "timestamp":"2026-01-23 00:05:27.493292702 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.493 [INFO][4894] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.493 [INFO][4894] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.493 [INFO][4894] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714' Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.506 [INFO][4894] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.511 [INFO][4894] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.518 [INFO][4894] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.522 [INFO][4894] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.524 [INFO][4894] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.524 [INFO][4894] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.526 [INFO][4894] ipam/ipam.go 1780: 
Creating new handle: k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.531 [INFO][4894] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.538 [INFO][4894] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.118.135/26] block=192.168.118.128/26 handle="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.538 [INFO][4894] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.135/26] handle="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" host="ci-4459-2-2-n-22c0b85714" Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.538 [INFO][4894] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 00:05:27.558403 containerd[1664]: 2026-01-23 00:05:27.538 [INFO][4894] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.135/26] IPv6=[] ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" HandleID="k8s-pod-network.42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0"
Jan 23 00:05:27.559621 containerd[1664]: 2026-01-23 00:05:27.540 [INFO][4880] cni-plugin/k8s.go 418: Populated endpoint ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0", GenerateName:"calico-apiserver-6b477b4fc8-", Namespace:"calico-apiserver", SelfLink:"", UID:"46de2902-bb4c-4eda-81ac-f00ac179b50d", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b477b4fc8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"calico-apiserver-6b477b4fc8-45hrp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9960065e9c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jan 23 00:05:27.559621 containerd[1664]: 2026-01-23 00:05:27.541 [INFO][4880] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.135/32] ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0"
Jan 23 00:05:27.559621 containerd[1664]: 2026-01-23 00:05:27.541 [INFO][4880] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9960065e9c4 ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0"
Jan 23 00:05:27.559621 containerd[1664]: 2026-01-23 00:05:27.543 [INFO][4880] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0"
Jan 23 00:05:27.559621 containerd[1664]: 2026-01-23 00:05:27.544 [INFO][4880] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0", GenerateName:"calico-apiserver-6b477b4fc8-", Namespace:"calico-apiserver", SelfLink:"", UID:"46de2902-bb4c-4eda-81ac-f00ac179b50d", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b477b4fc8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a", Pod:"calico-apiserver-6b477b4fc8-45hrp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.118.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9960065e9c4", MAC:"6e:34:18:79:a9:79", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jan 23 00:05:27.559621 containerd[1664]: 2026-01-23 00:05:27.555 [INFO][4880] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" Namespace="calico-apiserver" Pod="calico-apiserver-6b477b4fc8-45hrp" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--apiserver--6b477b4fc8--45hrp-eth0"
Jan 23 00:05:27.563409 kubelet[2916]: E0123 00:05:27.562198 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613"
Jan 23 00:05:27.563409 kubelet[2916]: E0123 00:05:27.562309 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6"
Jan 23 00:05:27.597385 containerd[1664]: time="2026-01-23T00:05:27.597337641Z" level=info msg="connecting to shim 42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a" address="unix:///run/containerd/s/e4f499832f2c85ffe2a25a157538b84f3b131f2d6bff1fbe6000dcead287e670" namespace=k8s.io protocol=ttrpc version=3
Jan 23 00:05:27.622234 systemd[1]: Started cri-containerd-42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a.scope - libcontainer container 42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a.
Jan 23 00:05:27.654811 systemd-networkd[1519]: cali2ec59a4f406: Gained IPv6LL
Jan 23 00:05:27.659995 containerd[1664]: time="2026-01-23T00:05:27.659879140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b477b4fc8-45hrp,Uid:46de2902-bb4c-4eda-81ac-f00ac179b50d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"42628bdf3688cfd6f8a342331ceac33897fb38da5f7b6a4d53303f3744625a8a\""
Jan 23 00:05:27.662999 containerd[1664]: time="2026-01-23T00:05:27.662965309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Jan 23 00:05:28.012618 containerd[1664]: time="2026-01-23T00:05:28.012344632Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:05:28.014134 containerd[1664]: time="2026-01-23T00:05:28.014035077Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Jan 23 00:05:28.014134 containerd[1664]: time="2026-01-23T00:05:28.014083357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Jan 23 00:05:28.014298 kubelet[2916]: E0123 00:05:28.014252 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 00:05:28.014380 kubelet[2916]: E0123 00:05:28.014303 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 00:05:28.015324 kubelet[2916]: E0123 00:05:28.014381 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:05:28.015324 kubelet[2916]: E0123 00:05:28.014414 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:05:28.566108 kubelet[2916]: E0123 00:05:28.566068 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:05:28.997949 systemd-networkd[1519]: cali9960065e9c4: Gained IPv6LL
Jan 23 00:05:29.410813 containerd[1664]: time="2026-01-23T00:05:29.410064323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-557fb68f57-qftrq,Uid:2c86da6b-b89e-4517-904a-9f7bcd6f830a,Namespace:calico-system,Attempt:0,}"
Jan 23 00:05:29.554571 systemd-networkd[1519]: cali002acbad2fb: Link UP
Jan 23 00:05:29.555201 systemd-networkd[1519]: cali002acbad2fb: Gained carrier
Jan 23 00:05:29.567480 kubelet[2916]: E0123 00:05:29.567439 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.474 [INFO][4972] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0 calico-kube-controllers-557fb68f57- calico-system 2c86da6b-b89e-4517-904a-9f7bcd6f830a 828 0 2026-01-23 00:04:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:557fb68f57 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-n-22c0b85714 calico-kube-controllers-557fb68f57-qftrq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali002acbad2fb [] [] }} ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.474 [INFO][4972] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.502 [INFO][4982] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" HandleID="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.502 [INFO][4982] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" HandleID="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c770), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-n-22c0b85714", "pod":"calico-kube-controllers-557fb68f57-qftrq", "timestamp":"2026-01-23 00:05:29.502616069 +0000 UTC"}, Hostname:"ci-4459-2-2-n-22c0b85714", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.502 [INFO][4982] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock.
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.502 [INFO][4982] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock.
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.502 [INFO][4982] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-n-22c0b85714'
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.514 [INFO][4982] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.526 [INFO][4982] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.531 [INFO][4982] ipam/ipam.go 511: Trying affinity for 192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.533 [INFO][4982] ipam/ipam.go 158: Attempting to load block cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.535 [INFO][4982] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.118.128/26 host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.535 [INFO][4982] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.118.128/26 handle="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.537 [INFO][4982] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.540 [INFO][4982] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.118.128/26 handle="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.548 [INFO][4982] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.118.136/26] block=192.168.118.128/26 handle="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.548 [INFO][4982] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.118.136/26] handle="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" host="ci-4459-2-2-n-22c0b85714"
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.548 [INFO][4982] ipam/ipam_plugin.go 398: Released host-wide IPAM lock.
Jan 23 00:05:29.567928 containerd[1664]: 2026-01-23 00:05:29.548 [INFO][4982] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.118.136/26] IPv6=[] ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" HandleID="k8s-pod-network.5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Workload="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.568423 containerd[1664]: 2026-01-23 00:05:29.551 [INFO][4972] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0", GenerateName:"calico-kube-controllers-557fb68f57-", Namespace:"calico-system", SelfLink:"", UID:"2c86da6b-b89e-4517-904a-9f7bcd6f830a", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"557fb68f57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"", Pod:"calico-kube-controllers-557fb68f57-qftrq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.118.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali002acbad2fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jan 23 00:05:29.568423 containerd[1664]: 2026-01-23 00:05:29.551 [INFO][4972] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.118.136/32] ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.568423 containerd[1664]: 2026-01-23 00:05:29.551 [INFO][4972] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali002acbad2fb ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.568423 containerd[1664]: 2026-01-23 00:05:29.555 [INFO][4972] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.568423 containerd[1664]: 2026-01-23 00:05:29.556 [INFO][4972] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0", GenerateName:"calico-kube-controllers-557fb68f57-", Namespace:"calico-system", SelfLink:"", UID:"2c86da6b-b89e-4517-904a-9f7bcd6f830a", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 0, 4, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"557fb68f57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-n-22c0b85714", ContainerID:"5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55", Pod:"calico-kube-controllers-557fb68f57-qftrq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.118.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali002acbad2fb", MAC:"36:b7:8d:d5:fd:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jan 23 00:05:29.568423 containerd[1664]: 2026-01-23 00:05:29.564 [INFO][4972] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" Namespace="calico-system" Pod="calico-kube-controllers-557fb68f57-qftrq" WorkloadEndpoint="ci--4459--2--2--n--22c0b85714-k8s-calico--kube--controllers--557fb68f57--qftrq-eth0"
Jan 23 00:05:29.599661 containerd[1664]: time="2026-01-23T00:05:29.599610067Z" level=info msg="connecting to shim 5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55" address="unix:///run/containerd/s/73d139f8340463d1fe4c44e1ede787564ed971941b0c19ac35cf1a3d56b51f62" namespace=k8s.io protocol=ttrpc version=3
Jan 23 00:05:29.623956 systemd[1]: Started cri-containerd-5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55.scope - libcontainer container 5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55.
Jan 23 00:05:29.666781 containerd[1664]: time="2026-01-23T00:05:29.666670859Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-557fb68f57-qftrq,Uid:2c86da6b-b89e-4517-904a-9f7bcd6f830a,Namespace:calico-system,Attempt:0,} returns sandbox id \"5079f886202aa76529f32596fb7929b980905c9f4754b4b8e6e11d3da2661d55\""
Jan 23 00:05:29.668690 containerd[1664]: time="2026-01-23T00:05:29.668607065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Jan 23 00:05:29.995530 containerd[1664]: time="2026-01-23T00:05:29.995488643Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:05:29.998500 containerd[1664]: time="2026-01-23T00:05:29.998429211Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Jan 23 00:05:29.998629 containerd[1664]: time="2026-01-23T00:05:29.998572452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Jan 23 00:05:29.999384 kubelet[2916]: E0123 00:05:29.999346 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Jan 23 00:05:29.999449 kubelet[2916]: E0123 00:05:29.999397 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Jan 23 00:05:29.999495 kubelet[2916]: E0123 00:05:29.999476 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:05:29.999550 kubelet[2916]: E0123 00:05:29.999512 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a"
Jan 23 00:05:30.573261 kubelet[2916]: E0123 00:05:30.573168 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a"
Jan 23 00:05:31.495340 systemd-networkd[1519]: cali002acbad2fb: Gained IPv6LL
Jan 23 00:05:31.573192 kubelet[2916]: E0123 00:05:31.573152 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a"
Jan 23 00:05:35.404419 containerd[1664]: time="2026-01-23T00:05:35.404307604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Jan 23 00:05:35.747742 containerd[1664]: time="2026-01-23T00:05:35.747411109Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:05:35.749154 containerd[1664]: time="2026-01-23T00:05:35.749098074Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Jan 23 00:05:35.749233 containerd[1664]: time="2026-01-23T00:05:35.749185834Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Jan 23 00:05:35.749374 kubelet[2916]: E0123 00:05:35.749329 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 00:05:35.749628 kubelet[2916]: E0123 00:05:35.749380 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 00:05:35.749628 kubelet[2916]: E0123 00:05:35.749455 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:05:35.751742 containerd[1664]: time="2026-01-23T00:05:35.750912959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Jan 23 00:05:36.079222 containerd[1664]: time="2026-01-23T00:05:36.079113141Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:05:36.086448 containerd[1664]: time="2026-01-23T00:05:36.085974961Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Jan 23 00:05:36.086448 containerd[1664]: time="2026-01-23T00:05:36.086006081Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Jan 23 00:05:36.086575 kubelet[2916]: E0123 00:05:36.086399 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 00:05:36.086575 kubelet[2916]: E0123 00:05:36.086448 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 00:05:36.086575 kubelet[2916]: E0123 00:05:36.086519 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:05:36.086682 kubelet[2916]: E0123 00:05:36.086556 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217"
Jan 23 00:05:40.401704 containerd[1664]: time="2026-01-23T00:05:40.401652745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Jan 23 00:05:40.740433 containerd[1664]: time="2026-01-23T00:05:40.740382597Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:05:40.741776 containerd[1664]: time="2026-01-23T00:05:40.741728481Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Jan 23 00:05:40.741856 containerd[1664]: time="2026-01-23T00:05:40.741816521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Jan 23 00:05:40.742000 kubelet[2916]: E0123 00:05:40.741958 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 23 00:05:40.742955 kubelet[2916]: E0123 00:05:40.742020 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 23 00:05:40.742955 kubelet[2916]: E0123 00:05:40.742207 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in
pod goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:40.742955 kubelet[2916]: E0123 00:05:40.742250 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:05:40.743028 containerd[1664]: time="2026-01-23T00:05:40.742319082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:05:41.081414 containerd[1664]: time="2026-01-23T00:05:41.081186695Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:41.082747 containerd[1664]: time="2026-01-23T00:05:41.082644259Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:05:41.082946 containerd[1664]: time="2026-01-23T00:05:41.082858860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:05:41.083280 kubelet[2916]: E0123 00:05:41.083058 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:05:41.083280 kubelet[2916]: E0123 00:05:41.083133 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:05:41.083280 kubelet[2916]: E0123 00:05:41.083210 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:41.083280 kubelet[2916]: E0123 00:05:41.083244 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:05:41.401652 containerd[1664]: time="2026-01-23T00:05:41.401354374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 00:05:41.728191 containerd[1664]: time="2026-01-23T00:05:41.727910991Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Jan 23 00:05:41.729478 containerd[1664]: time="2026-01-23T00:05:41.729375595Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 00:05:41.729478 containerd[1664]: time="2026-01-23T00:05:41.729456955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 00:05:41.729709 kubelet[2916]: E0123 00:05:41.729635 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:05:41.729709 kubelet[2916]: E0123 00:05:41.729701 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:05:41.730117 kubelet[2916]: E0123 00:05:41.730095 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:41.731513 containerd[1664]: time="2026-01-23T00:05:41.731458761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" 
Jan 23 00:05:42.081867 containerd[1664]: time="2026-01-23T00:05:42.081645286Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:42.083337 containerd[1664]: time="2026-01-23T00:05:42.083257490Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 00:05:42.083337 containerd[1664]: time="2026-01-23T00:05:42.083301731Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 00:05:42.083539 kubelet[2916]: E0123 00:05:42.083493 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:05:42.083926 kubelet[2916]: E0123 00:05:42.083550 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:05:42.083926 kubelet[2916]: E0123 00:05:42.083634 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:42.083926 kubelet[2916]: E0123 00:05:42.083673 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:05:42.401047 containerd[1664]: time="2026-01-23T00:05:42.400689681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 00:05:42.739483 containerd[1664]: time="2026-01-23T00:05:42.739302533Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:42.740590 containerd[1664]: time="2026-01-23T00:05:42.740482016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 00:05:42.740590 containerd[1664]: time="2026-01-23T00:05:42.740544497Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 00:05:42.740780 kubelet[2916]: E0123 00:05:42.740735 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:05:42.740842 kubelet[2916]: E0123 00:05:42.740790 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:05:42.740914 kubelet[2916]: E0123 00:05:42.740867 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:42.740914 kubelet[2916]: E0123 00:05:42.740905 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: 
not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:05:44.400930 containerd[1664]: time="2026-01-23T00:05:44.400808341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:05:44.731058 containerd[1664]: time="2026-01-23T00:05:44.730784888Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:05:44.732186 containerd[1664]: time="2026-01-23T00:05:44.732129532Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:05:44.732288 containerd[1664]: time="2026-01-23T00:05:44.732213892Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:05:44.732383 kubelet[2916]: E0123 00:05:44.732348 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:05:44.732815 kubelet[2916]: E0123 00:05:44.732396 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:05:44.732815 kubelet[2916]: E0123 00:05:44.732468 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:05:44.732815 kubelet[2916]: E0123 00:05:44.732498 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:05:50.403472 kubelet[2916]: E0123 00:05:50.403418 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:05:52.402258 
kubelet[2916]: E0123 00:05:52.402208 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:05:54.403434 kubelet[2916]: E0123 00:05:54.402981 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:05:55.402186 kubelet[2916]: E0123 00:05:55.402089 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:05:55.402600 kubelet[2916]: E0123 00:05:55.402507 2916 pod_workers.go:1324] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:05:56.404407 kubelet[2916]: E0123 00:05:56.404355 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:06:01.403704 containerd[1664]: time="2026-01-23T00:06:01.402771330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 00:06:01.974922 containerd[1664]: time="2026-01-23T00:06:01.974873892Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:01.976275 containerd[1664]: time="2026-01-23T00:06:01.976198655Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 00:06:01.976275 containerd[1664]: time="2026-01-23T00:06:01.976238775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 00:06:01.976490 kubelet[2916]: E0123 00:06:01.976442 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 00:06:01.977023 kubelet[2916]: E0123 00:06:01.976498 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 00:06:01.977023 kubelet[2916]: E0123 00:06:01.976572 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:01.977735 containerd[1664]: time="2026-01-23T00:06:01.977694300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 00:06:02.309491 
containerd[1664]: time="2026-01-23T00:06:02.308869290Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:02.311468 containerd[1664]: time="2026-01-23T00:06:02.311404937Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 00:06:02.311563 containerd[1664]: time="2026-01-23T00:06:02.311498098Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 00:06:02.311762 kubelet[2916]: E0123 00:06:02.311708 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 00:06:02.311872 kubelet[2916]: E0123 00:06:02.311838 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 00:06:02.312031 kubelet[2916]: E0123 00:06:02.312014 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to 
resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:02.312484 kubelet[2916]: E0123 00:06:02.312454 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:06:07.403941 containerd[1664]: time="2026-01-23T00:06:07.403878471Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:06:07.748575 containerd[1664]: time="2026-01-23T00:06:07.748510500Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:07.749985 containerd[1664]: time="2026-01-23T00:06:07.749920904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:06:07.750070 containerd[1664]: time="2026-01-23T00:06:07.750012144Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:06:07.750229 kubelet[2916]: E0123 00:06:07.750190 2916 log.go:32] "PullImage 
from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:06:07.750951 kubelet[2916]: E0123 00:06:07.750239 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:06:07.750951 kubelet[2916]: E0123 00:06:07.750476 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:07.750951 kubelet[2916]: E0123 00:06:07.750522 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:06:07.751056 containerd[1664]: time="2026-01-23T00:06:07.750523345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 00:06:08.087176 containerd[1664]: 
time="2026-01-23T00:06:08.087002311Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:08.088953 containerd[1664]: time="2026-01-23T00:06:08.088846436Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 00:06:08.088953 containerd[1664]: time="2026-01-23T00:06:08.088934957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 00:06:08.089285 kubelet[2916]: E0123 00:06:08.089241 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 00:06:08.089354 kubelet[2916]: E0123 00:06:08.089291 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 00:06:08.089384 kubelet[2916]: E0123 00:06:08.089369 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" 
logger="UnhandledError" Jan 23 00:06:08.089503 kubelet[2916]: E0123 00:06:08.089399 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:06:09.401761 containerd[1664]: time="2026-01-23T00:06:09.401689764Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 00:06:09.751581 containerd[1664]: time="2026-01-23T00:06:09.751519288Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:09.752907 containerd[1664]: time="2026-01-23T00:06:09.752859131Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 00:06:09.752979 containerd[1664]: time="2026-01-23T00:06:09.752893692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 00:06:09.753158 kubelet[2916]: E0123 00:06:09.753123 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:06:09.753749 kubelet[2916]: E0123 00:06:09.753442 2916 
kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:06:09.754004 kubelet[2916]: E0123 00:06:09.753706 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:09.754004 kubelet[2916]: E0123 00:06:09.753826 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:06:09.754637 containerd[1664]: time="2026-01-23T00:06:09.754608057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:06:10.163436 containerd[1664]: time="2026-01-23T00:06:10.163225549Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:10.164878 containerd[1664]: time="2026-01-23T00:06:10.164753794Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:06:10.164878 containerd[1664]: time="2026-01-23T00:06:10.164856314Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:06:10.165063 kubelet[2916]: E0123 00:06:10.165027 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:06:10.165140 kubelet[2916]: E0123 00:06:10.165073 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:06:10.165170 kubelet[2916]: E0123 00:06:10.165151 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:10.165170 kubelet[2916]: E0123 00:06:10.165184 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:06:11.400751 containerd[1664]: time="2026-01-23T00:06:11.400695580Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 00:06:11.735166 containerd[1664]: time="2026-01-23T00:06:11.735109540Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:11.736435 containerd[1664]: time="2026-01-23T00:06:11.736378664Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 00:06:11.736435 containerd[1664]: time="2026-01-23T00:06:11.736409944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 00:06:11.736673 kubelet[2916]: E0123 00:06:11.736595 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:06:11.736673 kubelet[2916]: E0123 00:06:11.736665 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:06:11.736996 kubelet[2916]: E0123 00:06:11.736756 2916 
kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:11.738113 containerd[1664]: time="2026-01-23T00:06:11.737904188Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 00:06:12.076796 containerd[1664]: time="2026-01-23T00:06:12.076530720Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:12.078056 containerd[1664]: time="2026-01-23T00:06:12.078003644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 00:06:12.078185 containerd[1664]: time="2026-01-23T00:06:12.078082204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 00:06:12.078369 kubelet[2916]: E0123 00:06:12.078309 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:06:12.078426 kubelet[2916]: E0123 00:06:12.078366 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:06:12.078452 kubelet[2916]: E0123 00:06:12.078433 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:12.078506 kubelet[2916]: E0123 00:06:12.078474 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:06:16.402105 kubelet[2916]: E0123 00:06:16.402051 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:06:19.400993 kubelet[2916]: E0123 00:06:19.400910 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:06:21.401403 kubelet[2916]: E0123 00:06:21.401317 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:06:22.401108 kubelet[2916]: E0123 00:06:22.401063 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:06:23.401413 kubelet[2916]: E0123 00:06:23.401351 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:06:23.403606 kubelet[2916]: E0123 00:06:23.403547 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:06:31.401760 kubelet[2916]: E0123 00:06:31.401672 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:06:34.401143 kubelet[2916]: E0123 00:06:34.401102 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:06:34.401521 kubelet[2916]: E0123 00:06:34.401206 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:06:35.401887 kubelet[2916]: E0123 00:06:35.401822 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:06:37.401894 kubelet[2916]: E0123 00:06:37.401629 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:06:37.403258 kubelet[2916]: E0123 00:06:37.402984 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:06:43.403901 containerd[1664]: time="2026-01-23T00:06:43.403853297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 00:06:43.737644 containerd[1664]: time="2026-01-23T00:06:43.737606415Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:43.739169 containerd[1664]: time="2026-01-23T00:06:43.739133499Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 00:06:43.739223 containerd[1664]: time="2026-01-23T00:06:43.739213660Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Jan 23 00:06:43.739445 kubelet[2916]: E0123 00:06:43.739397 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 00:06:43.740089 kubelet[2916]: E0123 00:06:43.739800 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 00:06:43.740089 kubelet[2916]: E0123 00:06:43.739911 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:43.741622 containerd[1664]: time="2026-01-23T00:06:43.741559626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 00:06:44.097179 containerd[1664]: time="2026-01-23T00:06:44.096938086Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:44.098325 containerd[1664]: time="2026-01-23T00:06:44.098232810Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 00:06:44.098325 containerd[1664]: time="2026-01-23T00:06:44.098287650Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Jan 23 00:06:44.098577 kubelet[2916]: E0123 00:06:44.098539 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 00:06:44.098633 kubelet[2916]: E0123 00:06:44.098588 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 00:06:44.098669 kubelet[2916]: E0123 00:06:44.098656 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:44.098724 kubelet[2916]: E0123 00:06:44.098693 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:06:46.400966 kubelet[2916]: E0123 00:06:46.400907 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:06:46.402034 kubelet[2916]: E0123 00:06:46.400972 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:06:47.403069 kubelet[2916]: E0123 00:06:47.403007 2916 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:06:50.400888 containerd[1664]: time="2026-01-23T00:06:50.400838096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 00:06:50.743474 containerd[1664]: time="2026-01-23T00:06:50.743363999Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:50.747354 containerd[1664]: time="2026-01-23T00:06:50.747208490Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 00:06:50.747421 containerd[1664]: time="2026-01-23T00:06:50.747313490Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Jan 23 00:06:50.747515 kubelet[2916]: E0123 00:06:50.747477 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 00:06:50.747972 kubelet[2916]: E0123 00:06:50.747522 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 00:06:50.747972 kubelet[2916]: E0123 00:06:50.747596 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:50.747972 kubelet[2916]: E0123 00:06:50.747625 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:06:51.401357 kubelet[2916]: E0123 00:06:51.401291 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:06:56.636036 systemd[1]: Started sshd@11-10.0.0.231:22-20.161.92.111:35920.service - OpenSSH per-connection server daemon (20.161.92.111:35920). Jan 23 00:06:57.258741 sshd[5178]: Accepted publickey for core from 20.161.92.111 port 35920 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:06:57.261443 sshd-session[5178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:06:57.269806 systemd-logind[1635]: New session 12 of user core. Jan 23 00:06:57.273941 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 23 00:06:57.761218 sshd[5188]: Connection closed by 20.161.92.111 port 35920 Jan 23 00:06:57.761881 sshd-session[5178]: pam_unix(sshd:session): session closed for user core Jan 23 00:06:57.766807 systemd[1]: sshd@11-10.0.0.231:22-20.161.92.111:35920.service: Deactivated successfully. Jan 23 00:06:57.769574 systemd[1]: session-12.scope: Deactivated successfully. Jan 23 00:06:57.770528 systemd-logind[1635]: Session 12 logged out. Waiting for processes to exit. Jan 23 00:06:57.772067 systemd-logind[1635]: Removed session 12. 
Jan 23 00:06:59.402221 containerd[1664]: time="2026-01-23T00:06:59.402164606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:06:59.403877 kubelet[2916]: E0123 00:06:59.402342 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:06:59.745508 containerd[1664]: time="2026-01-23T00:06:59.745450991Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:06:59.746620 containerd[1664]: time="2026-01-23T00:06:59.746579394Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:06:59.746691 containerd[1664]: time="2026-01-23T00:06:59.746637275Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:06:59.746873 kubelet[2916]: E0123 
00:06:59.746831 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:06:59.746922 kubelet[2916]: E0123 00:06:59.746883 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:06:59.746989 kubelet[2916]: E0123 00:06:59.746950 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:06:59.747050 kubelet[2916]: E0123 00:06:59.747024 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:07:01.401265 containerd[1664]: time="2026-01-23T00:07:01.401211342Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 00:07:01.730766 containerd[1664]: time="2026-01-23T00:07:01.730635768Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:07:01.731918 containerd[1664]: time="2026-01-23T00:07:01.731863731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 00:07:01.731959 containerd[1664]: time="2026-01-23T00:07:01.731930451Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 00:07:01.732119 kubelet[2916]: E0123 00:07:01.732077 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:07:01.732821 kubelet[2916]: E0123 00:07:01.732130 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:07:01.732821 kubelet[2916]: E0123 00:07:01.732308 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 00:07:01.732821 kubelet[2916]: E0123 00:07:01.732347 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:07:01.733179 containerd[1664]: time="2026-01-23T00:07:01.732638934Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:07:02.070925 containerd[1664]: time="2026-01-23T00:07:02.070798024Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:07:02.077110 containerd[1664]: time="2026-01-23T00:07:02.076808321Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:07:02.077110 containerd[1664]: time="2026-01-23T00:07:02.076908521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:07:02.077742 kubelet[2916]: E0123 00:07:02.077694 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:07:02.077836 kubelet[2916]: E0123 00:07:02.077749 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:07:02.077865 kubelet[2916]: E0123 00:07:02.077843 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:07:02.077930 kubelet[2916]: E0123 00:07:02.077879 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:07:02.401642 containerd[1664]: time="2026-01-23T00:07:02.401526973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 00:07:02.724665 containerd[1664]: time="2026-01-23T00:07:02.724590900Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:07:02.725884 containerd[1664]: 
time="2026-01-23T00:07:02.725828024Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 00:07:02.725979 containerd[1664]: time="2026-01-23T00:07:02.725922784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 00:07:02.726256 kubelet[2916]: E0123 00:07:02.726208 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:07:02.726326 kubelet[2916]: E0123 00:07:02.726272 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:07:02.726383 kubelet[2916]: E0123 00:07:02.726349 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 00:07:02.728136 containerd[1664]: time="2026-01-23T00:07:02.728075950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 00:07:02.869067 systemd[1]: Started 
sshd@12-10.0.0.231:22-20.161.92.111:58410.service - OpenSSH per-connection server daemon (20.161.92.111:58410). Jan 23 00:07:03.060778 containerd[1664]: time="2026-01-23T00:07:03.060425704Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:07:03.062274 containerd[1664]: time="2026-01-23T00:07:03.062219989Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 00:07:03.062475 containerd[1664]: time="2026-01-23T00:07:03.062236349Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 00:07:03.062645 kubelet[2916]: E0123 00:07:03.062577 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:07:03.062645 kubelet[2916]: E0123 00:07:03.062635 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:07:03.063038 kubelet[2916]: E0123 00:07:03.062701 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod 
csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 00:07:03.063038 kubelet[2916]: E0123 00:07:03.062762 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:07:03.490849 sshd[5218]: Accepted publickey for core from 20.161.92.111 port 58410 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:03.492262 sshd-session[5218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:03.496845 systemd-logind[1635]: New session 13 of user core. Jan 23 00:07:03.512926 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 23 00:07:03.983142 sshd[5221]: Connection closed by 20.161.92.111 port 58410 Jan 23 00:07:03.983683 sshd-session[5218]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:03.989433 systemd[1]: sshd@12-10.0.0.231:22-20.161.92.111:58410.service: Deactivated successfully. 
Jan 23 00:07:03.992479 systemd[1]: session-13.scope: Deactivated successfully. Jan 23 00:07:03.993673 systemd-logind[1635]: Session 13 logged out. Waiting for processes to exit. Jan 23 00:07:03.995620 systemd-logind[1635]: Removed session 13. Jan 23 00:07:04.401925 kubelet[2916]: E0123 00:07:04.401442 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:07:09.106110 systemd[1]: Started sshd@13-10.0.0.231:22-20.161.92.111:58414.service - OpenSSH per-connection server daemon (20.161.92.111:58414). Jan 23 00:07:09.742209 sshd[5236]: Accepted publickey for core from 20.161.92.111 port 58414 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:09.743570 sshd-session[5236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:09.747796 systemd-logind[1635]: New session 14 of user core. Jan 23 00:07:09.755880 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 23 00:07:10.248329 sshd[5239]: Connection closed by 20.161.92.111 port 58414 Jan 23 00:07:10.248920 sshd-session[5236]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:10.254132 systemd-logind[1635]: Session 14 logged out. Waiting for processes to exit. Jan 23 00:07:10.254347 systemd[1]: sshd@13-10.0.0.231:22-20.161.92.111:58414.service: Deactivated successfully. Jan 23 00:07:10.258247 systemd[1]: session-14.scope: Deactivated successfully. Jan 23 00:07:10.260055 systemd-logind[1635]: Removed session 14. 
Jan 23 00:07:10.357545 systemd[1]: Started sshd@14-10.0.0.231:22-20.161.92.111:58422.service - OpenSSH per-connection server daemon (20.161.92.111:58422). Jan 23 00:07:10.401891 kubelet[2916]: E0123 00:07:10.401839 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:07:10.966998 sshd[5254]: Accepted publickey for core from 20.161.92.111 port 58422 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:10.968461 sshd-session[5254]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:10.972522 systemd-logind[1635]: New session 15 of user core. Jan 23 00:07:10.983033 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 23 00:07:11.491982 sshd[5258]: Connection closed by 20.161.92.111 port 58422 Jan 23 00:07:11.492288 sshd-session[5254]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:11.496368 systemd[1]: sshd@14-10.0.0.231:22-20.161.92.111:58422.service: Deactivated successfully. 
Jan 23 00:07:11.498217 systemd[1]: session-15.scope: Deactivated successfully. Jan 23 00:07:11.500819 systemd-logind[1635]: Session 15 logged out. Waiting for processes to exit. Jan 23 00:07:11.502329 systemd-logind[1635]: Removed session 15. Jan 23 00:07:11.609351 systemd[1]: Started sshd@15-10.0.0.231:22-20.161.92.111:58426.service - OpenSSH per-connection server daemon (20.161.92.111:58426). Jan 23 00:07:12.244056 sshd[5270]: Accepted publickey for core from 20.161.92.111 port 58426 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:12.245577 sshd-session[5270]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:12.251194 systemd-logind[1635]: New session 16 of user core. Jan 23 00:07:12.254914 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 23 00:07:12.400644 kubelet[2916]: E0123 00:07:12.400594 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:07:12.742943 sshd[5274]: Connection closed by 20.161.92.111 port 58426 Jan 23 00:07:12.746290 systemd[1]: sshd@15-10.0.0.231:22-20.161.92.111:58426.service: Deactivated successfully. Jan 23 00:07:12.743450 sshd-session[5270]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:12.748498 systemd[1]: session-16.scope: Deactivated successfully. Jan 23 00:07:12.749698 systemd-logind[1635]: Session 16 logged out. Waiting for processes to exit. 
Jan 23 00:07:12.750966 systemd-logind[1635]: Removed session 16. Jan 23 00:07:13.401729 kubelet[2916]: E0123 00:07:13.401668 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:07:14.400650 kubelet[2916]: E0123 00:07:14.400591 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:07:15.402577 kubelet[2916]: E0123 00:07:15.402519 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:07:17.857307 systemd[1]: Started sshd@16-10.0.0.231:22-20.161.92.111:43146.service - OpenSSH per-connection server daemon (20.161.92.111:43146). Jan 23 00:07:18.400843 kubelet[2916]: E0123 00:07:18.400563 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:07:18.499958 sshd[5293]: Accepted publickey for core from 20.161.92.111 port 43146 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:18.502835 sshd-session[5293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:18.507004 systemd-logind[1635]: New session 17 of user core. Jan 23 00:07:18.515937 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 23 00:07:18.999362 sshd[5296]: Connection closed by 20.161.92.111 port 43146 Jan 23 00:07:18.999897 sshd-session[5293]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:19.004435 systemd-logind[1635]: Session 17 logged out. Waiting for processes to exit. 
Jan 23 00:07:19.004710 systemd[1]: sshd@16-10.0.0.231:22-20.161.92.111:43146.service: Deactivated successfully. Jan 23 00:07:19.006363 systemd[1]: session-17.scope: Deactivated successfully. Jan 23 00:07:19.007990 systemd-logind[1635]: Removed session 17. Jan 23 00:07:23.401920 kubelet[2916]: E0123 00:07:23.401874 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:07:24.107280 systemd[1]: Started sshd@17-10.0.0.231:22-20.161.92.111:46880.service - OpenSSH per-connection server daemon (20.161.92.111:46880). Jan 23 00:07:24.724314 sshd[5309]: Accepted publickey for core from 20.161.92.111 port 46880 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:24.725768 sshd-session[5309]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:24.730069 systemd-logind[1635]: New session 18 of user core. Jan 23 00:07:24.740998 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 23 00:07:25.242698 sshd[5337]: Connection closed by 20.161.92.111 port 46880 Jan 23 00:07:25.243301 sshd-session[5309]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:25.247555 systemd-logind[1635]: Session 18 logged out. Waiting for processes to exit. Jan 23 00:07:25.247943 systemd[1]: sshd@17-10.0.0.231:22-20.161.92.111:46880.service: Deactivated successfully. Jan 23 00:07:25.250508 systemd[1]: session-18.scope: Deactivated successfully. Jan 23 00:07:25.252254 systemd-logind[1635]: Removed session 18. Jan 23 00:07:25.401779 kubelet[2916]: E0123 00:07:25.401627 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:07:27.401556 kubelet[2916]: E0123 00:07:27.401501 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:07:29.401811 kubelet[2916]: E0123 00:07:29.401388 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:07:30.353174 systemd[1]: Started sshd@18-10.0.0.231:22-20.161.92.111:46888.service - OpenSSH per-connection server daemon (20.161.92.111:46888). Jan 23 00:07:30.403122 kubelet[2916]: E0123 00:07:30.403068 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:07:30.987819 sshd[5354]: Accepted publickey for core from 20.161.92.111 port 46888 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:30.989243 sshd-session[5354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 
00:07:30.995585 systemd-logind[1635]: New session 19 of user core. Jan 23 00:07:31.005946 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 23 00:07:31.497860 sshd[5358]: Connection closed by 20.161.92.111 port 46888 Jan 23 00:07:31.498271 sshd-session[5354]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:31.502024 systemd[1]: sshd@18-10.0.0.231:22-20.161.92.111:46888.service: Deactivated successfully. Jan 23 00:07:31.503792 systemd[1]: session-19.scope: Deactivated successfully. Jan 23 00:07:31.504641 systemd-logind[1635]: Session 19 logged out. Waiting for processes to exit. Jan 23 00:07:31.505811 systemd-logind[1635]: Removed session 19. Jan 23 00:07:31.613906 systemd[1]: Started sshd@19-10.0.0.231:22-20.161.92.111:46896.service - OpenSSH per-connection server daemon (20.161.92.111:46896). Jan 23 00:07:32.261223 sshd[5371]: Accepted publickey for core from 20.161.92.111 port 46896 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:32.262287 sshd-session[5371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:32.266716 systemd-logind[1635]: New session 20 of user core. Jan 23 00:07:32.280009 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 23 00:07:32.812874 sshd[5374]: Connection closed by 20.161.92.111 port 46896 Jan 23 00:07:32.813250 sshd-session[5371]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:32.816864 systemd-logind[1635]: Session 20 logged out. Waiting for processes to exit. Jan 23 00:07:32.817044 systemd[1]: sshd@19-10.0.0.231:22-20.161.92.111:46896.service: Deactivated successfully. Jan 23 00:07:32.820206 systemd[1]: session-20.scope: Deactivated successfully. Jan 23 00:07:32.822657 systemd-logind[1635]: Removed session 20. Jan 23 00:07:32.920355 systemd[1]: Started sshd@20-10.0.0.231:22-20.161.92.111:47188.service - OpenSSH per-connection server daemon (20.161.92.111:47188). 
Jan 23 00:07:33.401461 kubelet[2916]: E0123 00:07:33.401404 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:07:33.538061 sshd[5386]: Accepted publickey for core from 20.161.92.111 port 47188 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI Jan 23 00:07:33.539432 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 00:07:33.543280 systemd-logind[1635]: New session 21 of user core. Jan 23 00:07:33.559953 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 23 00:07:34.575820 sshd[5389]: Connection closed by 20.161.92.111 port 47188 Jan 23 00:07:34.577182 sshd-session[5386]: pam_unix(sshd:session): session closed for user core Jan 23 00:07:34.583018 systemd[1]: sshd@20-10.0.0.231:22-20.161.92.111:47188.service: Deactivated successfully. Jan 23 00:07:34.586351 systemd[1]: session-21.scope: Deactivated successfully. Jan 23 00:07:34.587806 systemd-logind[1635]: Session 21 logged out. Waiting for processes to exit. Jan 23 00:07:34.589602 systemd-logind[1635]: Removed session 21. Jan 23 00:07:34.689246 systemd[1]: Started sshd@21-10.0.0.231:22-20.161.92.111:47196.service - OpenSSH per-connection server daemon (20.161.92.111:47196). 
Jan 23 00:07:35.333456 sshd[5407]: Accepted publickey for core from 20.161.92.111 port 47196 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI
Jan 23 00:07:35.334964 sshd-session[5407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 00:07:35.338864 systemd-logind[1635]: New session 22 of user core.
Jan 23 00:07:35.344947 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 23 00:07:35.969092 sshd[5410]: Connection closed by 20.161.92.111 port 47196
Jan 23 00:07:35.969527 sshd-session[5407]: pam_unix(sshd:session): session closed for user core
Jan 23 00:07:35.973620 systemd[1]: sshd@21-10.0.0.231:22-20.161.92.111:47196.service: Deactivated successfully.
Jan 23 00:07:35.975524 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 00:07:35.976354 systemd-logind[1635]: Session 22 logged out. Waiting for processes to exit.
Jan 23 00:07:35.977927 systemd-logind[1635]: Removed session 22.
Jan 23 00:07:36.074819 systemd[1]: Started sshd@22-10.0.0.231:22-20.161.92.111:47202.service - OpenSSH per-connection server daemon (20.161.92.111:47202).
Jan 23 00:07:36.401294 kubelet[2916]: E0123 00:07:36.401174 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217"
Jan 23 00:07:36.704243 sshd[5425]: Accepted publickey for core from 20.161.92.111 port 47202 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI
Jan 23 00:07:36.705480 sshd-session[5425]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 00:07:36.711111 systemd-logind[1635]: New session 23 of user core.
Jan 23 00:07:36.719954 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 23 00:07:37.189557 sshd[5428]: Connection closed by 20.161.92.111 port 47202
Jan 23 00:07:37.189449 sshd-session[5425]: pam_unix(sshd:session): session closed for user core
Jan 23 00:07:37.193404 systemd-logind[1635]: Session 23 logged out. Waiting for processes to exit.
Jan 23 00:07:37.193597 systemd[1]: sshd@22-10.0.0.231:22-20.161.92.111:47202.service: Deactivated successfully.
Jan 23 00:07:37.195366 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 00:07:37.196590 systemd-logind[1635]: Removed session 23.
Jan 23 00:07:38.401389 kubelet[2916]: E0123 00:07:38.401328 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:07:39.401421 kubelet[2916]: E0123 00:07:39.401239 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a"
Jan 23 00:07:42.301352 systemd[1]: Started sshd@23-10.0.0.231:22-20.161.92.111:47210.service - OpenSSH per-connection server daemon (20.161.92.111:47210).
Jan 23 00:07:42.400952 kubelet[2916]: E0123 00:07:42.400874 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613"
Jan 23 00:07:42.924980 sshd[5445]: Accepted publickey for core from 20.161.92.111 port 47210 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI
Jan 23 00:07:42.926266 sshd-session[5445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 00:07:42.930686 systemd-logind[1635]: New session 24 of user core.
Jan 23 00:07:42.941959 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 23 00:07:43.413334 sshd[5448]: Connection closed by 20.161.92.111 port 47210
Jan 23 00:07:43.414302 sshd-session[5445]: pam_unix(sshd:session): session closed for user core
Jan 23 00:07:43.419475 systemd[1]: sshd@23-10.0.0.231:22-20.161.92.111:47210.service: Deactivated successfully.
Jan 23 00:07:43.421760 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 00:07:43.422564 systemd-logind[1635]: Session 24 logged out. Waiting for processes to exit.
Jan 23 00:07:43.423739 systemd-logind[1635]: Removed session 24.
Jan 23 00:07:44.401281 kubelet[2916]: E0123 00:07:44.401222 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6"
Jan 23 00:07:45.403327 kubelet[2916]: E0123 00:07:45.403261 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be"
Jan 23 00:07:47.402442 kubelet[2916]: E0123 00:07:47.402379 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217"
Jan 23 00:07:48.532799 systemd[1]: Started sshd@24-10.0.0.231:22-20.161.92.111:45322.service - OpenSSH per-connection server daemon (20.161.92.111:45322).
Jan 23 00:07:49.150745 sshd[5461]: Accepted publickey for core from 20.161.92.111 port 45322 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI
Jan 23 00:07:49.151546 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 00:07:49.155792 systemd-logind[1635]: New session 25 of user core.
Jan 23 00:07:49.162891 systemd[1]: Started session-25.scope - Session 25 of User core.
Jan 23 00:07:49.635457 sshd[5464]: Connection closed by 20.161.92.111 port 45322
Jan 23 00:07:49.635972 sshd-session[5461]: pam_unix(sshd:session): session closed for user core
Jan 23 00:07:49.640306 systemd-logind[1635]: Session 25 logged out. Waiting for processes to exit.
Jan 23 00:07:49.640576 systemd[1]: sshd@24-10.0.0.231:22-20.161.92.111:45322.service: Deactivated successfully.
Jan 23 00:07:49.642242 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 00:07:49.643865 systemd-logind[1635]: Removed session 25.
Jan 23 00:07:51.400799 kubelet[2916]: E0123 00:07:51.400739 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:07:54.401603 kubelet[2916]: E0123 00:07:54.401267 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a"
Jan 23 00:07:54.750090 systemd[1]: Started sshd@25-10.0.0.231:22-20.161.92.111:37700.service - OpenSSH per-connection server daemon (20.161.92.111:37700).
Jan 23 00:07:55.378630 sshd[5503]: Accepted publickey for core from 20.161.92.111 port 37700 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI
Jan 23 00:07:55.380516 sshd-session[5503]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 00:07:55.386190 systemd-logind[1635]: New session 26 of user core.
Jan 23 00:07:55.392917 systemd[1]: Started session-26.scope - Session 26 of User core.
Jan 23 00:07:55.864825 sshd[5506]: Connection closed by 20.161.92.111 port 37700
Jan 23 00:07:55.865792 sshd-session[5503]: pam_unix(sshd:session): session closed for user core
Jan 23 00:07:55.869759 systemd[1]: sshd@25-10.0.0.231:22-20.161.92.111:37700.service: Deactivated successfully.
Jan 23 00:07:55.872347 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 00:07:55.874964 systemd-logind[1635]: Session 26 logged out. Waiting for processes to exit.
Jan 23 00:07:55.876577 systemd-logind[1635]: Removed session 26.
Jan 23 00:07:57.401474 kubelet[2916]: E0123 00:07:57.401166 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613"
Jan 23 00:07:58.401026 kubelet[2916]: E0123 00:07:58.400786 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6"
Jan 23 00:07:58.401454 kubelet[2916]: E0123 00:07:58.401377 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217"
Jan 23 00:07:59.401825 kubelet[2916]: E0123 00:07:59.401768 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be"
Jan 23 00:08:00.978533 systemd[1]: Started 
sshd@26-10.0.0.231:22-20.161.92.111:37712.service - OpenSSH per-connection server daemon (20.161.92.111:37712).
Jan 23 00:08:01.595912 sshd[5520]: Accepted publickey for core from 20.161.92.111 port 37712 ssh2: RSA SHA256:LM3VPVh2rDQ94ANFrDgvQxNqEoarBMSCGjCeA3ld4WI
Jan 23 00:08:01.598842 sshd-session[5520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 00:08:01.603791 systemd-logind[1635]: New session 27 of user core.
Jan 23 00:08:01.608898 systemd[1]: Started session-27.scope - Session 27 of User core.
Jan 23 00:08:02.089134 sshd[5523]: Connection closed by 20.161.92.111 port 37712
Jan 23 00:08:02.089979 sshd-session[5520]: pam_unix(sshd:session): session closed for user core
Jan 23 00:08:02.095406 systemd[1]: sshd@26-10.0.0.231:22-20.161.92.111:37712.service: Deactivated successfully.
Jan 23 00:08:02.097457 systemd[1]: session-27.scope: Deactivated successfully.
Jan 23 00:08:02.098273 systemd-logind[1635]: Session 27 logged out. Waiting for processes to exit.
Jan 23 00:08:02.101094 systemd-logind[1635]: Removed session 27.
Jan 23 00:08:02.401046 kubelet[2916]: E0123 00:08:02.400746 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:08:09.402281 kubelet[2916]: E0123 00:08:09.402097 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a"
Jan 23 00:08:10.401510 kubelet[2916]: E0123 00:08:10.401460 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6"
Jan 23 00:08:10.402041 containerd[1664]: 
time="2026-01-23T00:08:10.402002334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Jan 23 00:08:10.739750 containerd[1664]: time="2026-01-23T00:08:10.739631266Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:08:10.740959 containerd[1664]: time="2026-01-23T00:08:10.740904230Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Jan 23 00:08:10.741038 containerd[1664]: time="2026-01-23T00:08:10.740988950Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Jan 23 00:08:10.741241 kubelet[2916]: E0123 00:08:10.741181 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 00:08:10.741537 kubelet[2916]: E0123 00:08:10.741245 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 00:08:10.741537 kubelet[2916]: E0123 00:08:10.741321 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:08:10.742197 containerd[1664]: time="2026-01-23T00:08:10.742176633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Jan 23 00:08:11.073464 containerd[1664]: time="2026-01-23T00:08:11.073344387Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:08:11.074715 containerd[1664]: time="2026-01-23T00:08:11.074650831Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Jan 23 00:08:11.075470 containerd[1664]: time="2026-01-23T00:08:11.074716711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Jan 23 00:08:11.075644 kubelet[2916]: E0123 00:08:11.075597 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 00:08:11.075703 kubelet[2916]: E0123 00:08:11.075648 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 00:08:11.075798 kubelet[2916]: E0123 00:08:11.075753 
2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-59b64767c-5cv6g_calico-system(01440e69-1437-4f1f-8ae8-f18c381a9217): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:08:11.075938 kubelet[2916]: E0123 00:08:11.075879 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217"
Jan 23 00:08:12.402235 kubelet[2916]: E0123 00:08:12.402179 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613"
Jan 23 00:08:12.402874 
kubelet[2916]: E0123 00:08:12.402804 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be"
Jan 23 00:08:14.401233 kubelet[2916]: E0123 00:08:14.401173 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d"
Jan 23 00:08:21.402899 containerd[1664]: time="2026-01-23T00:08:21.402584527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Jan 23 00:08:21.741644 containerd[1664]: time="2026-01-23T00:08:21.741583623Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 
00:08:21.743009 containerd[1664]: time="2026-01-23T00:08:21.742972667Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Jan 23 00:08:21.743142 containerd[1664]: time="2026-01-23T00:08:21.743053228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Jan 23 00:08:21.743343 kubelet[2916]: E0123 00:08:21.743304 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 23 00:08:21.743863 kubelet[2916]: E0123 00:08:21.743663 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Jan 23 00:08:21.743863 kubelet[2916]: E0123 00:08:21.743781 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-dsv6d_calico-system(fb016e95-3a65-4e3c-b265-30c14a6c50c6): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:08:21.743863 kubelet[2916]: E0123 00:08:21.743826 2916 pod_workers.go:1324] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6"
Jan 23 00:08:23.402215 containerd[1664]: time="2026-01-23T00:08:23.402151125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Jan 23 00:08:23.749482 containerd[1664]: time="2026-01-23T00:08:23.749417284Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:08:23.751162 containerd[1664]: time="2026-01-23T00:08:23.751073849Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Jan 23 00:08:23.751162 containerd[1664]: time="2026-01-23T00:08:23.751137609Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Jan 23 00:08:23.751362 kubelet[2916]: E0123 00:08:23.751313 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 00:08:23.751666 kubelet[2916]: E0123 00:08:23.751377 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 00:08:23.751666 kubelet[2916]: E0123 00:08:23.751463 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-6qcz2_calico-apiserver(2511aa5d-56b0-481d-8829-6daaa6eae613): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Jan 23 00:08:23.751666 kubelet[2916]: E0123 00:08:23.751598 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613"
Jan 23 00:08:24.401148 containerd[1664]: time="2026-01-23T00:08:24.401108761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Jan 23 00:08:24.729513 containerd[1664]: time="2026-01-23T00:08:24.729458746Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 00:08:24.730932 containerd[1664]: time="2026-01-23T00:08:24.730890510Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Jan 23 00:08:24.730988 
containerd[1664]: time="2026-01-23T00:08:24.730934230Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Jan 23 00:08:24.731165 kubelet[2916]: E0123 00:08:24.731114 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:08:24.731213 kubelet[2916]: E0123 00:08:24.731167 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 00:08:24.731252 kubelet[2916]: E0123 00:08:24.731240 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-557fb68f57-qftrq_calico-system(2c86da6b-b89e-4517-904a-9f7bcd6f830a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 00:08:24.731306 kubelet[2916]: E0123 00:08:24.731271 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:08:25.402010 kubelet[2916]: E0123 00:08:25.401812 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:08:27.401901 containerd[1664]: time="2026-01-23T00:08:27.401762520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 00:08:27.749564 containerd[1664]: time="2026-01-23T00:08:27.748982800Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:08:27.750513 containerd[1664]: time="2026-01-23T00:08:27.750414444Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 00:08:27.750513 containerd[1664]: 
time="2026-01-23T00:08:27.750487084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Jan 23 00:08:27.750711 kubelet[2916]: E0123 00:08:27.750667 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:08:27.750982 kubelet[2916]: E0123 00:08:27.750744 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 00:08:27.750982 kubelet[2916]: E0123 00:08:27.750833 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 00:08:27.751950 containerd[1664]: time="2026-01-23T00:08:27.751918529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 00:08:28.081802 containerd[1664]: time="2026-01-23T00:08:28.081358637Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:08:28.082683 containerd[1664]: time="2026-01-23T00:08:28.082613081Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 00:08:28.082683 containerd[1664]: time="2026-01-23T00:08:28.082655761Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Jan 23 00:08:28.083064 kubelet[2916]: E0123 00:08:28.082847 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:08:28.083064 kubelet[2916]: E0123 00:08:28.082892 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 00:08:28.083064 kubelet[2916]: E0123 00:08:28.082971 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-hdvg2_calico-system(e4fbecc4-8903-42d8-8af9-1aa47331d5be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 00:08:28.083279 kubelet[2916]: E0123 00:08:28.083237 2916 pod_workers.go:1324] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:08:29.402513 containerd[1664]: time="2026-01-23T00:08:29.402348641Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 00:08:29.726319 containerd[1664]: time="2026-01-23T00:08:29.726220053Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 00:08:29.727646 containerd[1664]: time="2026-01-23T00:08:29.727599497Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 00:08:29.727736 containerd[1664]: time="2026-01-23T00:08:29.727636497Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Jan 23 00:08:29.727891 kubelet[2916]: E0123 00:08:29.727838 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:08:29.727891 kubelet[2916]: E0123 00:08:29.727887 2916 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 00:08:29.728173 kubelet[2916]: E0123 00:08:29.727956 2916 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-6b477b4fc8-45hrp_calico-apiserver(46de2902-bb4c-4eda-81ac-f00ac179b50d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 00:08:29.728173 kubelet[2916]: E0123 00:08:29.727987 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:08:32.401825 kubelet[2916]: E0123 00:08:32.401772 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:08:38.402653 kubelet[2916]: E0123 00:08:38.402094 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:08:40.401218 kubelet[2916]: E0123 00:08:40.401163 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:08:40.402115 kubelet[2916]: E0123 00:08:40.401905 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:08:41.401497 kubelet[2916]: E0123 00:08:41.401432 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:08:42.401281 kubelet[2916]: E0123 00:08:42.401217 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:08:44.401244 kubelet[2916]: E0123 00:08:44.400854 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:08:49.400781 kubelet[2916]: E0123 00:08:49.400538 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:08:51.400500 kubelet[2916]: E0123 00:08:51.400437 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:08:52.401216 kubelet[2916]: E0123 00:08:52.401162 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:08:53.998200 systemd[1]: cri-containerd-c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280.scope: Deactivated successfully. Jan 23 00:08:53.998497 systemd[1]: cri-containerd-c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280.scope: Consumed 5.067s CPU time, 63.3M memory peak. Jan 23 00:08:54.000369 systemd[1]: cri-containerd-fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441.scope: Deactivated successfully. Jan 23 00:08:54.000799 systemd[1]: cri-containerd-fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441.scope: Consumed 44.037s CPU time, 97.2M memory peak. 
Jan 23 00:08:54.002439 containerd[1664]: time="2026-01-23T00:08:54.002331230Z" level=info msg="received container exit event container_id:\"c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280\" id:\"c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280\" pid:2761 exit_status:1 exited_at:{seconds:1769126934 nanos:2020749}" Jan 23 00:08:54.002895 containerd[1664]: time="2026-01-23T00:08:54.002732431Z" level=info msg="received container exit event container_id:\"fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441\" id:\"fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441\" pid:3247 exit_status:1 exited_at:{seconds:1769126934 nanos:2540951}" Jan 23 00:08:54.024586 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441-rootfs.mount: Deactivated successfully. Jan 23 00:08:54.031139 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280-rootfs.mount: Deactivated successfully. 
Jan 23 00:08:54.056433 kubelet[2916]: E0123 00:08:54.056230 2916 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.231:44830->10.0.0.196:2379: read: connection timed out" Jan 23 00:08:55.025923 kubelet[2916]: I0123 00:08:55.025891 2916 scope.go:117] "RemoveContainer" containerID="c60b56b0cdec6bfece7b104291b8e392205f00adb4308b81be4d895a35113280" Jan 23 00:08:55.027574 kubelet[2916]: I0123 00:08:55.027552 2916 scope.go:117] "RemoveContainer" containerID="fca3eafb78ba7fd2477c7fe415426bb37474554df91fd59114ee2ec2d3ff2441" Jan 23 00:08:55.027939 containerd[1664]: time="2026-01-23T00:08:55.027905263Z" level=info msg="CreateContainer within sandbox \"5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 23 00:08:55.029807 containerd[1664]: time="2026-01-23T00:08:55.029084466Z" level=info msg="CreateContainer within sandbox \"83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 23 00:08:55.040981 containerd[1664]: time="2026-01-23T00:08:55.040943621Z" level=info msg="Container f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:08:55.042187 containerd[1664]: time="2026-01-23T00:08:55.041847943Z" level=info msg="Container b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:08:55.059401 containerd[1664]: time="2026-01-23T00:08:55.059345154Z" level=info msg="CreateContainer within sandbox \"5cc3ac3f60778586835b7456ac45ad9cd362b4a3485502a02cd0d7d53dc2a192\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f\"" Jan 23 00:08:55.060058 containerd[1664]: time="2026-01-23T00:08:55.060025756Z" 
level=info msg="StartContainer for \"f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f\"" Jan 23 00:08:55.061612 containerd[1664]: time="2026-01-23T00:08:55.061586320Z" level=info msg="connecting to shim f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f" address="unix:///run/containerd/s/0a774f6096ca08883c42fb3f9cb746eda10bb0b4739fd2252c053885cc7f235e" protocol=ttrpc version=3 Jan 23 00:08:55.062565 containerd[1664]: time="2026-01-23T00:08:55.062511043Z" level=info msg="CreateContainer within sandbox \"83dad799f6ab2abd564a9081953d19a1eaee626b4923b3de823e694488c3eb9c\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539\"" Jan 23 00:08:55.066265 containerd[1664]: time="2026-01-23T00:08:55.065581412Z" level=info msg="StartContainer for \"b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539\"" Jan 23 00:08:55.067502 containerd[1664]: time="2026-01-23T00:08:55.066567254Z" level=info msg="connecting to shim b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539" address="unix:///run/containerd/s/229a700e0f1566c9b92a9bc93c10e4c05db231d7da447938cfa7e0fc2dc1b4a6" protocol=ttrpc version=3 Jan 23 00:08:55.099084 systemd[1]: Started cri-containerd-b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539.scope - libcontainer container b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539. Jan 23 00:08:55.102657 systemd[1]: Started cri-containerd-f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f.scope - libcontainer container f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f. 
Jan 23 00:08:55.131606 containerd[1664]: time="2026-01-23T00:08:55.131062440Z" level=info msg="StartContainer for \"b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539\" returns successfully" Jan 23 00:08:55.147368 containerd[1664]: time="2026-01-23T00:08:55.147322007Z" level=info msg="StartContainer for \"f792b71957a1375b551ba820aca2170de06a80b2b19cbbd73e08b10a6365751f\" returns successfully" Jan 23 00:08:55.402580 kubelet[2916]: E0123 00:08:55.402458 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217" Jan 23 00:08:56.402575 kubelet[2916]: E0123 00:08:56.402517 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for 
\"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdvg2" podUID="e4fbecc4-8903-42d8-8af9-1aa47331d5be" Jan 23 00:08:57.401476 kubelet[2916]: E0123 00:08:57.401194 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-dsv6d" podUID="fb016e95-3a65-4e3c-b265-30c14a6c50c6" Jan 23 00:08:58.577714 systemd[1]: cri-containerd-9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0.scope: Deactivated successfully. Jan 23 00:08:58.578017 systemd[1]: cri-containerd-9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0.scope: Consumed 3.938s CPU time, 23.6M memory peak. 
Jan 23 00:08:58.581333 containerd[1664]: time="2026-01-23T00:08:58.581293374Z" level=info msg="received container exit event container_id:\"9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0\" id:\"9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0\" pid:2749 exit_status:1 exited_at:{seconds:1769126938 nanos:580762453}" Jan 23 00:08:58.600924 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0-rootfs.mount: Deactivated successfully. Jan 23 00:08:59.048681 kubelet[2916]: I0123 00:08:59.048443 2916 scope.go:117] "RemoveContainer" containerID="9f8a78637c3b94016faa79dc84dd1bf4a2748cbb6aad10f73fa1682421c841b0" Jan 23 00:08:59.050072 containerd[1664]: time="2026-01-23T00:08:59.050039004Z" level=info msg="CreateContainer within sandbox \"d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 23 00:08:59.058356 containerd[1664]: time="2026-01-23T00:08:59.058307548Z" level=info msg="Container bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2: CDI devices from CRI Config.CDIDevices: []" Jan 23 00:08:59.067421 containerd[1664]: time="2026-01-23T00:08:59.067378894Z" level=info msg="CreateContainer within sandbox \"d37ae00b9503ec6f1c07699ade5b44acb8be0353e45d2d2663417fc8b2584720\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2\"" Jan 23 00:08:59.068904 containerd[1664]: time="2026-01-23T00:08:59.067938575Z" level=info msg="StartContainer for \"bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2\"" Jan 23 00:08:59.069148 containerd[1664]: time="2026-01-23T00:08:59.069123379Z" level=info msg="connecting to shim bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2" address="unix:///run/containerd/s/2a5abcca5ef7165ede6bcec3da5d7d66968b82b20e1c55ecb7980c3e0265f0aa" 
protocol=ttrpc version=3 Jan 23 00:08:59.095482 systemd[1]: Started cri-containerd-bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2.scope - libcontainer container bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2. Jan 23 00:08:59.131842 containerd[1664]: time="2026-01-23T00:08:59.131803999Z" level=info msg="StartContainer for \"bdf8949c5935647e5415b347406e8e4f1cc95197c50f42f4fc1e425abb17c6a2\" returns successfully" Jan 23 00:09:02.475774 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec Jan 23 00:09:03.402154 kubelet[2916]: E0123 00:09:03.402077 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-45hrp" podUID="46de2902-bb4c-4eda-81ac-f00ac179b50d" Jan 23 00:09:03.402677 kubelet[2916]: E0123 00:09:03.402146 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6b477b4fc8-6qcz2" podUID="2511aa5d-56b0-481d-8829-6daaa6eae613" Jan 23 00:09:04.057135 kubelet[2916]: E0123 00:09:04.057075 2916 controller.go:195] "Failed to update lease" err="Put 
\"https://10.0.0.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-n-22c0b85714?timeout=10s\": context deadline exceeded" Jan 23 00:09:06.336361 systemd[1]: cri-containerd-b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539.scope: Deactivated successfully. Jan 23 00:09:06.337135 containerd[1664]: time="2026-01-23T00:09:06.336804464Z" level=info msg="received container exit event container_id:\"b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539\" id:\"b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539\" pid:5681 exit_status:1 exited_at:{seconds:1769126946 nanos:336570504}" Jan 23 00:09:06.354175 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b22f5e4b83afc8763ccbb7c7c7c1fd6bbe68e027c8baa432cf13f68a29d60539-rootfs.mount: Deactivated successfully. Jan 23 00:09:06.401256 kubelet[2916]: E0123 00:09:06.401201 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-557fb68f57-qftrq" podUID="2c86da6b-b89e-4517-904a-9f7bcd6f830a" Jan 23 00:09:06.401886 kubelet[2916]: E0123 00:09:06.401836 2916 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-59b64767c-5cv6g" podUID="01440e69-1437-4f1f-8ae8-f18c381a9217"