Dec 12 17:32:01.773185 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 12 17:32:01.773207 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 12 17:32:01.773217 kernel: KASLR enabled
Dec 12 17:32:01.773222 kernel: efi: EFI v2.7 by EDK II
Dec 12 17:32:01.773228 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Dec 12 17:32:01.773233 kernel: random: crng init done
Dec 12 17:32:01.773240 kernel: secureboot: Secure boot disabled
Dec 12 17:32:01.773246 kernel: ACPI: Early table checksum verification disabled
Dec 12 17:32:01.773252 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Dec 12 17:32:01.773258 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Dec 12 17:32:01.773265 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773271 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773276 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773282 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773289 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773295 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773303 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773310 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773316 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773322 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:32:01.773328 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 12 17:32:01.773334 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 12 17:32:01.773341 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 12 17:32:01.773347 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Dec 12 17:32:01.773353 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Dec 12 17:32:01.773359 kernel: Zone ranges:
Dec 12 17:32:01.773366 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 12 17:32:01.773372 kernel: DMA32 empty
Dec 12 17:32:01.773378 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 12 17:32:01.773384 kernel: Device empty
Dec 12 17:32:01.773390 kernel: Movable zone start for each node
Dec 12 17:32:01.773396 kernel: Early memory node ranges
Dec 12 17:32:01.773402 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Dec 12 17:32:01.773408 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Dec 12 17:32:01.773414 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Dec 12 17:32:01.773421 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Dec 12 17:32:01.773427 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Dec 12 17:32:01.773433 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Dec 12 17:32:01.773440 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 12 17:32:01.773446 kernel: psci: probing for conduit method from ACPI.
Dec 12 17:32:01.773455 kernel: psci: PSCIv1.3 detected in firmware.
Dec 12 17:32:01.773461 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 12 17:32:01.773468 kernel: psci: Trusted OS migration not required
Dec 12 17:32:01.773475 kernel: psci: SMC Calling Convention v1.1
Dec 12 17:32:01.773482 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 12 17:32:01.773488 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 12 17:32:01.773495 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 12 17:32:01.773501 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Dec 12 17:32:01.773508 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Dec 12 17:32:01.773514 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 12 17:32:01.773520 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 12 17:32:01.773527 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Dec 12 17:32:01.773533 kernel: Detected PIPT I-cache on CPU0
Dec 12 17:32:01.773540 kernel: CPU features: detected: GIC system register CPU interface
Dec 12 17:32:01.773546 kernel: CPU features: detected: Spectre-v4
Dec 12 17:32:01.773554 kernel: CPU features: detected: Spectre-BHB
Dec 12 17:32:01.773560 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 12 17:32:01.773567 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 12 17:32:01.773573 kernel: CPU features: detected: ARM erratum 1418040
Dec 12 17:32:01.773580 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 12 17:32:01.773586 kernel: alternatives: applying boot alternatives
Dec 12 17:32:01.773594 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:32:01.773601 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 12 17:32:01.773607 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 12 17:32:01.773614 kernel: Fallback order for Node 0: 0
Dec 12 17:32:01.773621 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Dec 12 17:32:01.773628 kernel: Policy zone: Normal
Dec 12 17:32:01.773634 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 17:32:01.773641 kernel: software IO TLB: area num 4.
Dec 12 17:32:01.773647 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 12 17:32:01.773654 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 12 17:32:01.773660 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 17:32:01.773667 kernel: rcu: RCU event tracing is enabled.
Dec 12 17:32:01.773674 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Dec 12 17:32:01.773680 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 17:32:01.773687 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 17:32:01.773693 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 17:32:01.773701 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 12 17:32:01.773708 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 12 17:32:01.773714 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 12 17:32:01.773721 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 12 17:32:01.773727 kernel: GICv3: 256 SPIs implemented
Dec 12 17:32:01.773733 kernel: GICv3: 0 Extended SPIs implemented
Dec 12 17:32:01.773740 kernel: Root IRQ handler: gic_handle_irq
Dec 12 17:32:01.773746 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 12 17:32:01.773752 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 12 17:32:01.773759 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 12 17:32:01.773765 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 12 17:32:01.773772 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Dec 12 17:32:01.773780 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Dec 12 17:32:01.773786 kernel: GICv3: using LPI property table @0x0000000100130000
Dec 12 17:32:01.773793 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Dec 12 17:32:01.773799 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 17:32:01.773806 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:32:01.773812 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 12 17:32:01.773818 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 12 17:32:01.773825 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 12 17:32:01.773832 kernel: arm-pv: using stolen time PV
Dec 12 17:32:01.773838 kernel: Console: colour dummy device 80x25
Dec 12 17:32:01.773860 kernel: ACPI: Core revision 20240827
Dec 12 17:32:01.773868 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 12 17:32:01.773874 kernel: pid_max: default: 32768 minimum: 301
Dec 12 17:32:01.773881 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 17:32:01.773888 kernel: landlock: Up and running.
Dec 12 17:32:01.773894 kernel: SELinux: Initializing.
Dec 12 17:32:01.773901 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 17:32:01.773908 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 17:32:01.773915 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 17:32:01.773921 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 17:32:01.773930 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 17:32:01.773937 kernel: Remapping and enabling EFI services.
Dec 12 17:32:01.773944 kernel: smp: Bringing up secondary CPUs ...
Dec 12 17:32:01.773968 kernel: Detected PIPT I-cache on CPU1
Dec 12 17:32:01.773975 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 12 17:32:01.773982 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Dec 12 17:32:01.773989 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:32:01.773995 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 12 17:32:01.774002 kernel: Detected PIPT I-cache on CPU2
Dec 12 17:32:01.774016 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Dec 12 17:32:01.774023 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Dec 12 17:32:01.774030 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:32:01.774038 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Dec 12 17:32:01.774045 kernel: Detected PIPT I-cache on CPU3
Dec 12 17:32:01.774052 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Dec 12 17:32:01.774059 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Dec 12 17:32:01.774066 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:32:01.774074 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Dec 12 17:32:01.774081 kernel: smp: Brought up 1 node, 4 CPUs
Dec 12 17:32:01.774088 kernel: SMP: Total of 4 processors activated.
Dec 12 17:32:01.774095 kernel: CPU: All CPU(s) started at EL1
Dec 12 17:32:01.774102 kernel: CPU features: detected: 32-bit EL0 Support
Dec 12 17:32:01.774109 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 12 17:32:01.774116 kernel: CPU features: detected: Common not Private translations
Dec 12 17:32:01.774123 kernel: CPU features: detected: CRC32 instructions
Dec 12 17:32:01.774130 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 12 17:32:01.774138 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 12 17:32:01.774145 kernel: CPU features: detected: LSE atomic instructions
Dec 12 17:32:01.774152 kernel: CPU features: detected: Privileged Access Never
Dec 12 17:32:01.774159 kernel: CPU features: detected: RAS Extension Support
Dec 12 17:32:01.774166 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 12 17:32:01.774173 kernel: alternatives: applying system-wide alternatives
Dec 12 17:32:01.774180 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Dec 12 17:32:01.774187 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Dec 12 17:32:01.774194 kernel: devtmpfs: initialized
Dec 12 17:32:01.774203 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 17:32:01.774210 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Dec 12 17:32:01.774217 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 12 17:32:01.774224 kernel: 0 pages in range for non-PLT usage
Dec 12 17:32:01.774231 kernel: 508400 pages in range for PLT usage
Dec 12 17:32:01.774238 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 17:32:01.774244 kernel: SMBIOS 3.0.0 present.
Dec 12 17:32:01.774251 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Dec 12 17:32:01.774259 kernel: DMI: Memory slots populated: 1/1
Dec 12 17:32:01.774267 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 17:32:01.774274 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 12 17:32:01.774281 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 12 17:32:01.774288 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 12 17:32:01.774295 kernel: audit: initializing netlink subsys (disabled)
Dec 12 17:32:01.774302 kernel: audit: type=2000 audit(0.041:1): state=initialized audit_enabled=0 res=1
Dec 12 17:32:01.774309 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 17:32:01.774316 kernel: cpuidle: using governor menu
Dec 12 17:32:01.774323 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 12 17:32:01.774331 kernel: ASID allocator initialised with 32768 entries
Dec 12 17:32:01.774338 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 17:32:01.774345 kernel: Serial: AMBA PL011 UART driver
Dec 12 17:32:01.774352 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 17:32:01.774359 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 17:32:01.774366 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 12 17:32:01.774373 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 12 17:32:01.774380 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 17:32:01.774387 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 17:32:01.774395 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 12 17:32:01.774402 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 12 17:32:01.774409 kernel: ACPI: Added _OSI(Module Device)
Dec 12 17:32:01.774415 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 17:32:01.774422 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 17:32:01.774429 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 17:32:01.774436 kernel: ACPI: Interpreter enabled
Dec 12 17:32:01.774443 kernel: ACPI: Using GIC for interrupt routing
Dec 12 17:32:01.774450 kernel: ACPI: MCFG table detected, 1 entries
Dec 12 17:32:01.774458 kernel: ACPI: CPU0 has been hot-added
Dec 12 17:32:01.774465 kernel: ACPI: CPU1 has been hot-added
Dec 12 17:32:01.774472 kernel: ACPI: CPU2 has been hot-added
Dec 12 17:32:01.774479 kernel: ACPI: CPU3 has been hot-added
Dec 12 17:32:01.774486 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 12 17:32:01.774493 kernel: printk: legacy console [ttyAMA0] enabled
Dec 12 17:32:01.774500 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 12 17:32:01.774629 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 17:32:01.774695 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 12 17:32:01.774755 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 12 17:32:01.774813 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 12 17:32:01.774871 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 12 17:32:01.774880 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 12 17:32:01.774888 kernel: PCI host bridge to bus 0000:00
Dec 12 17:32:01.774965 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 12 17:32:01.775024 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 12 17:32:01.775080 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 12 17:32:01.775134 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 12 17:32:01.775214 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 12 17:32:01.775291 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.775354 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Dec 12 17:32:01.775414 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 12 17:32:01.775477 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Dec 12 17:32:01.775538 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 12 17:32:01.775605 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.775667 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Dec 12 17:32:01.775727 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 12 17:32:01.775786 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Dec 12 17:32:01.775852 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.775915 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Dec 12 17:32:01.775988 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 12 17:32:01.776051 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Dec 12 17:32:01.776111 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 12 17:32:01.776177 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.776238 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Dec 12 17:32:01.776298 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 12 17:32:01.776361 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 12 17:32:01.776426 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.776487 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Dec 12 17:32:01.776546 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 12 17:32:01.776606 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Dec 12 17:32:01.776667 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 12 17:32:01.776736 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.776806 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Dec 12 17:32:01.776872 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 12 17:32:01.776940 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Dec 12 17:32:01.777029 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 12 17:32:01.777105 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.777168 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Dec 12 17:32:01.777228 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 12 17:32:01.777317 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.777380 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Dec 12 17:32:01.777439 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 12 17:32:01.777510 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.777571 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Dec 12 17:32:01.777631 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 12 17:32:01.777697 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.777761 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Dec 12 17:32:01.777821 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 12 17:32:01.777906 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.777991 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Dec 12 17:32:01.778054 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 12 17:32:01.778123 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.778188 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Dec 12 17:32:01.778247 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 12 17:32:01.778316 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.778377 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Dec 12 17:32:01.778438 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 12 17:32:01.778506 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.778574 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Dec 12 17:32:01.778640 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 12 17:32:01.778708 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.778769 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Dec 12 17:32:01.778829 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 12 17:32:01.778895 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.778976 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Dec 12 17:32:01.779044 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 12 17:32:01.779112 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.779174 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Dec 12 17:32:01.779234 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 12 17:32:01.779303 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.779365 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Dec 12 17:32:01.779433 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 12 17:32:01.779497 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Dec 12 17:32:01.779559 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Dec 12 17:32:01.779626 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.779689 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Dec 12 17:32:01.779748 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 12 17:32:01.779807 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Dec 12 17:32:01.779866 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Dec 12 17:32:01.779939 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.780013 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Dec 12 17:32:01.780073 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 12 17:32:01.780132 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Dec 12 17:32:01.780192 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Dec 12 17:32:01.780270 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.780339 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Dec 12 17:32:01.780421 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 12 17:32:01.780483 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Dec 12 17:32:01.780542 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Dec 12 17:32:01.780609 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.780669 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Dec 12 17:32:01.780731 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 12 17:32:01.780792 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Dec 12 17:32:01.780859 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Dec 12 17:32:01.780928 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.781008 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Dec 12 17:32:01.781071 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 12 17:32:01.781133 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Dec 12 17:32:01.781198 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Dec 12 17:32:01.781265 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.781326 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Dec 12 17:32:01.781389 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 12 17:32:01.781448 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Dec 12 17:32:01.781510 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Dec 12 17:32:01.781577 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.781639 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Dec 12 17:32:01.781701 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 12 17:32:01.781764 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Dec 12 17:32:01.781828 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Dec 12 17:32:01.781929 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.782018 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Dec 12 17:32:01.782086 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 12 17:32:01.782146 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Dec 12 17:32:01.782206 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:32:01.782280 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.782349 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Dec 12 17:32:01.782411 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 12 17:32:01.782474 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Dec 12 17:32:01.782552 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:32:01.782619 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.782682 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Dec 12 17:32:01.782742 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 12 17:32:01.782809 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Dec 12 17:32:01.782869 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:32:01.782937 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.783013 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Dec 12 17:32:01.783077 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 12 17:32:01.783138 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Dec 12 17:32:01.783199 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:32:01.783266 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.783333 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Dec 12 17:32:01.783396 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 12 17:32:01.783461 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Dec 12 17:32:01.783524 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:32:01.783594 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.783655 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Dec 12 17:32:01.783715 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 12 17:32:01.783775 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Dec 12 17:32:01.783835 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:32:01.783919 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.783992 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Dec 12 17:32:01.784057 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 12 17:32:01.784119 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Dec 12 17:32:01.784179 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:32:01.784245 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:32:01.784305 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Dec 12 17:32:01.784366 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 12 17:32:01.784426 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Dec 12 17:32:01.784487 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:32:01.784558 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:32:01.784622 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Dec 12 17:32:01.784685 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 12 17:32:01.784746 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 17:32:01.784819 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 12 17:32:01.784882 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Dec 12 17:32:01.784965 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 12 17:32:01.785030 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 12 17:32:01.785092 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 12 17:32:01.785162 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:32:01.785226 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 12 17:32:01.785299 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:32:01.785374 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 12 17:32:01.785439 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 12 17:32:01.785510 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 12 17:32:01.785572 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 12 17:32:01.785634 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 12 17:32:01.785697 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 12 17:32:01.785760 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:32:01.785820 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:32:01.785903 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 12 17:32:01.785983 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 12 17:32:01.786051 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 12 17:32:01.786118 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 17:32:01.786179 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:32:01.786239 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:32:01.786304 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 17:32:01.786364 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 12 17:32:01.786424 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 12 17:32:01.786487 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 17:32:01.786550 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:32:01.786617 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:32:01.786680 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 17:32:01.786741 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:32:01.786802 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:32:01.786865 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 17:32:01.786925 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 12 17:32:01.787009 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 12 17:32:01.787077 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 17:32:01.787138 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:32:01.787198 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:32:01.787262 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 17:32:01.787322 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:32:01.787382 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:32:01.787449 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 12 17:32:01.787509 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 12 17:32:01.787582 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 12 17:32:01.787646 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 12 17:32:01.787706 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 12 17:32:01.787766 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 12 17:32:01.787833 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 12 17:32:01.787894 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 12 17:32:01.787966 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 12 17:32:01.788035 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 12 17:32:01.788096 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 12 17:32:01.788156 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 12 17:32:01.788221 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 12 17:32:01.788283 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000
Dec 12 17:32:01.788351 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000
Dec 12 17:32:01.788420 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 12 17:32:01.788486 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000
Dec 12 17:32:01.788547 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000
Dec 12 17:32:01.788610 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Dec 12 17:32:01.788675 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000
Dec 12 17:32:01.788741 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000
Dec 12 17:32:01.788805 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Dec 12 17:32:01.788866 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000
Dec 12 17:32:01.788926 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000
Dec 12 17:32:01.789010 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Dec 12 17:32:01.789077 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000
Dec 12 17:32:01.789142 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000
Dec 12 17:32:01.789205 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Dec 12 17:32:01.789266 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000
Dec 12 17:32:01.789326 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000
Dec 12 17:32:01.789395 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Dec 12 17:32:01.789457 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000
Dec 12 17:32:01.789517 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000
Dec 12 17:32:01.789581 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Dec 12 17:32:01.789642 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000
Dec 12 17:32:01.789702 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000
Dec 12 17:32:01.789765 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Dec 12 17:32:01.789826 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000
Dec 12 17:32:01.789901 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000
Dec 12 17:32:01.789985 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Dec 12 17:32:01.790052 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000
Dec 12 17:32:01.790113 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000
Dec 12 17:32:01.790175 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Dec 12 17:32:01.790236 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000
Dec 12 17:32:01.790302 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000
Dec 12 17:32:01.790365 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Dec 12 17:32:01.790426 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000
Dec 12 17:32:01.790488 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000
Dec 12 17:32:01.790550 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Dec 12 17:32:01.790610 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000
Dec 12 17:32:01.790671 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000
Dec 12 17:32:01.790732 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Dec 12 17:32:01.790795 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000
Dec 12 17:32:01.790857 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000
Dec 12 17:32:01.790919 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Dec 12 17:32:01.790992 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000
Dec 12 17:32:01.791053 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000
Dec 12 17:32:01.791115 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Dec 12 17:32:01.791176 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000
Dec 12 17:32:01.791242 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000
Dec 12 17:32:01.791307 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Dec 12 17:32:01.791368 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000
Dec 12 17:32:01.791429 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000
Dec 12 17:32:01.791491 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Dec 12 17:32:01.791553 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000
Dec 12 17:32:01.791614 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000
Dec 12 17:32:01.791679 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Dec 12 17:32:01.791742 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000
Dec 12 17:32:01.791803 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000
Dec 12 17:32:01.791868 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Dec 12 17:32:01.791929 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000
Dec 12 17:32:01.792003 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000
Dec 12 17:32:01.792066 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Dec 12 17:32:01.792128 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Dec 12 17:32:01.792199 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Dec 12 17:32:01.792261 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Dec 12 17:32:01.792326 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Dec 12 17:32:01.792389 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Dec 12 17:32:01.792452 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Dec 12 17:32:01.792513 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Dec 12 17:32:01.792575 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Dec 12 17:32:01.792638 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Dec 12 17:32:01.792700 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Dec 12 17:32:01.792761 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Dec 12 17:32:01.792823 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Dec 12 17:32:01.792885 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Dec 12 17:32:01.792959 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Dec 12 17:32:01.793030 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Dec 12 17:32:01.793097 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Dec 12 17:32:01.793177 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Dec 12 17:32:01.793241 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned
Dec 12 17:32:01.793304 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned
Dec 12 17:32:01.793365 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned
Dec 12 17:32:01.793426 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned
Dec 12 17:32:01.793486 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned
Dec 12 17:32:01.793551 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned
Dec 12 17:32:01.793614 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned
Dec 12 17:32:01.793676 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned
Dec 12 17:32:01.793739 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned
Dec 12 17:32:01.793798 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned
Dec 12 17:32:01.793868 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned
Dec 12 17:32:01.793932 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned
Dec 12 17:32:01.794009 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned
Dec 12 17:32:01.794076 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned
Dec 12 17:32:01.794141 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned
Dec 12 17:32:01.794208 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned
Dec 12 17:32:01.794271 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned
Dec 12 17:32:01.794333 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned
Dec 12 17:32:01.794398 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned
Dec 12 17:32:01.794459 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned
Dec 12 17:32:01.794529 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned
Dec 12 17:32:01.794606 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned
Dec 12 17:32:01.794670 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned
Dec 12 17:32:01.794731 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned
Dec 12 17:32:01.794794 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned
Dec 12 17:32:01.794856 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned
Dec 12 17:32:01.794916 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned
Dec 12 17:32:01.794991 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned
Dec 12 17:32:01.795054 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned
Dec 12 17:32:01.795115 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned
Dec 12 17:32:01.795185 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned
Dec 12 17:32:01.795247 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned
Dec 12 17:32:01.795310 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned
Dec 12 17:32:01.795371 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned
Dec 12 17:32:01.795433 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned
Dec 12 17:32:01.795494 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned
Dec 12 17:32:01.795556 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned
Dec 12 17:32:01.795617 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned
Dec 12 17:32:01.795682 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned
Dec 12 17:32:01.795745 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned
Dec 12 17:32:01.795808 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned
Dec 12 17:32:01.795868 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned
Dec 12 17:32:01.795935 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned
Dec 12 17:32:01.796008 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned
Dec 12 17:32:01.796073 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned
Dec 12 17:32:01.796139 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned
Dec 12 17:32:01.796204 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned
Dec 12 17:32:01.796267 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned
Dec 12 17:32:01.796331 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned
Dec 12 17:32:01.796393 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 17:32:01.796475 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned
Dec 12 17:32:01.796539 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 17:32:01.796602 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned
Dec 12 17:32:01.796663 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 17:32:01.796726 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned
Dec 12 17:32:01.796791 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 17:32:01.796854 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned
Dec 12 17:32:01.796914 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 17:32:01.796982 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned
Dec 12 17:32:01.797044 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 17:32:01.797107 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned
Dec 12 17:32:01.797167 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 17:32:01.797232 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned
Dec 12 17:32:01.797293 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 17:32:01.797362 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned
Dec 12 17:32:01.797428 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 17:32:01.797493 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned
Dec 12 17:32:01.797555 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned
Dec 12 17:32:01.797618 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned
Dec 12 17:32:01.797679 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned
Dec 12 17:32:01.797750 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned
Dec 12 17:32:01.797811 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned
Dec 12 17:32:01.797882 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned
Dec 12 17:32:01.797958 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned
Dec 12 17:32:01.798032 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned
Dec 12 17:32:01.798100 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned
Dec 12 17:32:01.798165 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned
Dec 12 17:32:01.798231 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned
Dec 12 17:32:01.798292 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned
Dec 12 17:32:01.798352 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.798413 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.798474 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned
Dec 12 17:32:01.798535 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.798605 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.798668 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned
Dec 12 17:32:01.798730 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.798796 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.798859 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned
Dec 12 17:32:01.798927 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.799018 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.799085 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned
Dec 12 17:32:01.799147 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.799207 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.799273 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned
Dec 12 17:32:01.799335 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.799396 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.799459 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned
Dec 12 17:32:01.799526 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.799592 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.799654 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned
Dec 12 17:32:01.799715 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.799783 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.799846 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned
Dec 12 17:32:01.799910 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.799984 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.800052 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned
Dec 12 17:32:01.800116 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.800184 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.800246 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned
Dec 12 17:32:01.800313 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.800374 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.800436 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned
Dec 12 17:32:01.800496 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.800557 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.800625 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned
Dec 12 17:32:01.800686 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.800748 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.800809 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned
Dec 12 17:32:01.800869 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.800930 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.801017 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned
Dec 12 17:32:01.801086 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.801151 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.801212 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned
Dec 12 17:32:01.801273 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.801334 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.801396 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned
Dec 12 17:32:01.801457 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.801519 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.801581 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned
Dec 12 17:32:01.801645 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.801706 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.801768 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 17:32:01.801830 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 17:32:01.801904 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 17:32:01.801987 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 17:32:01.802053 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 17:32:01.802118 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 17:32:01.802198 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 17:32:01.802263 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 17:32:01.802333 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 17:32:01.802397 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned
Dec 12 17:32:01.802461 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned
Dec 12 17:32:01.802540 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned
Dec 12 17:32:01.802608 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned
Dec 12 17:32:01.802672 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned
Dec 12 17:32:01.802733 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned
Dec 12 17:32:01.802793 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.802857 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.802927 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.802999 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803067 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803129 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803190 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803263 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803325 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803390 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803454 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803515 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803577 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803638 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803699 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803761 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803822 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.803891 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.803964 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804029 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804093 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804165 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804229 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804291 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804354 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804420 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804483 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804552 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804616 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804677 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804739 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804802 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.804870 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.804934 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.805004 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:32:01.805072 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:32:01.805142 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Dec 12 17:32:01.805214 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Dec 12 17:32:01.805278 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Dec 12 17:32:01.805348 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 12 17:32:01.805408 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:32:01.805469 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 12 17:32:01.805535 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Dec 12 17:32:01.805596 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 12 17:32:01.805658 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:32:01.805720 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 12 17:32:01.805786 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Dec 12 17:32:01.805859 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Dec 12 17:32:01.805924 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 12 17:32:01.806021 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:32:01.806083 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 12 17:32:01.806152 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Dec 12 17:32:01.806218 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 12 17:32:01.806281 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:32:01.806343 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 12 17:32:01.806409 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Dec 12 17:32:01.806476 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Dec 12 17:32:01.806538 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 12 17:32:01.806600 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:32:01.806664 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 12 17:32:01.806733
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 12 17:32:01.806797 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 12 17:32:01.806859 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Dec 12 17:32:01.806921 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 12 17:32:01.806998 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 12 17:32:01.807064 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Dec 12 17:32:01.807128 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 12 17:32:01.807191 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 12 17:32:01.807252 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Dec 12 17:32:01.807313 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 12 17:32:01.807373 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 12 17:32:01.807435 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Dec 12 17:32:01.807496 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 12 17:32:01.807559 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:32:01.807621 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Dec 12 17:32:01.807682 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Dec 12 17:32:01.807743 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Dec 12 17:32:01.807804 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Dec 12 17:32:01.807865 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Dec 12 17:32:01.807925 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Dec 12 17:32:01.807997 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Dec 12 17:32:01.808062 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Dec 12 17:32:01.808123 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Dec 12 17:32:01.808185 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Dec 12 17:32:01.808245 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Dec 12 17:32:01.808306 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Dec 12 17:32:01.808371 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Dec 12 17:32:01.808432 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Dec 12 17:32:01.808492 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 12 17:32:01.808555 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Dec 12 17:32:01.808616 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Dec 12 17:32:01.808676 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 12 17:32:01.808757 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Dec 12 17:32:01.808820 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Dec 12 17:32:01.808882 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 12 17:32:01.808944 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Dec 12 17:32:01.809023 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Dec 12 17:32:01.809085 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Dec 12 17:32:01.809147 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Dec 12 17:32:01.809212 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Dec 12 17:32:01.809273 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Dec 12 17:32:01.809334 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 12 17:32:01.809395 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Dec 12 17:32:01.809454 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Dec 12 17:32:01.809514 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Dec 12 17:32:01.809576 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 12 17:32:01.809636 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Dec 12 17:32:01.809698 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Dec 12 17:32:01.809759 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Dec 12 17:32:01.809820 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 12 17:32:01.809897 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Dec 12 17:32:01.809983 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Dec 12 17:32:01.810048 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Dec 12 17:32:01.810111 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 12 17:32:01.810172 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Dec 12 17:32:01.810237 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Dec 12 17:32:01.810298 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 12 17:32:01.810360 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 12 17:32:01.810421 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Dec 12 17:32:01.810480 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Dec 12 17:32:01.810540 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 12 17:32:01.810602 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 12 17:32:01.810664 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Dec 12 17:32:01.810726 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Dec 12 17:32:01.810788 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 12 17:32:01.810850 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 12 17:32:01.810912 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Dec 12 17:32:01.810982 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Dec 12 17:32:01.811044 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Dec 12 17:32:01.811106 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 12 17:32:01.811167 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Dec 12 17:32:01.811227 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Dec 12 17:32:01.811290 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Dec 12 17:32:01.811353 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 12 17:32:01.811414 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Dec 12 17:32:01.811475 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Dec 12 17:32:01.811535 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Dec 12 17:32:01.811597 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 12 17:32:01.811658 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Dec 12 17:32:01.811718 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Dec 12 
17:32:01.811780 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Dec 12 17:32:01.811843 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 12 17:32:01.811905 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Dec 12 17:32:01.811973 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Dec 12 17:32:01.812035 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Dec 12 17:32:01.812097 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 12 17:32:01.812157 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Dec 12 17:32:01.812218 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Dec 12 17:32:01.812278 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 12 17:32:01.812356 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 12 17:32:01.812419 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Dec 12 17:32:01.812480 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Dec 12 17:32:01.812540 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 12 17:32:01.812605 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 12 17:32:01.812666 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Dec 12 17:32:01.812726 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Dec 12 17:32:01.812788 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 12 17:32:01.812866 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 12 17:32:01.812936 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Dec 12 17:32:01.813014 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Dec 12 17:32:01.813076 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Dec 12 17:32:01.813140 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 12 17:32:01.813195 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 12 17:32:01.813249 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 12 17:32:01.813318 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 12 17:32:01.813378 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 12 17:32:01.813441 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 12 17:32:01.813498 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 12 17:32:01.813561 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 12 17:32:01.813618 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 12 17:32:01.813681 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 12 17:32:01.813741 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 12 17:32:01.813811 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 12 17:32:01.813892 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 12 17:32:01.814012 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 12 17:32:01.814076 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 12 17:32:01.814142 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 12 17:32:01.814203 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 12 17:32:01.814274 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 12 
17:32:01.814332 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 12 17:32:01.814405 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 12 17:32:01.814464 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:32:01.814529 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Dec 12 17:32:01.814591 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Dec 12 17:32:01.814660 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Dec 12 17:32:01.814734 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Dec 12 17:32:01.814797 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Dec 12 17:32:01.814856 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Dec 12 17:32:01.814933 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Dec 12 17:32:01.815013 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Dec 12 17:32:01.815088 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Dec 12 17:32:01.815146 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 12 17:32:01.815209 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Dec 12 17:32:01.815265 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 12 17:32:01.815329 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Dec 12 17:32:01.815389 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 12 17:32:01.815456 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Dec 12 17:32:01.815518 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Dec 12 17:32:01.815581 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Dec 12 17:32:01.815638 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Dec 12 17:32:01.815703 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Dec 12 17:32:01.815765 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Dec 12 17:32:01.815823 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Dec 12 17:32:01.815888 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Dec 12 17:32:01.815964 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Dec 12 17:32:01.816023 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Dec 12 17:32:01.816087 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Dec 12 17:32:01.816147 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Dec 12 17:32:01.816203 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Dec 12 17:32:01.816275 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Dec 12 17:32:01.816336 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Dec 12 17:32:01.816392 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 12 17:32:01.816465 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Dec 12 17:32:01.816522 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Dec 12 17:32:01.816581 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 12 17:32:01.816644 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Dec 12 17:32:01.816702 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Dec 12 17:32:01.816757 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Dec 12 17:32:01.816820 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Dec 12 17:32:01.816883 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Dec 12 17:32:01.816940 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Dec 12 17:32:01.817031 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Dec 12 17:32:01.817088 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Dec 12 17:32:01.817144 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Dec 12 17:32:01.817207 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 12 17:32:01.817263 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Dec 12 17:32:01.817319 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Dec 12 17:32:01.817394 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Dec 12 17:32:01.817456 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Dec 12 17:32:01.817513 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Dec 12 17:32:01.817579 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Dec 12 17:32:01.817637 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Dec 12 17:32:01.817693 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Dec 12 17:32:01.817754 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Dec 12 17:32:01.817814 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Dec 12 17:32:01.817893 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 12 17:32:01.817983 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Dec 12 17:32:01.818042 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Dec 12 17:32:01.818103 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 12 17:32:01.818167 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Dec 12 17:32:01.818228 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Dec 12 17:32:01.818287 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 12 17:32:01.818354 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Dec 12 17:32:01.818411 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Dec 12 17:32:01.818474 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Dec 12 17:32:01.818485 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 12 17:32:01.818492 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 12 17:32:01.818500 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 12 17:32:01.818509 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 12 17:32:01.818517 kernel: iommu: Default domain type: Translated Dec 12 17:32:01.818524 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 12 17:32:01.818532 kernel: efivars: Registered efivars operations Dec 12 17:32:01.818539 kernel: vgaarb: loaded Dec 12 17:32:01.818547 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 12 17:32:01.818558 kernel: VFS: Disk quotas dquot_6.6.0 Dec 12 17:32:01.818565 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 12 17:32:01.818573 kernel: pnp: PnP ACPI init Dec 12 17:32:01.818643 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 12 17:32:01.818656 kernel: pnp: PnP ACPI: found 1 devices Dec 12 17:32:01.818664 kernel: NET: Registered 
PF_INET protocol family Dec 12 17:32:01.818671 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 12 17:32:01.818679 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 12 17:32:01.818687 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 12 17:32:01.818694 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 12 17:32:01.818702 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 12 17:32:01.818709 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 12 17:32:01.818718 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 12 17:32:01.818726 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 12 17:32:01.818734 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 12 17:32:01.818801 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 12 17:32:01.818812 kernel: PCI: CLS 0 bytes, default 64 Dec 12 17:32:01.818820 kernel: kvm [1]: HYP mode not available Dec 12 17:32:01.818827 kernel: Initialise system trusted keyrings Dec 12 17:32:01.818835 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 12 17:32:01.818842 kernel: Key type asymmetric registered Dec 12 17:32:01.818851 kernel: Asymmetric key parser 'x509' registered Dec 12 17:32:01.818858 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 12 17:32:01.818870 kernel: io scheduler mq-deadline registered Dec 12 17:32:01.818878 kernel: io scheduler kyber registered Dec 12 17:32:01.818885 kernel: io scheduler bfq registered Dec 12 17:32:01.818896 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 12 17:32:01.818974 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Dec 12 17:32:01.819038 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Dec 12 17:32:01.819105 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.819172 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Dec 12 17:32:01.819234 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Dec 12 17:32:01.819294 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.819357 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Dec 12 17:32:01.819417 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Dec 12 17:32:01.819482 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.819550 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Dec 12 17:32:01.819614 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Dec 12 17:32:01.819675 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.819737 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Dec 12 17:32:01.819798 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Dec 12 17:32:01.819858 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.819925 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Dec 12 17:32:01.819998 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Dec 12 17:32:01.820062 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.820129 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Dec 12 17:32:01.820190 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Dec 12 17:32:01.820251 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.820317 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Dec 12 17:32:01.820379 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Dec 12 17:32:01.820443 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.820453 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 12 17:32:01.820513 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Dec 12 17:32:01.820577 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Dec 12 17:32:01.820638 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.820706 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Dec 12 17:32:01.820768 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Dec 12 17:32:01.820828 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.820895 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Dec 12 17:32:01.820970 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Dec 12 17:32:01.821033 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.821103 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Dec 12 17:32:01.821165 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Dec 12 17:32:01.821229 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.821299 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Dec 12 17:32:01.821422 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Dec 12 17:32:01.821648 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.821711 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Dec 12 17:32:01.821775 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Dec 12 17:32:01.821840 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.821921 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Dec 12 17:32:01.822005 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Dec 12 17:32:01.822074 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.822139 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Dec 12 17:32:01.822201 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Dec 12 17:32:01.822261 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Dec 12 17:32:01.822280 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 12 17:32:01.822341 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Dec 12 17:32:01.822402 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Dec 12 17:32:01.822470 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.822533 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Dec 12 17:32:01.822594 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Dec 12 17:32:01.822654 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.822716 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Dec 12 17:32:01.822780 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Dec 12 17:32:01.822840 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.822908 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Dec 12 17:32:01.822985 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Dec 12 17:32:01.823048 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.823117 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Dec 12 17:32:01.823178 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Dec 12 17:32:01.823239 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.823309 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Dec 12 17:32:01.823370 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Dec 12 17:32:01.823430 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.823510 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Dec 12 17:32:01.823572 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Dec 12 17:32:01.823632 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.823700 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Dec 12 17:32:01.823764 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Dec 12 17:32:01.823824 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.823835 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 12 17:32:01.823898 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Dec 12 17:32:01.823974 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Dec 12 17:32:01.824038 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.824100 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Dec 12 17:32:01.824161 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Dec 12 17:32:01.824231 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.824300 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Dec 12 
17:32:01.824362 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Dec 12 17:32:01.824423 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.824492 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Dec 12 17:32:01.824554 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Dec 12 17:32:01.824618 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.824680 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Dec 12 17:32:01.824744 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Dec 12 17:32:01.824808 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.824872 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Dec 12 17:32:01.824933 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Dec 12 17:32:01.825007 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.825071 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Dec 12 17:32:01.825135 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Dec 12 17:32:01.825195 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.825261 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Dec 12 17:32:01.825327 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Dec 12 17:32:01.825388 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.825450 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Dec 12 17:32:01.825514 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Dec 12 17:32:01.825575 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:32:01.825585 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 12 17:32:01.825595 kernel: ACPI: button: Power Button [PWRB] Dec 12 17:32:01.825660 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Dec 12 17:32:01.825731 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 12 17:32:01.825742 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 12 17:32:01.825750 kernel: thunder_xcv, ver 1.0 Dec 12 17:32:01.825758 kernel: thunder_bgx, ver 1.0 Dec 12 17:32:01.825765 kernel: nicpf, ver 1.0 Dec 12 17:32:01.825772 kernel: nicvf, ver 1.0 Dec 12 17:32:01.825853 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 12 17:32:01.825929 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:32:01 UTC (1765560721) Dec 12 17:32:01.825939 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 12 17:32:01.825959 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 12 17:32:01.825967 kernel: watchdog: NMI not fully supported Dec 12 17:32:01.825974 kernel: watchdog: Hard watchdog permanently disabled Dec 12 17:32:01.825981 kernel: NET: Registered PF_INET6 protocol family Dec 12 17:32:01.825989 kernel: Segment Routing with IPv6 Dec 12 17:32:01.825996 kernel: In-situ OAM (IOAM) with 
IPv6 Dec 12 17:32:01.826006 kernel: NET: Registered PF_PACKET protocol family Dec 12 17:32:01.826014 kernel: Key type dns_resolver registered Dec 12 17:32:01.826021 kernel: registered taskstats version 1 Dec 12 17:32:01.826029 kernel: Loading compiled-in X.509 certificates Dec 12 17:32:01.826036 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a' Dec 12 17:32:01.826043 kernel: Demotion targets for Node 0: null Dec 12 17:32:01.826051 kernel: Key type .fscrypt registered Dec 12 17:32:01.826058 kernel: Key type fscrypt-provisioning registered Dec 12 17:32:01.826065 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 12 17:32:01.826073 kernel: ima: Allocated hash algorithm: sha1 Dec 12 17:32:01.826082 kernel: ima: No architecture policies found Dec 12 17:32:01.826089 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 12 17:32:01.826100 kernel: clk: Disabling unused clocks Dec 12 17:32:01.826108 kernel: PM: genpd: Disabling unused power domains Dec 12 17:32:01.826115 kernel: Warning: unable to open an initial console. Dec 12 17:32:01.826123 kernel: Freeing unused kernel memory: 39552K Dec 12 17:32:01.826130 kernel: Run /init as init process Dec 12 17:32:01.826137 kernel: with arguments: Dec 12 17:32:01.826145 kernel: /init Dec 12 17:32:01.826154 kernel: with environment: Dec 12 17:32:01.826161 kernel: HOME=/ Dec 12 17:32:01.826168 kernel: TERM=linux Dec 12 17:32:01.826177 systemd[1]: Successfully made /usr/ read-only. Dec 12 17:32:01.826187 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:32:01.826196 systemd[1]: Detected virtualization kvm. Dec 12 17:32:01.826204 systemd[1]: Detected architecture arm64. Dec 12 17:32:01.826213 systemd[1]: Running in initrd. Dec 12 17:32:01.826220 systemd[1]: No hostname configured, using default hostname. Dec 12 17:32:01.826228 systemd[1]: Hostname set to . Dec 12 17:32:01.826236 systemd[1]: Initializing machine ID from VM UUID. Dec 12 17:32:01.826244 systemd[1]: Queued start job for default target initrd.target. Dec 12 17:32:01.826252 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:32:01.826267 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:32:01.826280 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 12 17:32:01.826289 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 17:32:01.826297 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 12 17:32:01.826307 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 12 17:32:01.826316 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 12 17:32:01.826325 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 12 17:32:01.826333 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Dec 12 17:32:01.826341 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:32:01.826349 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:32:01.826357 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:32:01.826367 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:32:01.826375 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:32:01.826383 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:32:01.826391 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:32:01.826399 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 17:32:01.826407 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 17:32:01.826415 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:32:01.826423 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:32:01.826432 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:32:01.826441 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:32:01.826450 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 17:32:01.826459 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:32:01.826467 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 17:32:01.826476 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 17:32:01.826484 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 17:32:01.826492 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:32:01.826502 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:32:01.826510 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:32:01.826518 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 17:32:01.826527 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:32:01.826535 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 17:32:01.826545 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:32:01.826575 systemd-journald[313]: Collecting audit messages is disabled.
Dec 12 17:32:01.826596 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 17:32:01.826605 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:32:01.826614 kernel: Bridge firewalling registered
Dec 12 17:32:01.826623 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 17:32:01.826631 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:32:01.826639 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:32:01.826648 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:32:01.826656 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:32:01.826665 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:32:01.826675 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:32:01.826684 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:32:01.826692 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 17:32:01.826703 systemd-journald[313]: Journal started
Dec 12 17:32:01.826720 systemd-journald[313]: Runtime Journal (/run/log/journal/a2b400e7537745009e0e9caeba16111b) is 8M, max 319.5M, 311.5M free.
Dec 12 17:32:01.766464 systemd-modules-load[314]: Inserted module 'overlay'
Dec 12 17:32:01.832018 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:32:01.780797 systemd-modules-load[314]: Inserted module 'br_netfilter'
Dec 12 17:32:01.833833 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:32:01.842315 dracut-cmdline[343]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:32:01.851388 systemd-tmpfiles[347]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 17:32:01.854131 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:32:01.857088 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:32:01.897555 systemd-resolved[386]: Positive Trust Anchors:
Dec 12 17:32:01.897575 systemd-resolved[386]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:32:01.897605 systemd-resolved[386]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:32:01.903144 systemd-resolved[386]: Defaulting to hostname 'linux'.
Dec 12 17:32:01.904159 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:32:01.905429 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:32:01.916978 kernel: SCSI subsystem initialized
Dec 12 17:32:01.920956 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 17:32:01.928995 kernel: iscsi: registered transport (tcp)
Dec 12 17:32:01.941995 kernel: iscsi: registered transport (qla4xxx)
Dec 12 17:32:01.942062 kernel: QLogic iSCSI HBA Driver
Dec 12 17:32:01.957171 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:32:01.971754 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:32:01.973747 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:32:02.015351 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:32:02.017443 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 17:32:02.083013 kernel: raid6: neonx8 gen() 15609 MB/s
Dec 12 17:32:02.099989 kernel: raid6: neonx4 gen() 15716 MB/s
Dec 12 17:32:02.116964 kernel: raid6: neonx2 gen() 13116 MB/s
Dec 12 17:32:02.133997 kernel: raid6: neonx1 gen() 10327 MB/s
Dec 12 17:32:02.151048 kernel: raid6: int64x8 gen() 6839 MB/s
Dec 12 17:32:02.168001 kernel: raid6: int64x4 gen() 7290 MB/s
Dec 12 17:32:02.184964 kernel: raid6: int64x2 gen() 6046 MB/s
Dec 12 17:32:02.201978 kernel: raid6: int64x1 gen() 5006 MB/s
Dec 12 17:32:02.202013 kernel: raid6: using algorithm neonx4 gen() 15716 MB/s
Dec 12 17:32:02.218999 kernel: raid6: .... xor() 12275 MB/s, rmw enabled
Dec 12 17:32:02.219048 kernel: raid6: using neon recovery algorithm
Dec 12 17:32:02.224205 kernel: xor: measuring software checksum speed
Dec 12 17:32:02.224236 kernel: 8regs : 21641 MB/sec
Dec 12 17:32:02.225317 kernel: 32regs : 21699 MB/sec
Dec 12 17:32:02.225366 kernel: arm64_neon : 27331 MB/sec
Dec 12 17:32:02.225399 kernel: xor: using function: arm64_neon (27331 MB/sec)
Dec 12 17:32:02.277981 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 17:32:02.284611 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:32:02.287021 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:32:02.314539 systemd-udevd[565]: Using default interface naming scheme 'v255'.
Dec 12 17:32:02.318682 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:32:02.321067 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 17:32:02.345801 dracut-pre-trigger[574]: rd.md=0: removing MD RAID activation
Dec 12 17:32:02.368612 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:32:02.370786 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:32:02.453398 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:32:02.457162 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 12 17:32:02.499979 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Dec 12 17:32:02.502587 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Dec 12 17:32:02.509957 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 12 17:32:02.510003 kernel: GPT:17805311 != 104857599
Dec 12 17:32:02.510014 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 12 17:32:02.510024 kernel: GPT:17805311 != 104857599
Dec 12 17:32:02.510033 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 12 17:32:02.512420 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:32:02.544993 kernel: ACPI: bus type USB registered
Dec 12 17:32:02.545046 kernel: usbcore: registered new interface driver usbfs
Dec 12 17:32:02.546360 kernel: usbcore: registered new interface driver hub
Dec 12 17:32:02.546403 kernel: usbcore: registered new device driver usb
Dec 12 17:32:02.553592 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:32:02.553672 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:32:02.556742 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:32:02.558503 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:32:02.564971 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:32:02.565137 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 12 17:32:02.565225 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 12 17:32:02.573308 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:32:02.573495 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 12 17:32:02.573626 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 12 17:32:02.573724 kernel: hub 1-0:1.0: USB hub found
Dec 12 17:32:02.576114 kernel: hub 1-0:1.0: 4 ports detected
Dec 12 17:32:02.577970 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 12 17:32:02.579972 kernel: hub 2-0:1.0: USB hub found
Dec 12 17:32:02.580983 kernel: hub 2-0:1.0: 4 ports detected
Dec 12 17:32:02.594012 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Dec 12 17:32:02.595192 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:32:02.608854 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 17:32:02.610128 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:32:02.617047 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Dec 12 17:32:02.618033 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Dec 12 17:32:02.631847 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Dec 12 17:32:02.632941 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:32:02.634794 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:32:02.636484 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:32:02.638802 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 12 17:32:02.640452 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 12 17:32:02.659437 disk-uuid[663]: Primary Header is updated.
Dec 12 17:32:02.659437 disk-uuid[663]: Secondary Entries is updated.
Dec 12 17:32:02.659437 disk-uuid[663]: Secondary Header is updated.
Dec 12 17:32:02.663063 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:32:02.666973 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:32:02.817976 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 12 17:32:02.948387 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Dec 12 17:32:02.948498 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 12 17:32:02.949410 kernel: usbcore: registered new interface driver usbhid
Dec 12 17:32:02.949426 kernel: usbhid: USB HID core driver
Dec 12 17:32:03.054975 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Dec 12 17:32:03.179983 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Dec 12 17:32:03.231962 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Dec 12 17:32:03.676968 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:32:03.677467 disk-uuid[666]: The operation has completed successfully.
Dec 12 17:32:03.723504 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 12 17:32:03.723620 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 12 17:32:03.744848 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Dec 12 17:32:03.758467 sh[685]: Success
Dec 12 17:32:03.772492 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 12 17:32:03.772526 kernel: device-mapper: uevent: version 1.0.3
Dec 12 17:32:03.772537 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 12 17:32:03.778978 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 12 17:32:03.831383 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Dec 12 17:32:03.833859 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Dec 12 17:32:03.855465 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Dec 12 17:32:03.871994 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (697)
Dec 12 17:32:03.874602 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248
Dec 12 17:32:03.874622 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:32:03.896994 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 12 17:32:03.897041 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 12 17:32:03.900337 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Dec 12 17:32:03.901492 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:32:03.902537 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 12 17:32:03.903321 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 12 17:32:03.905916 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 12 17:32:03.946979 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (730) Dec 12 17:32:03.949689 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:32:03.949725 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:32:03.955463 kernel: BTRFS info (device vda6): turning on async discard Dec 12 17:32:03.955505 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 17:32:03.959964 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:32:03.960854 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 12 17:32:03.963067 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 12 17:32:04.005056 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:32:04.010625 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:32:04.050358 systemd-networkd[868]: lo: Link UP Dec 12 17:32:04.050372 systemd-networkd[868]: lo: Gained carrier Dec 12 17:32:04.051342 systemd-networkd[868]: Enumeration completed Dec 12 17:32:04.051705 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:32:04.051761 systemd-networkd[868]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:32:04.051765 systemd-networkd[868]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:32:04.052438 systemd-networkd[868]: eth0: Link UP Dec 12 17:32:04.052523 systemd-networkd[868]: eth0: Gained carrier Dec 12 17:32:04.052531 systemd-networkd[868]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:32:04.052798 systemd[1]: Reached target network.target - Network. Dec 12 17:32:04.071056 systemd-networkd[868]: eth0: DHCPv4 address 10.0.8.78/25, gateway 10.0.8.1 acquired from 10.0.8.1 Dec 12 17:32:04.114341 ignition[805]: Ignition 2.22.0 Dec 12 17:32:04.114355 ignition[805]: Stage: fetch-offline Dec 12 17:32:04.114384 ignition[805]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:04.114392 ignition[805]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:04.116470 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:32:04.114472 ignition[805]: parsed url from cmdline: "" Dec 12 17:32:04.114474 ignition[805]: no config URL provided Dec 12 17:32:04.114479 ignition[805]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 17:32:04.119538 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 12 17:32:04.114485 ignition[805]: no config at "/usr/lib/ignition/user.ign" Dec 12 17:32:04.114489 ignition[805]: failed to fetch config: resource requires networking Dec 12 17:32:04.114629 ignition[805]: Ignition finished successfully Dec 12 17:32:04.155383 ignition[883]: Ignition 2.22.0 Dec 12 17:32:04.155400 ignition[883]: Stage: fetch Dec 12 17:32:04.155545 ignition[883]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:04.155554 ignition[883]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:04.155639 ignition[883]: parsed url from cmdline: "" Dec 12 17:32:04.155642 ignition[883]: no config URL provided Dec 12 17:32:04.155646 ignition[883]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 17:32:04.155653 ignition[883]: no config at "/usr/lib/ignition/user.ign" Dec 12 17:32:04.155751 ignition[883]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 12 17:32:04.156072 ignition[883]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 12 17:32:04.156159 ignition[883]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 12 17:32:04.489639 ignition[883]: GET result: OK Dec 12 17:32:04.490010 ignition[883]: parsing config with SHA512: dd0f399fc28202c23ac6f4cf051d6397a38558f596e657c22ff40ce451ab92dbc7167f184ae9461ab8a4723c78b7fe4a19a01f86bc74e859904ebcb9a561ac50 Dec 12 17:32:04.494802 unknown[883]: fetched base config from "system" Dec 12 17:32:04.495651 unknown[883]: fetched base config from "system" Dec 12 17:32:04.496014 ignition[883]: fetch: fetch complete Dec 12 17:32:04.495660 unknown[883]: fetched user config from "openstack" Dec 12 17:32:04.496019 ignition[883]: fetch: fetch passed Dec 12 17:32:04.498054 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 12 17:32:04.496065 ignition[883]: Ignition finished successfully Dec 12 17:32:04.500063 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 12 17:32:04.532992 ignition[891]: Ignition 2.22.0 Dec 12 17:32:04.533002 ignition[891]: Stage: kargs Dec 12 17:32:04.533128 ignition[891]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:04.533137 ignition[891]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:04.536282 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 12 17:32:04.533814 ignition[891]: kargs: kargs passed Dec 12 17:32:04.533869 ignition[891]: Ignition finished successfully Dec 12 17:32:04.540581 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 12 17:32:04.572834 ignition[899]: Ignition 2.22.0 Dec 12 17:32:04.572849 ignition[899]: Stage: disks Dec 12 17:32:04.573005 ignition[899]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:04.573014 ignition[899]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:04.573702 ignition[899]: disks: disks passed Dec 12 17:32:04.575974 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 12 17:32:04.573741 ignition[899]: Ignition finished successfully Dec 12 17:32:04.577698 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 12 17:32:04.578844 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 12 17:32:04.580677 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:32:04.581853 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 17:32:04.583377 systemd[1]: Reached target basic.target - Basic System. 
Dec 12 17:32:04.585821 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 12 17:32:04.624397 systemd-fsck[908]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 12 17:32:04.627419 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 12 17:32:04.630407 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 12 17:32:04.750974 kernel: EXT4-fs (vda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none. Dec 12 17:32:04.751357 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 12 17:32:04.752432 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 12 17:32:04.756835 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 17:32:04.758732 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 12 17:32:04.759619 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 12 17:32:04.760195 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 12 17:32:04.762285 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 12 17:32:04.762312 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 17:32:04.772654 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 12 17:32:04.774520 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 12 17:32:04.786993 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (916) Dec 12 17:32:04.789978 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:32:04.790023 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:32:04.794859 kernel: BTRFS info (device vda6): turning on async discard Dec 12 17:32:04.794907 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 17:32:04.797338 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 17:32:04.825987 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:04.831499 initrd-setup-root[944]: cut: /sysroot/etc/passwd: No such file or directory Dec 12 17:32:04.836675 initrd-setup-root[951]: cut: /sysroot/etc/group: No such file or directory Dec 12 17:32:04.841572 initrd-setup-root[958]: cut: /sysroot/etc/shadow: No such file or directory Dec 12 17:32:04.846651 initrd-setup-root[965]: cut: /sysroot/etc/gshadow: No such file or directory Dec 12 17:32:04.932096 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 12 17:32:04.934301 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 12 17:32:04.937195 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 12 17:32:04.951677 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 12 17:32:04.953454 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:32:04.971985 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Dec 12 17:32:04.985991 ignition[1033]: INFO : Ignition 2.22.0 Dec 12 17:32:04.985991 ignition[1033]: INFO : Stage: mount Dec 12 17:32:04.987342 ignition[1033]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:04.987342 ignition[1033]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:04.987342 ignition[1033]: INFO : mount: mount passed Dec 12 17:32:04.987342 ignition[1033]: INFO : Ignition finished successfully Dec 12 17:32:04.989656 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 12 17:32:05.254145 systemd-networkd[868]: eth0: Gained IPv6LL Dec 12 17:32:05.864019 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:07.870020 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:11.879047 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:11.881727 coreos-metadata[918]: Dec 12 17:32:11.881 WARN failed to locate config-drive, using the metadata service API instead Dec 12 17:32:11.898554 coreos-metadata[918]: Dec 12 17:32:11.898 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 12 17:32:12.028750 coreos-metadata[918]: Dec 12 17:32:12.028 INFO Fetch successful Dec 12 17:32:12.029726 coreos-metadata[918]: Dec 12 17:32:12.029 INFO wrote hostname ci-4459-2-2-3-c846c80ac0 to /sysroot/etc/hostname Dec 12 17:32:12.031723 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 12 17:32:12.032905 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 12 17:32:12.036260 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 12 17:32:12.053788 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 17:32:12.079986 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1052) Dec 12 17:32:12.082390 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:32:12.082421 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:32:12.086977 kernel: BTRFS info (device vda6): turning on async discard Dec 12 17:32:12.087021 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 17:32:12.087904 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 12 17:32:12.121543 ignition[1070]: INFO : Ignition 2.22.0 Dec 12 17:32:12.121543 ignition[1070]: INFO : Stage: files Dec 12 17:32:12.122983 ignition[1070]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:12.122983 ignition[1070]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:12.122983 ignition[1070]: DEBUG : files: compiled without relabeling support, skipping Dec 12 17:32:12.125778 ignition[1070]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 12 17:32:12.125778 ignition[1070]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 12 17:32:12.128231 ignition[1070]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 12 17:32:12.128231 ignition[1070]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 12 17:32:12.128231 ignition[1070]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 12 17:32:12.126622 unknown[1070]: wrote ssh authorized keys file for user: core Dec 12 17:32:12.132026 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 12 17:32:12.132026 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 12 17:32:12.191662 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 12 17:32:12.302277 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 12 17:32:12.302277 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 17:32:12.305583 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 12 17:32:12.318927 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 12 17:32:12.318927 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 12 17:32:12.318927 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Dec 12 17:32:12.596812 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 12 17:32:13.150764 ignition[1070]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 12 17:32:13.150764 ignition[1070]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 12 17:32:13.154957 ignition[1070]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 17:32:13.157382 ignition[1070]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 17:32:13.157382 ignition[1070]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 12 17:32:13.160902 ignition[1070]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 12 17:32:13.160902 ignition[1070]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 12 17:32:13.160902 ignition[1070]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 12 17:32:13.160902 ignition[1070]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 12 17:32:13.160902 ignition[1070]: INFO : files: files passed Dec 12 17:32:13.160902 ignition[1070]: INFO : Ignition finished successfully Dec 12 17:32:13.160798 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 12 17:32:13.162565 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 12 17:32:13.165152 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 12 17:32:13.173664 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 12 17:32:13.173754 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 12 17:32:13.179299 initrd-setup-root-after-ignition[1101]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:32:13.179299 initrd-setup-root-after-ignition[1101]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:32:13.182186 initrd-setup-root-after-ignition[1105]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:32:13.181633 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 17:32:13.183470 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 12 17:32:13.185736 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 12 17:32:13.215925 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 12 17:32:13.216105 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Dec 12 17:32:13.217803 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 12 17:32:13.219254 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 12 17:32:13.220716 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 12 17:32:13.221533 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 12 17:32:13.235450 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 17:32:13.237640 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 12 17:32:13.258104 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:32:13.259086 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:32:13.260712 systemd[1]: Stopped target timers.target - Timer Units. Dec 12 17:32:13.262118 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 12 17:32:13.262236 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 17:32:13.264210 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 12 17:32:13.265759 systemd[1]: Stopped target basic.target - Basic System. Dec 12 17:32:13.267534 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 12 17:32:13.268820 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 17:32:13.270393 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 12 17:32:13.272054 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 12 17:32:13.273938 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 12 17:32:13.275454 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 17:32:13.276911 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 12 17:32:13.278549 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 12 17:32:13.279855 systemd[1]: Stopped target swap.target - Swaps. Dec 12 17:32:13.281298 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 12 17:32:13.281414 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 12 17:32:13.283305 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:32:13.284840 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:32:13.286438 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 12 17:32:13.287987 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:32:13.290048 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 12 17:32:13.290354 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 12 17:32:13.292394 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 12 17:32:13.292523 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 17:32:13.294231 systemd[1]: ignition-files.service: Deactivated successfully. Dec 12 17:32:13.294326 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 12 17:32:13.296601 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 12 17:32:13.297642 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Dec 12 17:32:13.297770 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:32:13.300021 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 12 17:32:13.301459 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 12 17:32:13.301594 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 17:32:13.302967 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 12 17:32:13.303074 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 17:32:13.308214 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 17:32:13.309436 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 17:32:13.317975 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 12 17:32:13.325930 ignition[1127]: INFO : Ignition 2.22.0 Dec 12 17:32:13.325930 ignition[1127]: INFO : Stage: umount Dec 12 17:32:13.327301 ignition[1127]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:32:13.327301 ignition[1127]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 17:32:13.327301 ignition[1127]: INFO : umount: umount passed Dec 12 17:32:13.327301 ignition[1127]: INFO : Ignition finished successfully Dec 12 17:32:13.333156 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 12 17:32:13.334045 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 12 17:32:13.336493 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 12 17:32:13.336541 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 17:32:13.337675 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 17:32:13.337714 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 17:32:13.339161 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 12 17:32:13.339195 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 12 17:32:13.340518 systemd[1]: Stopped target network.target - Network. Dec 12 17:32:13.341971 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 12 17:32:13.342018 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:32:13.343706 systemd[1]: Stopped target paths.target - Path Units. Dec 12 17:32:13.344902 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 17:32:13.346087 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:32:13.347211 systemd[1]: Stopped target slices.target - Slice Units. Dec 12 17:32:13.348693 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 17:32:13.350177 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 17:32:13.350211 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 17:32:13.353312 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 17:32:13.353381 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 17:32:13.354800 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 17:32:13.354851 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 12 17:32:13.356790 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 17:32:13.356828 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 17:32:13.358228 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Dec 12 17:32:13.359595 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 17:32:13.361098 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 17:32:13.361180 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 17:32:13.362643 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 17:32:13.362722 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 17:32:13.367655 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 17:32:13.367753 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 17:32:13.371029 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 12 17:32:13.371207 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 17:32:13.371288 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 17:32:13.373547 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 17:32:13.374123 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 17:32:13.375316 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 17:32:13.375363 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:32:13.377639 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 17:32:13.378484 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 17:32:13.378534 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:32:13.380054 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 17:32:13.380090 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:32:13.382391 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 12 17:32:13.382429 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 17:32:13.383922 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 17:32:13.384569 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:32:13.386367 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:32:13.388430 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 17:32:13.388479 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:32:13.395528 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 17:32:13.395646 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:32:13.396815 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 17:32:13.396846 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 17:32:13.398364 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 12 17:32:13.398394 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:32:13.400199 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 17:32:13.400240 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 17:32:13.403192 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 17:32:13.403229 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 17:32:13.405585 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Dec 12 17:32:13.405628 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 17:32:13.408837 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 12 17:32:13.410274 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 12 17:32:13.410331 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:32:13.412742 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 17:32:13.412780 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:32:13.415338 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:32:13.415378 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:32:13.418884 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Dec 12 17:32:13.418930 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Dec 12 17:32:13.419018 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:32:13.419295 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 17:32:13.419386 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 17:32:13.424379 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 12 17:32:13.424472 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 17:32:13.426313 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 17:32:13.428529 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 17:32:13.463239 systemd[1]: Switching root. Dec 12 17:32:13.500719 systemd-journald[313]: Journal stopped Dec 12 17:32:14.289603 systemd-journald[313]: Received SIGTERM from PID 1 (systemd). Dec 12 17:32:14.289672 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 17:32:14.289688 kernel: SELinux: policy capability open_perms=1 Dec 12 17:32:14.289700 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 17:32:14.289712 kernel: SELinux: policy capability always_check_network=0 Dec 12 17:32:14.289725 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 17:32:14.289739 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 17:32:14.289748 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 17:32:14.289760 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 17:32:14.289769 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 17:32:14.289792 kernel: audit: type=1403 audit(1765560733.616:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 17:32:14.289813 systemd[1]: Successfully loaded SELinux policy in 60.056ms. Dec 12 17:32:14.289839 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.728ms. Dec 12 17:32:14.289852 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:32:14.289866 systemd[1]: Detected virtualization kvm. Dec 12 17:32:14.289876 systemd[1]: Detected architecture arm64. Dec 12 17:32:14.289885 systemd[1]: Detected first boot. 
Dec 12 17:32:14.289895 systemd[1]: Hostname set to <ci-4459-2-2-3-c846c80ac0>. Dec 12 17:32:14.289909 systemd[1]: Initializing machine ID from VM UUID. Dec 12 17:32:14.289923 zram_generator::config[1177]: No configuration found. Dec 12 17:32:14.289934 kernel: NET: Registered PF_VSOCK protocol family Dec 12 17:32:14.290063 systemd[1]: Populated /etc with preset unit settings. Dec 12 17:32:14.290079 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 17:32:14.290090 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 17:32:14.290100 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 17:32:14.290114 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 17:32:14.290124 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 17:32:14.290135 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 17:32:14.290145 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 17:32:14.290155 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 17:32:14.290167 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 17:32:14.290177 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 12 17:32:14.290191 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 17:32:14.290202 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 17:32:14.290212 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:32:14.290222 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:32:14.290236 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 17:32:14.290246 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 17:32:14.290258 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 17:32:14.290271 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 17:32:14.290284 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 12 17:32:14.290295 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:32:14.290305 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:32:14.290315 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 12 17:32:14.290326 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 17:32:14.290338 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 17:32:14.290348 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 17:32:14.290359 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:32:14.290372 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 17:32:14.290382 systemd[1]: Reached target slices.target - Slice Units. Dec 12 17:32:14.290393 systemd[1]: Reached target swap.target - Swaps. Dec 12 17:32:14.290403 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 17:32:14.290414 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 12 17:32:14.290425 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 12 17:32:14.290436 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:32:14.290447 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 17:32:14.290457 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:32:14.290468 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 17:32:14.290478 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 17:32:14.290488 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 17:32:14.290499 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 17:32:14.290510 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 17:32:14.290520 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 17:32:14.290547 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 17:32:14.290558 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 17:32:14.290569 systemd[1]: Reached target machines.target - Containers. Dec 12 17:32:14.290580 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 17:32:14.290590 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:32:14.290600 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 17:32:14.290610 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 17:32:14.290620 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:32:14.290632 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:32:14.290643 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:32:14.290653 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 17:32:14.290663 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:32:14.290674 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 12 17:32:14.290684 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 17:32:14.290695 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 17:32:14.290707 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 17:32:14.290717 systemd[1]: Stopped systemd-fsck-usr.service. Dec 12 17:32:14.290728 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:32:14.290738 kernel: loop: module loaded Dec 12 17:32:14.290748 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 17:32:14.290758 kernel: fuse: init (API version 7.41) Dec 12 17:32:14.290769 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Dec 12 17:32:14.290779 kernel: ACPI: bus type drm_connector registered Dec 12 17:32:14.290788 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 17:32:14.290799 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 12 17:32:14.290812 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 12 17:32:14.290823 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 17:32:14.290834 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 17:32:14.290844 systemd[1]: Stopped verity-setup.service. Dec 12 17:32:14.290877 systemd-journald[1245]: Collecting audit messages is disabled. Dec 12 17:32:14.290901 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 17:32:14.290912 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 17:32:14.290923 systemd-journald[1245]: Journal started Dec 12 17:32:14.290959 systemd-journald[1245]: Runtime Journal (/run/log/journal/a2b400e7537745009e0e9caeba16111b) is 8M, max 319.5M, 311.5M free. Dec 12 17:32:14.080679 systemd[1]: Queued start job for default target multi-user.target. Dec 12 17:32:14.101254 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 12 17:32:14.101635 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 12 17:32:14.294450 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 17:32:14.295071 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 17:32:14.296322 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 17:32:14.297426 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 17:32:14.298464 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 17:32:14.304035 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:32:14.305455 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 17:32:14.306691 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 12 17:32:14.307997 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 17:32:14.309143 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:32:14.309307 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:32:14.310471 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:32:14.310634 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:32:14.311744 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:32:14.311895 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:32:14.313340 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 17:32:14.313500 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 17:32:14.314665 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:32:14.314850 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:32:14.316259 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 17:32:14.317435 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:32:14.319058 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. 
Dec 12 17:32:14.320298 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 17:32:14.332378 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 17:32:14.334524 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 17:32:14.336510 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 12 17:32:14.337445 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 17:32:14.337486 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:32:14.339271 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 17:32:14.347130 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 17:32:14.348054 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:32:14.349262 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 12 17:32:14.351297 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 17:32:14.352269 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:32:14.353245 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 17:32:14.356165 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:32:14.358182 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 17:32:14.361268 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 17:32:14.364138 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 17:32:14.366597 systemd-journald[1245]: Time spent on flushing to /var/log/journal/a2b400e7537745009e0e9caeba16111b is 28.979ms for 1684 entries. Dec 12 17:32:14.366597 systemd-journald[1245]: System Journal (/var/log/journal/a2b400e7537745009e0e9caeba16111b) is 8M, max 584.8M, 576.8M free. Dec 12 17:32:14.418525 systemd-journald[1245]: Received client request to flush runtime journal. Dec 12 17:32:14.418582 kernel: loop0: detected capacity change from 0 to 211168 Dec 12 17:32:14.368987 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 17:32:14.369996 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 17:32:14.379806 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 12 17:32:14.381107 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 17:32:14.383707 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 12 17:32:14.390304 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 17:32:14.395332 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:32:14.420859 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 17:32:14.423877 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 17:32:14.430921 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Dec 12 17:32:14.440331 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 17:32:14.442610 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 17:32:14.447974 kernel: loop1: detected capacity change from 0 to 1632 Dec 12 17:32:14.469579 systemd-tmpfiles[1313]: ACLs are not supported, ignoring. Dec 12 17:32:14.469601 systemd-tmpfiles[1313]: ACLs are not supported, ignoring. Dec 12 17:32:14.472719 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:32:14.473814 kernel: loop2: detected capacity change from 0 to 119840 Dec 12 17:32:14.517000 kernel: loop3: detected capacity change from 0 to 100632 Dec 12 17:32:14.566995 kernel: loop4: detected capacity change from 0 to 211168 Dec 12 17:32:14.586977 kernel: loop5: detected capacity change from 0 to 1632 Dec 12 17:32:14.594013 kernel: loop6: detected capacity change from 0 to 119840 Dec 12 17:32:14.610978 kernel: loop7: detected capacity change from 0 to 100632 Dec 12 17:32:14.626209 (sd-merge)[1321]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Dec 12 17:32:14.626640 (sd-merge)[1321]: Merged extensions into '/usr'. Dec 12 17:32:14.630692 systemd[1]: Reload requested from client PID 1296 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 17:32:14.630712 systemd[1]: Reloading... Dec 12 17:32:14.687993 zram_generator::config[1347]: No configuration found. Dec 12 17:32:14.847910 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 17:32:14.848165 systemd[1]: Reloading finished in 216 ms. Dec 12 17:32:14.851036 ldconfig[1291]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 12 17:32:14.882859 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 12 17:32:14.884229 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 12 17:32:14.886747 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 17:32:14.905644 systemd[1]: Starting ensure-sysext.service... Dec 12 17:32:14.907480 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 17:32:14.910129 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:32:14.917215 systemd[1]: Reload requested from client PID 1385 ('systemctl') (unit ensure-sysext.service)... Dec 12 17:32:14.917233 systemd[1]: Reloading... Dec 12 17:32:14.924099 systemd-tmpfiles[1386]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 12 17:32:14.924130 systemd-tmpfiles[1386]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 17:32:14.924389 systemd-tmpfiles[1386]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 12 17:32:14.924622 systemd-tmpfiles[1386]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 12 17:32:14.925284 systemd-tmpfiles[1386]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 12 17:32:14.925493 systemd-tmpfiles[1386]: ACLs are not supported, ignoring. Dec 12 17:32:14.925540 systemd-tmpfiles[1386]: ACLs are not supported, ignoring. Dec 12 17:32:14.930617 systemd-tmpfiles[1386]: Detected autofs mount point /boot during canonicalization of boot. 
Dec 12 17:32:14.930630 systemd-tmpfiles[1386]: Skipping /boot Dec 12 17:32:14.938332 systemd-udevd[1387]: Using default interface naming scheme 'v255'. Dec 12 17:32:14.938433 systemd-tmpfiles[1386]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:32:14.938439 systemd-tmpfiles[1386]: Skipping /boot Dec 12 17:32:14.975412 zram_generator::config[1415]: No configuration found. Dec 12 17:32:15.094984 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 17:32:15.156387 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Dec 12 17:32:15.156477 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 12 17:32:15.156493 kernel: [drm] features: -context_init Dec 12 17:32:15.160258 kernel: [drm] number of scanouts: 1 Dec 12 17:32:15.160325 kernel: [drm] number of cap sets: 0 Dec 12 17:32:15.161969 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Dec 12 17:32:15.168999 kernel: Console: switching to colour frame buffer device 160x50 Dec 12 17:32:15.173825 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 12 17:32:15.212974 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 12 17:32:15.216203 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 12 17:32:15.216520 systemd[1]: Reloading finished in 299 ms. Dec 12 17:32:15.233630 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:32:15.239350 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:32:15.261074 systemd[1]: Finished ensure-sysext.service. Dec 12 17:32:15.277671 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:32:15.280134 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 17:32:15.281075 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:32:15.295983 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:32:15.297685 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:32:15.299393 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:32:15.301545 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:32:15.305596 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 12 17:32:15.307280 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:32:15.308610 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 17:32:15.309999 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:32:15.311285 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 12 17:32:15.313851 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 12 17:32:15.313906 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 12 17:32:15.316964 kernel: PTP clock support registered Dec 12 17:32:15.317092 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Dec 12 17:32:15.319917 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 17:32:15.321170 systemd[1]: Reached target time-set.target - System Time Set. Dec 12 17:32:15.323305 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 12 17:32:15.326060 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:32:15.327902 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:32:15.328173 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:32:15.329466 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:32:15.331189 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:32:15.332629 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:32:15.340690 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:32:15.342841 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:32:15.343279 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:32:15.346445 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 12 17:32:15.346724 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 12 17:32:15.348288 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 12 17:32:15.353981 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 12 17:32:15.360094 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:32:15.360230 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:32:15.361519 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 12 17:32:15.363500 augenrules[1552]: No rules Dec 12 17:32:15.366973 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 12 17:32:15.368297 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:32:15.376187 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:32:15.378126 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 12 17:32:15.385664 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 12 17:32:15.403858 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:32:15.405238 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 12 17:32:15.408756 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 12 17:32:15.410803 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 12 17:32:15.452142 systemd-networkd[1525]: lo: Link UP Dec 12 17:32:15.452151 systemd-networkd[1525]: lo: Gained carrier Dec 12 17:32:15.453163 systemd-networkd[1525]: Enumeration completed Dec 12 17:32:15.453316 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:32:15.453588 systemd-networkd[1525]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Dec 12 17:32:15.453591 systemd-networkd[1525]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:32:15.454106 systemd-networkd[1525]: eth0: Link UP Dec 12 17:32:15.454196 systemd-networkd[1525]: eth0: Gained carrier Dec 12 17:32:15.454211 systemd-networkd[1525]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:32:15.455184 systemd-resolved[1526]: Positive Trust Anchors: Dec 12 17:32:15.455201 systemd-resolved[1526]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 17:32:15.455236 systemd-resolved[1526]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 17:32:15.456310 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 17:32:15.458252 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 17:32:15.460609 systemd-resolved[1526]: Using system hostname 'ci-4459-2-2-3-c846c80ac0'. Dec 12 17:32:15.462038 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 17:32:15.462971 systemd[1]: Reached target network.target - Network. Dec 12 17:32:15.463640 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:32:15.464560 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 17:32:15.465448 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 12 17:32:15.466537 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 12 17:32:15.467858 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 12 17:32:15.469057 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 12 17:32:15.470121 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 12 17:32:15.470285 systemd-networkd[1525]: eth0: DHCPv4 address 10.0.8.78/25, gateway 10.0.8.1 acquired from 10.0.8.1 Dec 12 17:32:15.471099 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 12 17:32:15.471128 systemd[1]: Reached target paths.target - Path Units. Dec 12 17:32:15.471821 systemd[1]: Reached target timers.target - Timer Units. Dec 12 17:32:15.473439 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 12 17:32:15.475695 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 12 17:32:15.478142 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 12 17:32:15.479232 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 12 17:32:15.480186 systemd[1]: Reached target ssh-access.target - SSH Access Available. 
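The DHCP line above hands eth0 10.0.8.78/25 with gateway 10.0.8.1. As a quick sanity check of what a /25 implies, a sketch with the standard-library ipaddress module (values copied from the lease line):

    import ipaddress

    iface = ipaddress.ip_interface("10.0.8.78/25")     # address from the DHCP lease
    net = iface.network
    print(net)                                         # 10.0.8.0/25
    print(net.broadcast_address)                       # 10.0.8.127
    print(net.num_addresses - 2)                       # 126 usable hosts
    print(ipaddress.ip_address("10.0.8.1") in net)     # True: gateway is on-link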
Dec 12 17:32:15.482966 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 12 17:32:15.484081 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 12 17:32:15.486995 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 17:32:15.488134 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 12 17:32:15.489555 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 17:32:15.490399 systemd[1]: Reached target basic.target - Basic System. Dec 12 17:32:15.491181 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 12 17:32:15.491216 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 12 17:32:15.494121 systemd[1]: Starting chronyd.service - NTP client/server... Dec 12 17:32:15.495719 systemd[1]: Starting containerd.service - containerd container runtime... Dec 12 17:32:15.497717 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 12 17:32:15.508120 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 12 17:32:15.508976 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:15.511171 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 12 17:32:15.513085 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 12 17:32:15.514959 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 12 17:32:15.515791 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 12 17:32:15.516836 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 12 17:32:15.529520 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 12 17:32:15.531706 jq[1588]: false Dec 12 17:32:15.533023 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 12 17:32:15.535179 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 12 17:32:15.538908 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 12 17:32:15.542221 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 12 17:32:15.542818 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 12 17:32:15.544189 systemd[1]: Starting update-engine.service - Update Engine... Dec 12 17:32:15.546929 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 12 17:32:15.552339 chronyd[1581]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 12 17:32:15.553470 chronyd[1581]: Loaded seccomp filter (level 2) Dec 12 17:32:15.553840 extend-filesystems[1589]: Found /dev/vda6 Dec 12 17:32:15.555227 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 12 17:32:15.556861 systemd[1]: Started chronyd.service - NTP client/server. Dec 12 17:32:15.558306 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Dec 12 17:32:15.558628 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 12 17:32:15.558918 systemd[1]: motdgen.service: Deactivated successfully. Dec 12 17:32:15.559170 jq[1606]: true Dec 12 17:32:15.559281 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 12 17:32:15.560135 extend-filesystems[1589]: Found /dev/vda9 Dec 12 17:32:15.561501 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 12 17:32:15.561708 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 12 17:32:15.564414 extend-filesystems[1589]: Checking size of /dev/vda9 Dec 12 17:32:15.577021 jq[1613]: true Dec 12 17:32:15.581573 (ntainerd)[1621]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 12 17:32:15.581820 update_engine[1601]: I20251212 17:32:15.580729 1601 main.cc:92] Flatcar Update Engine starting Dec 12 17:32:15.585995 extend-filesystems[1589]: Resized partition /dev/vda9 Dec 12 17:32:15.588836 tar[1612]: linux-arm64/LICENSE Dec 12 17:32:15.588836 tar[1612]: linux-arm64/helm Dec 12 17:32:15.593623 extend-filesystems[1630]: resize2fs 1.47.3 (8-Jul-2025) Dec 12 17:32:15.603641 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 12 17:32:15.635919 dbus-daemon[1584]: [system] SELinux support is enabled Dec 12 17:32:15.636694 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 12 17:32:15.640993 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 12 17:32:15.641030 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 12 17:32:15.642588 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 12 17:32:15.642613 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 12 17:32:15.645992 systemd[1]: Started update-engine.service - Update Engine. Dec 12 17:32:15.646749 update_engine[1601]: I20251212 17:32:15.646288 1601 update_check_scheduler.cc:74] Next update check in 2m58s Dec 12 17:32:15.652275 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 12 17:32:15.659844 systemd-logind[1599]: New seat seat0. Dec 12 17:32:15.708684 locksmithd[1646]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 12 17:32:15.714221 systemd-logind[1599]: Watching system buttons on /dev/input/event0 (Power Button) Dec 12 17:32:15.714237 systemd-logind[1599]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 12 17:32:15.714497 systemd[1]: Started systemd-logind.service - User Login Management. Dec 12 17:32:15.746597 bash[1647]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:32:15.748922 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 12 17:32:15.754552 systemd[1]: Starting sshkeys.service... 
Dec 12 17:32:15.763537 containerd[1621]: time="2025-12-12T17:32:15Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 12 17:32:15.764067 containerd[1621]: time="2025-12-12T17:32:15.763879240Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 12 17:32:15.775927 containerd[1621]: time="2025-12-12T17:32:15.775874320Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11µs" Dec 12 17:32:15.775927 containerd[1621]: time="2025-12-12T17:32:15.775914920Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 12 17:32:15.776034 containerd[1621]: time="2025-12-12T17:32:15.775933320Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776094680Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776119280Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776143880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776208880Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776221840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776431480Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776445320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776455720Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776463480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776526880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 12 17:32:15.776834 containerd[1621]: time="2025-12-12T17:32:15.776703800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:32:15.777930 containerd[1621]: time="2025-12-12T17:32:15.776729360Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Dec 12 17:32:15.777930 containerd[1621]: time="2025-12-12T17:32:15.776741160Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 12 17:32:15.777930 containerd[1621]: time="2025-12-12T17:32:15.776766560Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 12 17:32:15.780821 containerd[1621]: time="2025-12-12T17:32:15.780152320Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 12 17:32:15.780821 containerd[1621]: time="2025-12-12T17:32:15.780274440Z" level=info msg="metadata content store policy set" policy=shared Dec 12 17:32:15.781151 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 12 17:32:15.783967 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 12 17:32:15.800974 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814445440Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814538120Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814561640Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814574760Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814587040Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814597640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814611160Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814626360Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814639040Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814649080Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814658800Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 12 17:32:15.814606 containerd[1621]: time="2025-12-12T17:32:15.814671960Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814826520Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814848240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers 
type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814862760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814874720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814885760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814896520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814907840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814917760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814928600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814938840Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.814971960Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.815189360Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.815205120Z" level=info msg="Start snapshots syncer" Dec 12 17:32:15.817228 containerd[1621]: time="2025-12-12T17:32:15.815231040Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 12 17:32:15.819385 containerd[1621]: time="2025-12-12T17:32:15.815515480Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 12 17:32:15.819385 containerd[1621]: time="2025-12-12T17:32:15.815569920Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815630400Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815735040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815759440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815770160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815781600Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815793400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815827080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815839680Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815862760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: 
time="2025-12-12T17:32:15.815876680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815899840Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.815931320Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.816042760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:32:15.819626 containerd[1621]: time="2025-12-12T17:32:15.816058480Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816069920Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816078040Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816087520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816098400Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816237920Z" level=info msg="runtime interface created" Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816249160Z" level=info msg="created NRI interface" Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816258360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816271840Z" level=info msg="Connect containerd service" Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.816294080Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 12 17:32:15.819848 containerd[1621]: time="2025-12-12T17:32:15.817039800Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:32:15.904889 containerd[1621]: time="2025-12-12T17:32:15.904786240Z" level=info msg="Start subscribing containerd event" Dec 12 17:32:15.904889 containerd[1621]: time="2025-12-12T17:32:15.904898280Z" level=info msg="Start recovering state" Dec 12 17:32:15.905063 containerd[1621]: time="2025-12-12T17:32:15.905037840Z" level=info msg="Start event monitor" Dec 12 17:32:15.905094 containerd[1621]: time="2025-12-12T17:32:15.905068080Z" level=info msg="Start cni network conf syncer for default" Dec 12 17:32:15.905094 containerd[1621]: time="2025-12-12T17:32:15.905080040Z" level=info msg="Start streaming server" Dec 12 17:32:15.905094 containerd[1621]: time="2025-12-12T17:32:15.905088680Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 17:32:15.905145 containerd[1621]: 
time="2025-12-12T17:32:15.905095440Z" level=info msg="runtime interface starting up..." Dec 12 17:32:15.905145 containerd[1621]: time="2025-12-12T17:32:15.905101000Z" level=info msg="starting plugins..." Dec 12 17:32:15.905145 containerd[1621]: time="2025-12-12T17:32:15.905115120Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 17:32:15.905145 containerd[1621]: time="2025-12-12T17:32:15.905130400Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 17:32:15.907100 containerd[1621]: time="2025-12-12T17:32:15.905175240Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 12 17:32:15.907100 containerd[1621]: time="2025-12-12T17:32:15.905226680Z" level=info msg="containerd successfully booted in 0.142312s" Dec 12 17:32:15.905327 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 17:32:15.930981 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 12 17:32:15.952869 extend-filesystems[1630]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 12 17:32:15.952869 extend-filesystems[1630]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 12 17:32:15.952869 extend-filesystems[1630]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 12 17:32:15.958890 extend-filesystems[1589]: Resized filesystem in /dev/vda9 Dec 12 17:32:15.954239 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 17:32:15.955321 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 17:32:16.034193 tar[1612]: linux-arm64/README.md Dec 12 17:32:16.052058 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 12 17:32:16.513534 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:32:16.518982 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:16.582264 systemd-networkd[1525]: eth0: Gained IPv6LL Dec 12 17:32:16.585134 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 12 17:32:16.587050 systemd[1]: Reached target network-online.target - Network is Online. Dec 12 17:32:16.589667 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:32:16.591753 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 12 17:32:16.622992 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 12 17:32:16.771912 sshd_keygen[1610]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 17:32:16.791837 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 17:32:16.795326 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 12 17:32:16.797039 systemd[1]: Started sshd@0-10.0.8.78:22-147.75.109.163:50380.service - OpenSSH per-connection server daemon (147.75.109.163:50380). Dec 12 17:32:16.814010 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:16.815606 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 17:32:16.817097 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 12 17:32:16.823217 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 17:32:16.844776 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 17:32:16.847922 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 17:32:16.851669 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 12 17:32:16.853172 systemd[1]: Reached target getty.target - Login Prompts. 
Dec 12 17:32:17.423720 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:32:17.427587 (kubelet)[1724]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:32:17.795411 sshd[1708]: Accepted publickey for core from 147.75.109.163 port 50380 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:17.797671 sshd-session[1708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:17.805150 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:32:17.807185 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:32:17.814522 systemd-logind[1599]: New session 1 of user core. Dec 12 17:32:17.835585 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 17:32:17.839346 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 17:32:17.855230 (systemd)[1733]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:32:17.857568 systemd-logind[1599]: New session c1 of user core. Dec 12 17:32:17.982802 kubelet[1724]: E1212 17:32:17.982720 1724 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:32:17.985060 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:32:17.985180 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:32:17.986628 systemd[1733]: Queued start job for default target default.target. Dec 12 17:32:17.987032 systemd[1]: kubelet.service: Consumed 769ms CPU time, 258.8M memory peak. Dec 12 17:32:17.998259 systemd[1733]: Created slice app.slice - User Application Slice. Dec 12 17:32:17.998393 systemd[1733]: Reached target paths.target - Paths. Dec 12 17:32:17.998492 systemd[1733]: Reached target timers.target - Timers. Dec 12 17:32:17.999720 systemd[1733]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:32:18.009198 systemd[1733]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:32:18.009262 systemd[1733]: Reached target sockets.target - Sockets. Dec 12 17:32:18.009296 systemd[1733]: Reached target basic.target - Basic System. Dec 12 17:32:18.009327 systemd[1733]: Reached target default.target - Main User Target. Dec 12 17:32:18.009351 systemd[1733]: Startup finished in 145ms. Dec 12 17:32:18.009526 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:32:18.011825 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:32:18.530004 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:18.703761 systemd[1]: Started sshd@1-10.0.8.78:22-147.75.109.163:50396.service - OpenSSH per-connection server daemon (147.75.109.163:50396). 
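The kubelet exit above is the expected first-boot failure on a node that has not yet been provisioned: /var/lib/kubelet/config.yaml is normally written later by kubeadm init/join (or whatever provisioner runs on this host). A hypothetical pre-flight check mirroring the same error path:

    from pathlib import Path

    CFG = Path("/var/lib/kubelet/config.yaml")   # path taken from the kubelet error above

    if not CFG.is_file():
        # Matches the failure mode in the log: kubelet exits with status 1 and
        # systemd keeps rescheduling it until provisioning creates this file.
        raise SystemExit(f"kubelet config missing: {CFG}")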
Dec 12 17:32:18.828985 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:19.711893 sshd[1747]: Accepted publickey for core from 147.75.109.163 port 50396 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:19.713104 sshd-session[1747]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:19.717849 systemd-logind[1599]: New session 2 of user core. Dec 12 17:32:19.727353 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 12 17:32:20.393403 sshd[1751]: Connection closed by 147.75.109.163 port 50396 Dec 12 17:32:20.394107 sshd-session[1747]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:20.397573 systemd[1]: sshd@1-10.0.8.78:22-147.75.109.163:50396.service: Deactivated successfully. Dec 12 17:32:20.399221 systemd[1]: session-2.scope: Deactivated successfully. Dec 12 17:32:20.399911 systemd-logind[1599]: Session 2 logged out. Waiting for processes to exit. Dec 12 17:32:20.400924 systemd-logind[1599]: Removed session 2. Dec 12 17:32:20.558654 systemd[1]: Started sshd@2-10.0.8.78:22-147.75.109.163:50406.service - OpenSSH per-connection server daemon (147.75.109.163:50406). Dec 12 17:32:21.533286 sshd[1757]: Accepted publickey for core from 147.75.109.163 port 50406 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:21.534516 sshd-session[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:21.538060 systemd-logind[1599]: New session 3 of user core. Dec 12 17:32:21.553382 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 17:32:22.195811 sshd[1760]: Connection closed by 147.75.109.163 port 50406 Dec 12 17:32:22.196458 sshd-session[1757]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:22.199636 systemd[1]: sshd@2-10.0.8.78:22-147.75.109.163:50406.service: Deactivated successfully. Dec 12 17:32:22.201316 systemd[1]: session-3.scope: Deactivated successfully. Dec 12 17:32:22.202697 systemd-logind[1599]: Session 3 logged out. Waiting for processes to exit. Dec 12 17:32:22.203681 systemd-logind[1599]: Removed session 3. 
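The sessions above follow a fixed pattern: "Accepted publickey ... port N", then "Connection closed by ... port N". A small sketch that pulls the structured fields out of an acceptance line (sample copied from the log):

    import re

    LINE = ("sshd[1757]: Accepted publickey for core from 147.75.109.163 "
            "port 50406 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o")

    m = re.search(r"Accepted (?P<method>\S+) for (?P<user>\S+) "
                  r"from (?P<src>\S+) port (?P<port>\d+)", LINE)
    if m:
        print(m.groupdict())
        # {'method': 'publickey', 'user': 'core', 'src': '147.75.109.163', 'port': '50406'}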
Dec 12 17:32:22.545027 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:22.554384 coreos-metadata[1583]: Dec 12 17:32:22.554 WARN failed to locate config-drive, using the metadata service API instead Dec 12 17:32:22.570264 coreos-metadata[1583]: Dec 12 17:32:22.570 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 12 17:32:22.840022 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:32:22.844685 coreos-metadata[1662]: Dec 12 17:32:22.844 WARN failed to locate config-drive, using the metadata service API instead Dec 12 17:32:22.857677 coreos-metadata[1662]: Dec 12 17:32:22.857 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 12 17:32:22.935979 coreos-metadata[1583]: Dec 12 17:32:22.935 INFO Fetch successful Dec 12 17:32:22.936249 coreos-metadata[1583]: Dec 12 17:32:22.936 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 12 17:32:23.081165 coreos-metadata[1662]: Dec 12 17:32:23.081 INFO Fetch successful Dec 12 17:32:23.081165 coreos-metadata[1662]: Dec 12 17:32:23.081 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 12 17:32:23.160218 coreos-metadata[1583]: Dec 12 17:32:23.160 INFO Fetch successful Dec 12 17:32:23.160218 coreos-metadata[1583]: Dec 12 17:32:23.160 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 12 17:32:23.305554 coreos-metadata[1662]: Dec 12 17:32:23.305 INFO Fetch successful Dec 12 17:32:23.307529 unknown[1662]: wrote ssh authorized keys file for user: core Dec 12 17:32:23.316264 coreos-metadata[1583]: Dec 12 17:32:23.316 INFO Fetch successful Dec 12 17:32:23.316453 coreos-metadata[1583]: Dec 12 17:32:23.316 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 12 17:32:23.336293 update-ssh-keys[1774]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:32:23.337375 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 12 17:32:23.339092 systemd[1]: Finished sshkeys.service. Dec 12 17:32:23.445823 coreos-metadata[1583]: Dec 12 17:32:23.445 INFO Fetch successful Dec 12 17:32:23.445823 coreos-metadata[1583]: Dec 12 17:32:23.445 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 12 17:32:24.958887 coreos-metadata[1583]: Dec 12 17:32:24.958 INFO Fetch successful Dec 12 17:32:24.958887 coreos-metadata[1583]: Dec 12 17:32:24.958 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 12 17:32:25.075592 coreos-metadata[1583]: Dec 12 17:32:25.075 INFO Fetch successful Dec 12 17:32:25.099105 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 12 17:32:25.099531 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 12 17:32:25.099654 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 17:32:25.099774 systemd[1]: Startup finished in 3.112s (kernel) + 12.005s (initrd) + 11.542s (userspace) = 26.660s. Dec 12 17:32:28.181307 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 17:32:28.182844 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:32:28.309788 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
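With no config-drive attached (the repeated "Can't lookup blockdev" probes), coreos-metadata falls back to the link-local metadata service, exactly as the WARN lines above say. A schematic fetch of the same OpenStack endpoint; it only resolves from inside such a cloud, and the field names follow the OpenStack metadata format rather than anything this log guarantees:

    import json
    import urllib.request

    # Endpoint copied from the log; 169.254.169.254 is the cloud metadata service.
    URL = "http://169.254.169.254/openstack/2012-08-10/meta_data.json"

    with urllib.request.urlopen(URL, timeout=2) as resp:
        meta = json.load(resp)

    print(meta.get("uuid"), meta.get("hostname"))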
Dec 12 17:32:28.313278 (kubelet)[1790]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:32:28.349053 kubelet[1790]: E1212 17:32:28.348993 1790 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:32:28.352343 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:32:28.352472 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:32:28.354221 systemd[1]: kubelet.service: Consumed 140ms CPU time, 108M memory peak. Dec 12 17:32:32.363837 systemd[1]: Started sshd@3-10.0.8.78:22-147.75.109.163:41024.service - OpenSSH per-connection server daemon (147.75.109.163:41024). Dec 12 17:32:33.342618 sshd[1800]: Accepted publickey for core from 147.75.109.163 port 41024 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:33.343872 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:33.348060 systemd-logind[1599]: New session 4 of user core. Dec 12 17:32:33.356223 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 17:32:34.011039 sshd[1803]: Connection closed by 147.75.109.163 port 41024 Dec 12 17:32:34.011556 sshd-session[1800]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:34.014988 systemd[1]: sshd@3-10.0.8.78:22-147.75.109.163:41024.service: Deactivated successfully. Dec 12 17:32:34.016506 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:32:34.018807 systemd-logind[1599]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:32:34.019873 systemd-logind[1599]: Removed session 4. Dec 12 17:32:34.181329 systemd[1]: Started sshd@4-10.0.8.78:22-147.75.109.163:53394.service - OpenSSH per-connection server daemon (147.75.109.163:53394). Dec 12 17:32:35.150984 sshd[1809]: Accepted publickey for core from 147.75.109.163 port 53394 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:35.152311 sshd-session[1809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:35.156510 systemd-logind[1599]: New session 5 of user core. Dec 12 17:32:35.171325 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 17:32:35.817639 sshd[1812]: Connection closed by 147.75.109.163 port 53394 Dec 12 17:32:35.818299 sshd-session[1809]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:35.821581 systemd[1]: sshd@4-10.0.8.78:22-147.75.109.163:53394.service: Deactivated successfully. Dec 12 17:32:35.823094 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:32:35.823757 systemd-logind[1599]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:32:35.824716 systemd-logind[1599]: Removed session 5. Dec 12 17:32:35.997782 systemd[1]: Started sshd@5-10.0.8.78:22-147.75.109.163:53410.service - OpenSSH per-connection server daemon (147.75.109.163:53410). 
Dec 12 17:32:37.002241 sshd[1818]: Accepted publickey for core from 147.75.109.163 port 53410 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:37.014523 sshd-session[1818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:37.019013 systemd-logind[1599]: New session 6 of user core. Dec 12 17:32:37.029360 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 12 17:32:37.690162 sshd[1821]: Connection closed by 147.75.109.163 port 53410 Dec 12 17:32:37.690467 sshd-session[1818]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:37.693867 systemd[1]: sshd@5-10.0.8.78:22-147.75.109.163:53410.service: Deactivated successfully. Dec 12 17:32:37.696651 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:32:37.697267 systemd-logind[1599]: Session 6 logged out. Waiting for processes to exit. Dec 12 17:32:37.698578 systemd-logind[1599]: Removed session 6. Dec 12 17:32:37.863885 systemd[1]: Started sshd@6-10.0.8.78:22-147.75.109.163:53416.service - OpenSSH per-connection server daemon (147.75.109.163:53416). Dec 12 17:32:38.431258 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 17:32:38.432604 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:32:38.576214 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:32:38.579696 (kubelet)[1838]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:32:38.613223 kubelet[1838]: E1212 17:32:38.613173 1838 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:32:38.616086 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:32:38.616213 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:32:38.618026 systemd[1]: kubelet.service: Consumed 137ms CPU time, 105.5M memory peak. Dec 12 17:32:38.856635 sshd[1827]: Accepted publickey for core from 147.75.109.163 port 53416 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:38.858193 sshd-session[1827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:38.862237 systemd-logind[1599]: New session 7 of user core. Dec 12 17:32:38.880175 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:32:39.337886 chronyd[1581]: Selected source PHC0 Dec 12 17:32:39.379743 sudo[1847]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:32:39.380011 sudo[1847]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:32:39.392839 sudo[1847]: pam_unix(sudo:session): session closed for user root Dec 12 17:32:39.537168 sshd[1846]: Connection closed by 147.75.109.163 port 53416 Dec 12 17:32:39.538021 sshd-session[1827]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:39.541542 systemd[1]: sshd@6-10.0.8.78:22-147.75.109.163:53416.service: Deactivated successfully. Dec 12 17:32:39.542899 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 17:32:39.545625 systemd-logind[1599]: Session 7 logged out. Waiting for processes to exit. 
Dec 12 17:32:39.546552 systemd-logind[1599]: Removed session 7. Dec 12 17:32:39.699887 systemd[1]: Started sshd@7-10.0.8.78:22-147.75.109.163:53424.service - OpenSSH per-connection server daemon (147.75.109.163:53424). Dec 12 17:32:40.624312 sshd[1853]: Accepted publickey for core from 147.75.109.163 port 53424 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:40.625494 sshd-session[1853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:40.629000 systemd-logind[1599]: New session 8 of user core. Dec 12 17:32:40.638256 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 12 17:32:41.102912 sudo[1858]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 17:32:41.103420 sudo[1858]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:32:41.107635 sudo[1858]: pam_unix(sudo:session): session closed for user root Dec 12 17:32:41.111662 sudo[1857]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 17:32:41.111893 sudo[1857]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:32:41.120593 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:32:41.158657 augenrules[1880]: No rules Dec 12 17:32:41.159714 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:32:41.159910 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:32:41.160784 sudo[1857]: pam_unix(sudo:session): session closed for user root Dec 12 17:32:41.307654 sshd[1856]: Connection closed by 147.75.109.163 port 53424 Dec 12 17:32:41.307562 sshd-session[1853]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:41.311088 systemd[1]: sshd@7-10.0.8.78:22-147.75.109.163:53424.service: Deactivated successfully. Dec 12 17:32:41.312485 systemd[1]: session-8.scope: Deactivated successfully. Dec 12 17:32:41.314928 systemd-logind[1599]: Session 8 logged out. Waiting for processes to exit. Dec 12 17:32:41.315962 systemd-logind[1599]: Removed session 8. Dec 12 17:32:41.457547 systemd[1]: Started sshd@8-10.0.8.78:22-147.75.109.163:53438.service - OpenSSH per-connection server daemon (147.75.109.163:53438). Dec 12 17:32:42.334275 sshd[1889]: Accepted publickey for core from 147.75.109.163 port 53438 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:42.335935 sshd-session[1889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:42.339978 systemd-logind[1599]: New session 9 of user core. Dec 12 17:32:42.351251 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 12 17:32:42.798522 sudo[1893]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 17:32:42.798749 sudo[1893]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:32:43.091217 systemd[1]: Starting docker.service - Docker Application Container Engine... 
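The "augenrules[1880]: No rules" above just means /etc/audit/rules.d/ is empty after the 80-selinux and 99-default files were removed; augenrules works by concatenating the *.rules fragments in lexical order into a single kernel ruleset. A rough sketch of that merge step (illustrative, not the real tool):

    from pathlib import Path

    RULES_D = Path("/etc/audit/rules.d")

    fragments = sorted(RULES_D.glob("*.rules"))   # lexical order, like augenrules
    if not fragments:
        print("No rules")                         # the augenrules output seen in the log
    else:
        print("\n".join(p.read_text() for p in fragments))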
Dec 12 17:32:43.103427 (dockerd)[1913]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 17:32:43.308451 dockerd[1913]: time="2025-12-12T17:32:43.308392846Z" level=info msg="Starting up" Dec 12 17:32:43.309253 dockerd[1913]: time="2025-12-12T17:32:43.309236183Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 17:32:43.318664 dockerd[1913]: time="2025-12-12T17:32:43.318635548Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 17:32:43.350383 dockerd[1913]: time="2025-12-12T17:32:43.350045511Z" level=info msg="Loading containers: start." Dec 12 17:32:43.359958 kernel: Initializing XFRM netlink socket Dec 12 17:32:43.549848 systemd-networkd[1525]: docker0: Link UP Dec 12 17:32:43.554123 dockerd[1913]: time="2025-12-12T17:32:43.554083971Z" level=info msg="Loading containers: done." Dec 12 17:32:43.564824 dockerd[1913]: time="2025-12-12T17:32:43.564775072Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 17:32:43.564959 dockerd[1913]: time="2025-12-12T17:32:43.564849395Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 17:32:43.564959 dockerd[1913]: time="2025-12-12T17:32:43.564922399Z" level=info msg="Initializing buildkit" Dec 12 17:32:43.586400 dockerd[1913]: time="2025-12-12T17:32:43.586319598Z" level=info msg="Completed buildkit initialization" Dec 12 17:32:43.590624 dockerd[1913]: time="2025-12-12T17:32:43.590590809Z" level=info msg="Daemon has completed initialization" Dec 12 17:32:43.590770 dockerd[1913]: time="2025-12-12T17:32:43.590646396Z" level=info msg="API listen on /run/docker.sock" Dec 12 17:32:43.590823 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 17:32:44.686170 containerd[1621]: time="2025-12-12T17:32:44.686131640Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Dec 12 17:32:45.335400 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1167562656.mount: Deactivated successfully. 
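dockerd's "Not using native diff for overlay2" warning above fires when the kernel was built with CONFIG_OVERLAY_FS_REDIRECT_DIR=y, as the message itself states. Where the running kernel exposes its build config (requires CONFIG_IKCONFIG_PROC; /proc/config.gz is an assumption, not something shown in this log), the check looks like:

    import gzip

    opts = {}
    with gzip.open("/proc/config.gz", "rt") as f:   # present only with CONFIG_IKCONFIG_PROC
        for line in f:
            if line.startswith("CONFIG_"):
                key, _, value = line.strip().partition("=")
                opts[key] = value

    print(opts.get("CONFIG_OVERLAY_FS_REDIRECT_DIR"))  # "y" here triggers dockerd's warning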
Dec 12 17:32:46.705521 containerd[1621]: time="2025-12-12T17:32:46.705466433Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:46.706988 containerd[1621]: time="2025-12-12T17:32:46.706823920Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=27387379" Dec 12 17:32:46.707990 containerd[1621]: time="2025-12-12T17:32:46.707939645Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:46.711230 containerd[1621]: time="2025-12-12T17:32:46.711177982Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:46.712177 containerd[1621]: time="2025-12-12T17:32:46.712151427Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 2.025977306s" Dec 12 17:32:46.712228 containerd[1621]: time="2025-12-12T17:32:46.712183867Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Dec 12 17:32:46.713868 containerd[1621]: time="2025-12-12T17:32:46.713846875Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Dec 12 17:32:48.073268 containerd[1621]: time="2025-12-12T17:32:48.073191298Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:48.074102 containerd[1621]: time="2025-12-12T17:32:48.074079583Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23553101" Dec 12 17:32:48.075197 containerd[1621]: time="2025-12-12T17:32:48.075174068Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:48.078907 containerd[1621]: time="2025-12-12T17:32:48.078651047Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:48.079577 containerd[1621]: time="2025-12-12T17:32:48.079537172Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.365660937s" Dec 12 17:32:48.079577 containerd[1621]: time="2025-12-12T17:32:48.079572172Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Dec 12 17:32:48.079990 
containerd[1621]: time="2025-12-12T17:32:48.079964294Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Dec 12 17:32:48.680996 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 17:32:48.682720 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:32:48.821903 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:32:48.825605 (kubelet)[2202]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:32:48.864754 kubelet[2202]: E1212 17:32:48.864705 2202 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:32:48.867445 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:32:48.867606 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:32:48.868982 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.6M memory peak. Dec 12 17:32:49.084661 containerd[1621]: time="2025-12-12T17:32:49.084473044Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:49.086255 containerd[1621]: time="2025-12-12T17:32:49.086214613Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18298087" Dec 12 17:32:49.087457 containerd[1621]: time="2025-12-12T17:32:49.087400859Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:49.091114 containerd[1621]: time="2025-12-12T17:32:49.091079918Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:49.092768 containerd[1621]: time="2025-12-12T17:32:49.092583886Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.012587832s" Dec 12 17:32:49.092768 containerd[1621]: time="2025-12-12T17:32:49.092625246Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Dec 12 17:32:49.093261 containerd[1621]: time="2025-12-12T17:32:49.093226969Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Dec 12 17:32:50.005190 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1372558877.mount: Deactivated successfully. 
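The three kubelet "Scheduled restart job" entries (counters 1, 2 and 3 at 17:32:28, 17:32:38 and 17:32:48) are evenly spaced. Diffing the timestamps shows the ~10 s cadence; reading that back to RestartSec=10 in the unit is an inference from the spacing, not something the log states:

    from datetime import datetime

    STAMPS = ["Dec 12 17:32:28.181307",   # restart counter 1
              "Dec 12 17:32:38.431258",   # restart counter 2
              "Dec 12 17:32:48.680996"]   # restart counter 3

    ts = [datetime.strptime(s, "%b %d %H:%M:%S.%f") for s in STAMPS]
    print([round((b - a).total_seconds(), 2) for a, b in zip(ts, ts[1:])])
    # [10.25, 10.25]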
Dec 12 17:32:50.281054 containerd[1621]: time="2025-12-12T17:32:50.280574001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:50.281882 containerd[1621]: time="2025-12-12T17:32:50.281829727Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=28258699" Dec 12 17:32:50.282825 containerd[1621]: time="2025-12-12T17:32:50.282773972Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:50.284885 containerd[1621]: time="2025-12-12T17:32:50.284840462Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:50.285512 containerd[1621]: time="2025-12-12T17:32:50.285477305Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.192212696s" Dec 12 17:32:50.285562 containerd[1621]: time="2025-12-12T17:32:50.285511306Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Dec 12 17:32:50.286007 containerd[1621]: time="2025-12-12T17:32:50.285978348Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Dec 12 17:32:50.899454 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3195166870.mount: Deactivated successfully. 
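The pull records above report both bytes read and wall-clock duration, so effective registry throughput can be read straight off the log. Using the kube-proxy figures (a rough number, since "bytes read" counts mostly compressed layer data):

```python
# Derive effective pull throughput from the kube-proxy entries logged above.
bytes_read = 28_258_699        # "bytes read=28258699"
duration_s = 1.192212696       # "in 1.192212696s" from the Pulled image line

print(f"{bytes_read / duration_s / 2**20:.1f} MiB/s")  # ~22.6 MiB/s
```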
Dec 12 17:32:51.657669 containerd[1621]: time="2025-12-12T17:32:51.657588236Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:51.658987 containerd[1621]: time="2025-12-12T17:32:51.658917562Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Dec 12 17:32:51.660032 containerd[1621]: time="2025-12-12T17:32:51.660000968Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:51.662543 containerd[1621]: time="2025-12-12T17:32:51.662487821Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:51.663403 containerd[1621]: time="2025-12-12T17:32:51.663366785Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.377355197s" Dec 12 17:32:51.663444 containerd[1621]: time="2025-12-12T17:32:51.663401625Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Dec 12 17:32:51.664181 containerd[1621]: time="2025-12-12T17:32:51.664125949Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 12 17:32:52.189692 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount334461028.mount: Deactivated successfully. 
Dec 12 17:32:52.194975 containerd[1621]: time="2025-12-12T17:32:52.194716444Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:32:52.196350 containerd[1621]: time="2025-12-12T17:32:52.196110971Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Dec 12 17:32:52.197510 containerd[1621]: time="2025-12-12T17:32:52.197466458Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:32:52.199614 containerd[1621]: time="2025-12-12T17:32:52.199585149Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:32:52.200192 containerd[1621]: time="2025-12-12T17:32:52.200157312Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 536.002603ms" Dec 12 17:32:52.200192 containerd[1621]: time="2025-12-12T17:32:52.200190712Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Dec 12 17:32:52.200702 containerd[1621]: time="2025-12-12T17:32:52.200611754Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Dec 12 17:32:52.745901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount834086491.mount: Deactivated successfully. 
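Note the extra label on the pause image above: unlike every other pull in this sequence, it is marked io.cri-containerd.pinned=pinned, which exempts the sandbox image from image garbage collection. A small sketch with the label sets copied from the log entries:

```python
# Only pause:3.10 carries the pinned label among the ImageCreate events above;
# pinned images are never pruned, so running pods cannot lose their sandbox image.
labels = {
    "registry.k8s.io/pause:3.10": {
        "io.cri-containerd.image": "managed",
        "io.cri-containerd.pinned": "pinned",
    },
    "registry.k8s.io/coredns/coredns:v1.12.0": {"io.cri-containerd.image": "managed"},
}
pinned = [n for n, l in labels.items() if l.get("io.cri-containerd.pinned") == "pinned"]
print(pinned)  # ['registry.k8s.io/pause:3.10']
```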
Dec 12 17:32:54.338264 containerd[1621]: time="2025-12-12T17:32:54.338215733Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:54.339838 containerd[1621]: time="2025-12-12T17:32:54.339810261Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=70013713" Dec 12 17:32:54.341499 containerd[1621]: time="2025-12-12T17:32:54.341463550Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:54.345330 containerd[1621]: time="2025-12-12T17:32:54.345296569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:32:54.346532 containerd[1621]: time="2025-12-12T17:32:54.346370535Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.145730341s" Dec 12 17:32:54.346532 containerd[1621]: time="2025-12-12T17:32:54.346423335Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Dec 12 17:32:58.931297 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 12 17:32:58.933075 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:32:59.057969 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:32:59.073382 (kubelet)[2365]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:32:59.105048 kubelet[2365]: E1212 17:32:59.104981 2365 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:32:59.107683 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:32:59.107809 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:32:59.108325 systemd[1]: kubelet.service: Consumed 131ms CPU time, 106.9M memory peak. Dec 12 17:32:59.928465 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:32:59.928610 systemd[1]: kubelet.service: Consumed 131ms CPU time, 106.9M memory peak. Dec 12 17:32:59.930551 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:32:59.949329 systemd[1]: Reload requested from client PID 2379 ('systemctl') (unit session-9.scope)... Dec 12 17:32:59.949347 systemd[1]: Reloading... Dec 12 17:33:00.036084 zram_generator::config[2423]: No configuration found. Dec 12 17:33:00.199238 systemd[1]: Reloading finished in 249 ms. Dec 12 17:33:00.241746 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 17:33:00.241821 systemd[1]: kubelet.service: Failed with result 'signal'. 
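The spacing of the "Scheduled restart job" entries shows the unit looping on systemd's restart timer while the config file is missing; the gap between counters 3 and 4 works out to about ten seconds:

```python
# Timestamps copied from the two "Scheduled restart job" entries
# (counter 3 and counter 4) in the log above.
from datetime import datetime

t3 = datetime.strptime("17:32:48.680996", "%H:%M:%S.%f")
t4 = datetime.strptime("17:32:58.931297", "%H:%M:%S.%f")
print(f"{(t4 - t3).total_seconds():.2f}s between restart attempts")  # 10.25s
```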
Dec 12 17:33:00.242099 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:33:00.242143 systemd[1]: kubelet.service: Consumed 94ms CPU time, 95.1M memory peak. Dec 12 17:33:00.243575 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:33:00.380670 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:33:00.385882 (kubelet)[2470]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:33:00.430053 kubelet[2470]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:33:00.430053 kubelet[2470]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:33:00.430053 kubelet[2470]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:33:00.430414 kubelet[2470]: I1212 17:33:00.430096 2470 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:33:00.457273 update_engine[1601]: I20251212 17:33:00.456050 1601 update_attempter.cc:509] Updating boot flags... Dec 12 17:33:00.870703 kubelet[2470]: I1212 17:33:00.870592 2470 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 12 17:33:00.870832 kubelet[2470]: I1212 17:33:00.870820 2470 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:33:00.871416 kubelet[2470]: I1212 17:33:00.871391 2470 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:33:00.914577 kubelet[2470]: E1212 17:33:00.914536 2470 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.8.78:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.8.78:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 12 17:33:00.914742 kubelet[2470]: I1212 17:33:00.914656 2470 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:33:00.927753 kubelet[2470]: I1212 17:33:00.927728 2470 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:33:00.930840 kubelet[2470]: I1212 17:33:00.930814 2470 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 12 17:33:00.932736 kubelet[2470]: I1212 17:33:00.932688 2470 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:33:00.932900 kubelet[2470]: I1212 17:33:00.932738 2470 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-3-c846c80ac0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:33:00.933021 kubelet[2470]: I1212 17:33:00.933002 2470 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:33:00.933021 kubelet[2470]: I1212 17:33:00.933013 2470 container_manager_linux.go:303] "Creating device plugin manager" Dec 12 17:33:00.934280 kubelet[2470]: I1212 17:33:00.934238 2470 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:33:00.943298 kubelet[2470]: I1212 17:33:00.943219 2470 kubelet.go:480] "Attempting to sync node with API server" Dec 12 17:33:00.943298 kubelet[2470]: I1212 17:33:00.943272 2470 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:33:00.943517 kubelet[2470]: I1212 17:33:00.943335 2470 kubelet.go:386] "Adding apiserver pod source" Dec 12 17:33:00.945846 kubelet[2470]: I1212 17:33:00.945796 2470 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:33:00.947318 kubelet[2470]: I1212 17:33:00.947046 2470 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:33:00.953291 kubelet[2470]: I1212 17:33:00.953243 2470 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:33:00.953437 kubelet[2470]: W1212 17:33:00.953420 2470 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Dec 12 17:33:00.954657 kubelet[2470]: E1212 17:33:00.954589 2470 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.8.78:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-3-c846c80ac0&limit=500&resourceVersion=0\": dial tcp 10.0.8.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:33:00.954727 kubelet[2470]: E1212 17:33:00.954675 2470 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.8.78:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.8.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 12 17:33:00.956714 kubelet[2470]: I1212 17:33:00.956693 2470 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 17:33:00.956798 kubelet[2470]: I1212 17:33:00.956749 2470 server.go:1289] "Started kubelet" Dec 12 17:33:00.956937 kubelet[2470]: I1212 17:33:00.956879 2470 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:33:00.957277 kubelet[2470]: I1212 17:33:00.957261 2470 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:33:00.957364 kubelet[2470]: I1212 17:33:00.957342 2470 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:33:00.961055 kubelet[2470]: I1212 17:33:00.961029 2470 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:33:00.961872 kubelet[2470]: I1212 17:33:00.961583 2470 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 17:33:00.961872 kubelet[2470]: I1212 17:33:00.961658 2470 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:33:00.961985 kubelet[2470]: I1212 17:33:00.961934 2470 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 17:33:00.963009 kubelet[2470]: E1212 17:33:00.961678 2470 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" Dec 12 17:33:00.963083 kubelet[2470]: I1212 17:33:00.963053 2470 reconciler.go:26] "Reconciler: start to sync state" Dec 12 17:33:00.964157 kubelet[2470]: E1212 17:33:00.964111 2470 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-3-c846c80ac0?timeout=10s\": dial tcp 10.0.8.78:6443: connect: connection refused" interval="200ms" Dec 12 17:33:00.964157 kubelet[2470]: E1212 17:33:00.964142 2470 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.8.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.8.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 12 17:33:00.964960 kubelet[2470]: I1212 17:33:00.964629 2470 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:33:00.964960 kubelet[2470]: I1212 17:33:00.964727 2470 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:33:00.966195 
kubelet[2470]: I1212 17:33:00.966161 2470 server.go:317] "Adding debug handlers to kubelet server" Dec 12 17:33:00.967925 kubelet[2470]: I1212 17:33:00.967861 2470 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Dec 12 17:33:00.967925 kubelet[2470]: I1212 17:33:00.967894 2470 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:33:00.969513 kubelet[2470]: E1212 17:33:00.967249 2470 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.8.78:6443/api/v1/namespaces/default/events\": dial tcp 10.0.8.78:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-3-c846c80ac0.1880882d0a495029 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-3-c846c80ac0,UID:ci-4459-2-2-3-c846c80ac0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-3-c846c80ac0,},FirstTimestamp:2025-12-12 17:33:00.956717097 +0000 UTC m=+0.565872916,LastTimestamp:2025-12-12 17:33:00.956717097 +0000 UTC m=+0.565872916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-3-c846c80ac0,}" Dec 12 17:33:00.969621 kubelet[2470]: E1212 17:33:00.969583 2470 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:33:00.977978 kubelet[2470]: I1212 17:33:00.977906 2470 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:33:00.977978 kubelet[2470]: I1212 17:33:00.977929 2470 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:33:00.977978 kubelet[2470]: I1212 17:33:00.977964 2470 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:33:00.982473 kubelet[2470]: I1212 17:33:00.982407 2470 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Dec 12 17:33:00.982473 kubelet[2470]: I1212 17:33:00.982457 2470 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 12 17:33:00.982473 kubelet[2470]: I1212 17:33:00.982477 2470 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:33:00.982620 kubelet[2470]: I1212 17:33:00.982488 2470 kubelet.go:2436] "Starting kubelet main sync loop" Dec 12 17:33:00.982620 kubelet[2470]: E1212 17:33:00.982527 2470 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:33:00.983577 kubelet[2470]: I1212 17:33:00.983305 2470 policy_none.go:49] "None policy: Start" Dec 12 17:33:00.983577 kubelet[2470]: I1212 17:33:00.983331 2470 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 17:33:00.983577 kubelet[2470]: I1212 17:33:00.983342 2470 state_mem.go:35] "Initializing new in-memory state store" Dec 12 17:33:00.989174 kubelet[2470]: E1212 17:33:00.989147 2470 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.8.78:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.8.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:33:00.992304 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 17:33:01.008118 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 17:33:01.011393 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 12 17:33:01.028149 kubelet[2470]: E1212 17:33:01.028123 2470 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:33:01.028464 kubelet[2470]: I1212 17:33:01.028444 2470 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:33:01.028570 kubelet[2470]: I1212 17:33:01.028536 2470 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:33:01.028921 kubelet[2470]: I1212 17:33:01.028885 2470 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:33:01.030637 kubelet[2470]: E1212 17:33:01.030606 2470 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:33:01.030720 kubelet[2470]: E1212 17:33:01.030647 2470 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-3-c846c80ac0\" not found" Dec 12 17:33:01.094235 systemd[1]: Created slice kubepods-burstable-podfa0e1379d862bcb6f5c7bd03139d1cf3.slice - libcontainer container kubepods-burstable-podfa0e1379d862bcb6f5c7bd03139d1cf3.slice. Dec 12 17:33:01.105105 kubelet[2470]: E1212 17:33:01.105052 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.109694 systemd[1]: Created slice kubepods-burstable-pod5561443558f42886144f76f0f7c5ab97.slice - libcontainer container kubepods-burstable-pod5561443558f42886144f76f0f7c5ab97.slice. 
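Every client-go call above fails with "connect: connection refused" for the same reason: this kubelet runs the control plane as static pods, so nothing is listening on 10.0.8.78:6443 until it starts the kube-apiserver container itself. A sketch of the same reachability check (endpoint copied from the log):

```python
# Probe the API server endpoint the reflectors above are failing to reach.
import socket

def api_server_up(host: str = "10.0.8.78", port: int = 6443) -> bool:
    try:
        with socket.create_connection((host, port), timeout=1.0):
            return True
    except OSError:  # ECONNREFUSED until the static pod's container starts
        return False

print(api_server_up())
```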
Dec 12 17:33:01.111906 kubelet[2470]: E1212 17:33:01.111733 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.113170 systemd[1]: Created slice kubepods-burstable-pod8e0dd4161c776d02dbeafab2c8bc0682.slice - libcontainer container kubepods-burstable-pod8e0dd4161c776d02dbeafab2c8bc0682.slice. Dec 12 17:33:01.115142 kubelet[2470]: E1212 17:33:01.114933 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.132891 kubelet[2470]: I1212 17:33:01.132805 2470 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.133568 kubelet[2470]: E1212 17:33:01.133471 2470 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.78:6443/api/v1/nodes\": dial tcp 10.0.8.78:6443: connect: connection refused" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164194 kubelet[2470]: I1212 17:33:01.164121 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164691 kubelet[2470]: I1212 17:33:01.164490 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164691 kubelet[2470]: I1212 17:33:01.164566 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164691 kubelet[2470]: I1212 17:33:01.164616 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164691 kubelet[2470]: I1212 17:33:01.164637 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164691 kubelet[2470]: I1212 17:33:01.164656 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/8e0dd4161c776d02dbeafab2c8bc0682-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-3-c846c80ac0\" (UID: \"8e0dd4161c776d02dbeafab2c8bc0682\") " pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.164813 kubelet[2470]: I1212 17:33:01.164671 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fa0e1379d862bcb6f5c7bd03139d1cf3-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" (UID: \"fa0e1379d862bcb6f5c7bd03139d1cf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.165022 kubelet[2470]: E1212 17:33:01.164995 2470 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-3-c846c80ac0?timeout=10s\": dial tcp 10.0.8.78:6443: connect: connection refused" interval="400ms" Dec 12 17:33:01.165185 kubelet[2470]: I1212 17:33:01.165125 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fa0e1379d862bcb6f5c7bd03139d1cf3-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" (UID: \"fa0e1379d862bcb6f5c7bd03139d1cf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.165185 kubelet[2470]: I1212 17:33:01.165150 2470 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fa0e1379d862bcb6f5c7bd03139d1cf3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" (UID: \"fa0e1379d862bcb6f5c7bd03139d1cf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.335781 kubelet[2470]: I1212 17:33:01.335419 2470 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.335781 kubelet[2470]: E1212 17:33:01.335712 2470 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.78:6443/api/v1/nodes\": dial tcp 10.0.8.78:6443: connect: connection refused" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.407212 containerd[1621]: time="2025-12-12T17:33:01.407110265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-3-c846c80ac0,Uid:fa0e1379d862bcb6f5c7bd03139d1cf3,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:01.412669 containerd[1621]: time="2025-12-12T17:33:01.412617413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-3-c846c80ac0,Uid:5561443558f42886144f76f0f7c5ab97,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:01.416405 containerd[1621]: time="2025-12-12T17:33:01.416310072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-3-c846c80ac0,Uid:8e0dd4161c776d02dbeafab2c8bc0682,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:01.434687 containerd[1621]: time="2025-12-12T17:33:01.434626325Z" level=info msg="connecting to shim 625d40f9ff39429c6a86bbe8b46267f22848b92a19f2374d1cedde4f86acd6cf" address="unix:///run/containerd/s/7c791057f13aee9b44c6389fed56ab6f95ad718f77d3658282ec866da130eb62" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:01.454018 containerd[1621]: time="2025-12-12T17:33:01.453969343Z" level=info msg="connecting to shim 7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c" 
address="unix:///run/containerd/s/e2c53c410e026fd3f5cc3108996155643c442317bd06fd82bcc2b03e93271bf4" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:01.462838 containerd[1621]: time="2025-12-12T17:33:01.462769828Z" level=info msg="connecting to shim abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d" address="unix:///run/containerd/s/c7dd64f998353b9179b47ec62c5cbb78f56d9b0b32b9613932a3126ee76c3baa" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:01.464184 systemd[1]: Started cri-containerd-625d40f9ff39429c6a86bbe8b46267f22848b92a19f2374d1cedde4f86acd6cf.scope - libcontainer container 625d40f9ff39429c6a86bbe8b46267f22848b92a19f2374d1cedde4f86acd6cf. Dec 12 17:33:01.488314 systemd[1]: Started cri-containerd-7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c.scope - libcontainer container 7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c. Dec 12 17:33:01.491601 systemd[1]: Started cri-containerd-abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d.scope - libcontainer container abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d. Dec 12 17:33:01.511273 containerd[1621]: time="2025-12-12T17:33:01.511225514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-3-c846c80ac0,Uid:fa0e1379d862bcb6f5c7bd03139d1cf3,Namespace:kube-system,Attempt:0,} returns sandbox id \"625d40f9ff39429c6a86bbe8b46267f22848b92a19f2374d1cedde4f86acd6cf\"" Dec 12 17:33:01.518974 containerd[1621]: time="2025-12-12T17:33:01.518916673Z" level=info msg="CreateContainer within sandbox \"625d40f9ff39429c6a86bbe8b46267f22848b92a19f2374d1cedde4f86acd6cf\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 17:33:01.528025 containerd[1621]: time="2025-12-12T17:33:01.527979799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-3-c846c80ac0,Uid:5561443558f42886144f76f0f7c5ab97,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c\"" Dec 12 17:33:01.531967 containerd[1621]: time="2025-12-12T17:33:01.531903659Z" level=info msg="Container b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:01.532247 containerd[1621]: time="2025-12-12T17:33:01.532223541Z" level=info msg="CreateContainer within sandbox \"7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 17:33:01.536488 containerd[1621]: time="2025-12-12T17:33:01.536451922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-3-c846c80ac0,Uid:8e0dd4161c776d02dbeafab2c8bc0682,Namespace:kube-system,Attempt:0,} returns sandbox id \"abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d\"" Dec 12 17:33:01.541548 containerd[1621]: time="2025-12-12T17:33:01.541517108Z" level=info msg="CreateContainer within sandbox \"abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 17:33:01.546299 containerd[1621]: time="2025-12-12T17:33:01.546253252Z" level=info msg="CreateContainer within sandbox \"625d40f9ff39429c6a86bbe8b46267f22848b92a19f2374d1cedde4f86acd6cf\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd\"" Dec 12 17:33:01.547219 containerd[1621]: 
time="2025-12-12T17:33:01.547190257Z" level=info msg="StartContainer for \"b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd\"" Dec 12 17:33:01.549865 containerd[1621]: time="2025-12-12T17:33:01.549787030Z" level=info msg="connecting to shim b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd" address="unix:///run/containerd/s/7c791057f13aee9b44c6389fed56ab6f95ad718f77d3658282ec866da130eb62" protocol=ttrpc version=3 Dec 12 17:33:01.553567 containerd[1621]: time="2025-12-12T17:33:01.553429528Z" level=info msg="Container 203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:01.554546 containerd[1621]: time="2025-12-12T17:33:01.554521174Z" level=info msg="Container f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:01.563079 containerd[1621]: time="2025-12-12T17:33:01.563047857Z" level=info msg="CreateContainer within sandbox \"7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4\"" Dec 12 17:33:01.564032 containerd[1621]: time="2025-12-12T17:33:01.564000822Z" level=info msg="StartContainer for \"203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4\"" Dec 12 17:33:01.564577 containerd[1621]: time="2025-12-12T17:33:01.564545385Z" level=info msg="CreateContainer within sandbox \"abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0\"" Dec 12 17:33:01.564937 containerd[1621]: time="2025-12-12T17:33:01.564898227Z" level=info msg="StartContainer for \"f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0\"" Dec 12 17:33:01.565021 containerd[1621]: time="2025-12-12T17:33:01.564991467Z" level=info msg="connecting to shim 203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4" address="unix:///run/containerd/s/e2c53c410e026fd3f5cc3108996155643c442317bd06fd82bcc2b03e93271bf4" protocol=ttrpc version=3 Dec 12 17:33:01.565833 kubelet[2470]: E1212 17:33:01.565791 2470 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-3-c846c80ac0?timeout=10s\": dial tcp 10.0.8.78:6443: connect: connection refused" interval="800ms" Dec 12 17:33:01.566406 containerd[1621]: time="2025-12-12T17:33:01.566374954Z" level=info msg="connecting to shim f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0" address="unix:///run/containerd/s/c7dd64f998353b9179b47ec62c5cbb78f56d9b0b32b9613932a3126ee76c3baa" protocol=ttrpc version=3 Dec 12 17:33:01.576103 systemd[1]: Started cri-containerd-b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd.scope - libcontainer container b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd. Dec 12 17:33:01.586105 systemd[1]: Started cri-containerd-203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4.scope - libcontainer container 203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4. Dec 12 17:33:01.589914 systemd[1]: Started cri-containerd-f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0.scope - libcontainer container f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0. 
Dec 12 17:33:01.623629 containerd[1621]: time="2025-12-12T17:33:01.623512644Z" level=info msg="StartContainer for \"b3929dd2e10b6d694f7129cd534350cefec5f0f6509a8aa768f95c0bf65821dd\" returns successfully" Dec 12 17:33:01.634637 containerd[1621]: time="2025-12-12T17:33:01.634575661Z" level=info msg="StartContainer for \"203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4\" returns successfully" Dec 12 17:33:01.641011 containerd[1621]: time="2025-12-12T17:33:01.640834412Z" level=info msg="StartContainer for \"f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0\" returns successfully" Dec 12 17:33:01.737431 kubelet[2470]: I1212 17:33:01.737060 2470 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:01.993974 kubelet[2470]: E1212 17:33:01.993867 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:02.000311 kubelet[2470]: E1212 17:33:02.000279 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:02.002994 kubelet[2470]: E1212 17:33:02.002777 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.006479 kubelet[2470]: E1212 17:33:03.006122 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.006479 kubelet[2470]: E1212 17:33:03.006304 2470 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.676272 kubelet[2470]: E1212 17:33:03.676231 2470 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-3-c846c80ac0\" not found" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.744653 kubelet[2470]: I1212 17:33:03.744613 2470 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.762216 kubelet[2470]: I1212 17:33:03.762142 2470 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.785317 kubelet[2470]: E1212 17:33:03.785218 2470 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4459-2-2-3-c846c80ac0.1880882d0a495029 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-3-c846c80ac0,UID:ci-4459-2-2-3-c846c80ac0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-3-c846c80ac0,},FirstTimestamp:2025-12-12 17:33:00.956717097 +0000 UTC m=+0.565872916,LastTimestamp:2025-12-12 17:33:00.956717097 +0000 UTC m=+0.565872916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-3-c846c80ac0,}" Dec 12 17:33:03.829984 kubelet[2470]: E1212 17:33:03.828455 2470 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.829984 kubelet[2470]: I1212 17:33:03.828496 2470 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.831355 kubelet[2470]: E1212 17:33:03.831324 2470 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.831355 kubelet[2470]: I1212 17:33:03.831354 2470 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.833029 kubelet[2470]: E1212 17:33:03.832997 2470 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-3-c846c80ac0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.877426 kubelet[2470]: I1212 17:33:03.877363 2470 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.880983 kubelet[2470]: E1212 17:33:03.880642 2470 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:03.953601 kubelet[2470]: I1212 17:33:03.953532 2470 apiserver.go:52] "Watching apiserver" Dec 12 17:33:03.962358 kubelet[2470]: I1212 17:33:03.962243 2470 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 17:33:04.005668 kubelet[2470]: I1212 17:33:04.005622 2470 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:04.007848 kubelet[2470]: E1212 17:33:04.007818 2470 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:05.869605 systemd[1]: Reload requested from client PID 2773 ('systemctl') (unit session-9.scope)... Dec 12 17:33:05.869621 systemd[1]: Reloading... Dec 12 17:33:05.944005 zram_generator::config[2819]: No configuration found. Dec 12 17:33:06.113221 systemd[1]: Reloading finished in 243 ms. Dec 12 17:33:06.142541 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:33:06.156401 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 17:33:06.156676 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:33:06.156732 systemd[1]: kubelet.service: Consumed 912ms CPU time, 127.2M memory peak. Dec 12 17:33:06.158540 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:33:06.328819 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:33:06.332986 (kubelet)[2861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:33:06.366223 kubelet[2861]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:33:06.366223 kubelet[2861]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:33:06.366223 kubelet[2861]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:33:06.366528 kubelet[2861]: I1212 17:33:06.366250 2861 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:33:06.372049 kubelet[2861]: I1212 17:33:06.372008 2861 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 12 17:33:06.372049 kubelet[2861]: I1212 17:33:06.372037 2861 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:33:06.374212 kubelet[2861]: I1212 17:33:06.374170 2861 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:33:06.377588 kubelet[2861]: I1212 17:33:06.377562 2861 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 12 17:33:06.380328 kubelet[2861]: I1212 17:33:06.380301 2861 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:33:06.383686 kubelet[2861]: I1212 17:33:06.383663 2861 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:33:06.386153 kubelet[2861]: I1212 17:33:06.386133 2861 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 12 17:33:06.386341 kubelet[2861]: I1212 17:33:06.386318 2861 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:33:06.386492 kubelet[2861]: I1212 17:33:06.386342 2861 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-3-c846c80ac0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:33:06.386567 kubelet[2861]: I1212 17:33:06.386514 2861 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:33:06.386567 kubelet[2861]: I1212 17:33:06.386524 2861 container_manager_linux.go:303] "Creating device plugin manager" Dec 12 17:33:06.386567 kubelet[2861]: I1212 17:33:06.386567 2861 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:33:06.386734 kubelet[2861]: I1212 17:33:06.386722 2861 kubelet.go:480] "Attempting to sync node with API server" Dec 12 17:33:06.386765 kubelet[2861]: I1212 17:33:06.386740 2861 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:33:06.386765 kubelet[2861]: I1212 17:33:06.386760 2861 kubelet.go:386] "Adding apiserver pod source" Dec 12 17:33:06.386815 kubelet[2861]: I1212 17:33:06.386772 2861 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:33:06.388026 kubelet[2861]: I1212 17:33:06.387892 2861 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:33:06.388651 kubelet[2861]: I1212 17:33:06.388630 2861 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:33:06.392487 kubelet[2861]: I1212 17:33:06.392469 2861 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 17:33:06.393126 kubelet[2861]: I1212 17:33:06.392614 2861 server.go:1289] "Started kubelet" Dec 12 17:33:06.394843 kubelet[2861]: I1212 17:33:06.393353 2861 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 
17:33:06.394982 kubelet[2861]: I1212 17:33:06.393795 2861 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:33:06.396774 kubelet[2861]: I1212 17:33:06.396729 2861 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:33:06.396917 kubelet[2861]: I1212 17:33:06.395504 2861 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:33:06.397795 kubelet[2861]: I1212 17:33:06.397776 2861 server.go:317] "Adding debug handlers to kubelet server" Dec 12 17:33:06.400390 kubelet[2861]: I1212 17:33:06.400365 2861 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:33:06.404477 kubelet[2861]: I1212 17:33:06.404454 2861 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 17:33:06.404758 kubelet[2861]: E1212 17:33:06.404740 2861 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-3-c846c80ac0\" not found" Dec 12 17:33:06.405401 kubelet[2861]: I1212 17:33:06.405378 2861 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 17:33:06.405580 kubelet[2861]: I1212 17:33:06.405568 2861 reconciler.go:26] "Reconciler: start to sync state" Dec 12 17:33:06.411371 kubelet[2861]: I1212 17:33:06.411336 2861 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:33:06.411454 kubelet[2861]: I1212 17:33:06.411427 2861 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:33:06.414291 kubelet[2861]: I1212 17:33:06.414272 2861 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:33:06.414776 kubelet[2861]: E1212 17:33:06.414741 2861 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:33:06.417050 kubelet[2861]: I1212 17:33:06.416994 2861 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Dec 12 17:33:06.418054 kubelet[2861]: I1212 17:33:06.417933 2861 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Dec 12 17:33:06.418054 kubelet[2861]: I1212 17:33:06.418009 2861 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 12 17:33:06.418054 kubelet[2861]: I1212 17:33:06.418027 2861 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:33:06.418054 kubelet[2861]: I1212 17:33:06.418034 2861 kubelet.go:2436] "Starting kubelet main sync loop" Dec 12 17:33:06.418186 kubelet[2861]: E1212 17:33:06.418073 2861 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:33:06.448666 kubelet[2861]: I1212 17:33:06.448639 2861 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:33:06.448666 kubelet[2861]: I1212 17:33:06.448657 2861 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:33:06.448666 kubelet[2861]: I1212 17:33:06.448677 2861 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:33:06.448831 kubelet[2861]: I1212 17:33:06.448795 2861 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 17:33:06.448831 kubelet[2861]: I1212 17:33:06.448804 2861 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 17:33:06.448831 kubelet[2861]: I1212 17:33:06.448819 2861 policy_none.go:49] "None policy: Start" Dec 12 17:33:06.448831 kubelet[2861]: I1212 17:33:06.448827 2861 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 17:33:06.448924 kubelet[2861]: I1212 17:33:06.448834 2861 state_mem.go:35] "Initializing new in-memory state store" Dec 12 17:33:06.448960 kubelet[2861]: I1212 17:33:06.448923 2861 state_mem.go:75] "Updated machine memory state" Dec 12 17:33:06.452164 kubelet[2861]: E1212 17:33:06.452139 2861 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:33:06.452312 kubelet[2861]: I1212 17:33:06.452295 2861 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:33:06.452341 kubelet[2861]: I1212 17:33:06.452313 2861 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:33:06.452939 kubelet[2861]: I1212 17:33:06.452811 2861 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:33:06.454282 kubelet[2861]: E1212 17:33:06.453719 2861 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 12 17:33:06.520013 kubelet[2861]: I1212 17:33:06.519938 2861 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.520142 kubelet[2861]: I1212 17:33:06.520092 2861 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.520266 kubelet[2861]: I1212 17:33:06.520243 2861 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.555558 kubelet[2861]: I1212 17:33:06.555525 2861 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.562441 kubelet[2861]: I1212 17:33:06.562407 2861 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.562554 kubelet[2861]: I1212 17:33:06.562486 2861 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607491 kubelet[2861]: I1212 17:33:06.607359 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607491 kubelet[2861]: I1212 17:33:06.607473 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607691 kubelet[2861]: I1212 17:33:06.607542 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fa0e1379d862bcb6f5c7bd03139d1cf3-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" (UID: \"fa0e1379d862bcb6f5c7bd03139d1cf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607691 kubelet[2861]: I1212 17:33:06.607601 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fa0e1379d862bcb6f5c7bd03139d1cf3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" (UID: \"fa0e1379d862bcb6f5c7bd03139d1cf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607691 kubelet[2861]: I1212 17:33:06.607653 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607691 kubelet[2861]: I1212 17:33:06.607673 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: 
\"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607691 kubelet[2861]: I1212 17:33:06.607690 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5561443558f42886144f76f0f7c5ab97-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" (UID: \"5561443558f42886144f76f0f7c5ab97\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607795 kubelet[2861]: I1212 17:33:06.607707 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8e0dd4161c776d02dbeafab2c8bc0682-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-3-c846c80ac0\" (UID: \"8e0dd4161c776d02dbeafab2c8bc0682\") " pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:06.607795 kubelet[2861]: I1212 17:33:06.607722 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fa0e1379d862bcb6f5c7bd03139d1cf3-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" (UID: \"fa0e1379d862bcb6f5c7bd03139d1cf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.387321 kubelet[2861]: I1212 17:33:07.387238 2861 apiserver.go:52] "Watching apiserver" Dec 12 17:33:07.406054 kubelet[2861]: I1212 17:33:07.406000 2861 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 17:33:07.431484 kubelet[2861]: I1212 17:33:07.431153 2861 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.431484 kubelet[2861]: I1212 17:33:07.431236 2861 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.431484 kubelet[2861]: I1212 17:33:07.431286 2861 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.437973 kubelet[2861]: E1212 17:33:07.437521 2861 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-3-c846c80ac0\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.442801 kubelet[2861]: E1212 17:33:07.442770 2861 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-3-c846c80ac0\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.442902 kubelet[2861]: E1212 17:33:07.442793 2861 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-3-c846c80ac0\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:07.457972 kubelet[2861]: I1212 17:33:07.457131 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-3-c846c80ac0" podStartSLOduration=1.457079878 podStartE2EDuration="1.457079878s" podCreationTimestamp="2025-12-12 17:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:33:07.455861752 +0000 UTC m=+1.119416647" watchObservedRunningTime="2025-12-12 17:33:07.457079878 +0000 UTC m=+1.120634733" Dec 12 17:33:07.482181 kubelet[2861]: 
I1212 17:33:07.482104 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-3-c846c80ac0" podStartSLOduration=1.4820874050000001 podStartE2EDuration="1.482087405s" podCreationTimestamp="2025-12-12 17:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:33:07.472508797 +0000 UTC m=+1.136063652" watchObservedRunningTime="2025-12-12 17:33:07.482087405 +0000 UTC m=+1.145642260" Dec 12 17:33:07.492865 kubelet[2861]: I1212 17:33:07.492286 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-3-c846c80ac0" podStartSLOduration=1.492270217 podStartE2EDuration="1.492270217s" podCreationTimestamp="2025-12-12 17:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:33:07.482499807 +0000 UTC m=+1.146054662" watchObservedRunningTime="2025-12-12 17:33:07.492270217 +0000 UTC m=+1.155825072" Dec 12 17:33:10.994074 kubelet[2861]: I1212 17:33:10.994042 2861 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 17:33:10.996024 containerd[1621]: time="2025-12-12T17:33:10.994579768Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 17:33:10.996610 kubelet[2861]: I1212 17:33:10.996392 2861 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 17:33:12.063820 systemd[1]: Created slice kubepods-besteffort-podb74f2045_81d2_4f18_a5de_5eb4541ab13a.slice - libcontainer container kubepods-besteffort-podb74f2045_81d2_4f18_a5de_5eb4541ab13a.slice. 
Dec 12 17:33:12.141717 kubelet[2861]: I1212 17:33:12.141563 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b74f2045-81d2-4f18-a5de-5eb4541ab13a-kube-proxy\") pod \"kube-proxy-l8dff\" (UID: \"b74f2045-81d2-4f18-a5de-5eb4541ab13a\") " pod="kube-system/kube-proxy-l8dff" Dec 12 17:33:12.141717 kubelet[2861]: I1212 17:33:12.141615 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b74f2045-81d2-4f18-a5de-5eb4541ab13a-lib-modules\") pod \"kube-proxy-l8dff\" (UID: \"b74f2045-81d2-4f18-a5de-5eb4541ab13a\") " pod="kube-system/kube-proxy-l8dff" Dec 12 17:33:12.141717 kubelet[2861]: I1212 17:33:12.141633 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b74f2045-81d2-4f18-a5de-5eb4541ab13a-xtables-lock\") pod \"kube-proxy-l8dff\" (UID: \"b74f2045-81d2-4f18-a5de-5eb4541ab13a\") " pod="kube-system/kube-proxy-l8dff" Dec 12 17:33:12.141717 kubelet[2861]: I1212 17:33:12.141650 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrfqj\" (UniqueName: \"kubernetes.io/projected/b74f2045-81d2-4f18-a5de-5eb4541ab13a-kube-api-access-hrfqj\") pod \"kube-proxy-l8dff\" (UID: \"b74f2045-81d2-4f18-a5de-5eb4541ab13a\") " pod="kube-system/kube-proxy-l8dff" Dec 12 17:33:12.176804 systemd[1]: Created slice kubepods-besteffort-pod829cd46b_689a_4cd7_b99f_90e6da1e4b08.slice - libcontainer container kubepods-besteffort-pod829cd46b_689a_4cd7_b99f_90e6da1e4b08.slice. Dec 12 17:33:12.242585 kubelet[2861]: I1212 17:33:12.242530 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f977\" (UniqueName: \"kubernetes.io/projected/829cd46b-689a-4cd7-b99f-90e6da1e4b08-kube-api-access-8f977\") pod \"tigera-operator-7dcd859c48-54kkt\" (UID: \"829cd46b-689a-4cd7-b99f-90e6da1e4b08\") " pod="tigera-operator/tigera-operator-7dcd859c48-54kkt" Dec 12 17:33:12.242753 kubelet[2861]: I1212 17:33:12.242610 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/829cd46b-689a-4cd7-b99f-90e6da1e4b08-var-lib-calico\") pod \"tigera-operator-7dcd859c48-54kkt\" (UID: \"829cd46b-689a-4cd7-b99f-90e6da1e4b08\") " pod="tigera-operator/tigera-operator-7dcd859c48-54kkt" Dec 12 17:33:12.379369 containerd[1621]: time="2025-12-12T17:33:12.379232922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-l8dff,Uid:b74f2045-81d2-4f18-a5de-5eb4541ab13a,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:12.394394 containerd[1621]: time="2025-12-12T17:33:12.394311998Z" level=info msg="connecting to shim 65537d37bfafbd17065fd7d21567172f726f1bc8ccc006cbf83f6a171bc8a98a" address="unix:///run/containerd/s/5dd013b21884cf73c2da8c8e84d6e41863c65c42387ea0d74a2a45997484b1e1" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:12.417273 systemd[1]: Started cri-containerd-65537d37bfafbd17065fd7d21567172f726f1bc8ccc006cbf83f6a171bc8a98a.scope - libcontainer container 65537d37bfafbd17065fd7d21567172f726f1bc8ccc006cbf83f6a171bc8a98a. 
Dec 12 17:33:12.439769 containerd[1621]: time="2025-12-12T17:33:12.439732509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-l8dff,Uid:b74f2045-81d2-4f18-a5de-5eb4541ab13a,Namespace:kube-system,Attempt:0,} returns sandbox id \"65537d37bfafbd17065fd7d21567172f726f1bc8ccc006cbf83f6a171bc8a98a\"" Dec 12 17:33:12.444235 containerd[1621]: time="2025-12-12T17:33:12.444150131Z" level=info msg="CreateContainer within sandbox \"65537d37bfafbd17065fd7d21567172f726f1bc8ccc006cbf83f6a171bc8a98a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 17:33:12.452464 containerd[1621]: time="2025-12-12T17:33:12.452434654Z" level=info msg="Container 9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:12.461458 containerd[1621]: time="2025-12-12T17:33:12.461419219Z" level=info msg="CreateContainer within sandbox \"65537d37bfafbd17065fd7d21567172f726f1bc8ccc006cbf83f6a171bc8a98a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500\"" Dec 12 17:33:12.462151 containerd[1621]: time="2025-12-12T17:33:12.462116023Z" level=info msg="StartContainer for \"9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500\"" Dec 12 17:33:12.463579 containerd[1621]: time="2025-12-12T17:33:12.463552470Z" level=info msg="connecting to shim 9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500" address="unix:///run/containerd/s/5dd013b21884cf73c2da8c8e84d6e41863c65c42387ea0d74a2a45997484b1e1" protocol=ttrpc version=3 Dec 12 17:33:12.480075 containerd[1621]: time="2025-12-12T17:33:12.480036114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-54kkt,Uid:829cd46b-689a-4cd7-b99f-90e6da1e4b08,Namespace:tigera-operator,Attempt:0,}" Dec 12 17:33:12.485156 systemd[1]: Started cri-containerd-9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500.scope - libcontainer container 9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500. Dec 12 17:33:12.501731 containerd[1621]: time="2025-12-12T17:33:12.501662504Z" level=info msg="connecting to shim e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641" address="unix:///run/containerd/s/6a00247a1588c3021098e7167c16463db4f8da5e24074137242a7517d3460dba" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:12.527121 systemd[1]: Started cri-containerd-e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641.scope - libcontainer container e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641. Dec 12 17:33:12.552250 containerd[1621]: time="2025-12-12T17:33:12.552129600Z" level=info msg="StartContainer for \"9e43665ade8af46e679eb59cc11585697725b13cda0107a1b9d4d9fb57978500\" returns successfully" Dec 12 17:33:12.567783 containerd[1621]: time="2025-12-12T17:33:12.567743159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-54kkt,Uid:829cd46b-689a-4cd7-b99f-90e6da1e4b08,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641\"" Dec 12 17:33:12.570262 containerd[1621]: time="2025-12-12T17:33:12.570195052Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 17:33:13.256367 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4021063619.mount: Deactivated successfully. 
Dec 12 17:33:14.186245 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2319153411.mount: Deactivated successfully. Dec 12 17:33:14.353715 kubelet[2861]: I1212 17:33:14.353503 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-l8dff" podStartSLOduration=2.35348483 podStartE2EDuration="2.35348483s" podCreationTimestamp="2025-12-12 17:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:33:13.456711035 +0000 UTC m=+7.120265850" watchObservedRunningTime="2025-12-12 17:33:14.35348483 +0000 UTC m=+8.017039685" Dec 12 17:33:14.478708 containerd[1621]: time="2025-12-12T17:33:14.478585826Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:14.479846 containerd[1621]: time="2025-12-12T17:33:14.479774272Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 12 17:33:14.480942 containerd[1621]: time="2025-12-12T17:33:14.480916918Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:14.483289 containerd[1621]: time="2025-12-12T17:33:14.483242930Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:14.483927 containerd[1621]: time="2025-12-12T17:33:14.483898613Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.913669441s" Dec 12 17:33:14.483994 containerd[1621]: time="2025-12-12T17:33:14.483932333Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 12 17:33:14.488106 containerd[1621]: time="2025-12-12T17:33:14.488073394Z" level=info msg="CreateContainer within sandbox \"e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 17:33:14.497738 containerd[1621]: time="2025-12-12T17:33:14.497690163Z" level=info msg="Container bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:14.504482 containerd[1621]: time="2025-12-12T17:33:14.504422797Z" level=info msg="CreateContainer within sandbox \"e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\"" Dec 12 17:33:14.505033 containerd[1621]: time="2025-12-12T17:33:14.504994680Z" level=info msg="StartContainer for \"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\"" Dec 12 17:33:14.505822 containerd[1621]: time="2025-12-12T17:33:14.505798244Z" level=info msg="connecting to shim bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901" 
address="unix:///run/containerd/s/6a00247a1588c3021098e7167c16463db4f8da5e24074137242a7517d3460dba" protocol=ttrpc version=3 Dec 12 17:33:14.532303 systemd[1]: Started cri-containerd-bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901.scope - libcontainer container bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901. Dec 12 17:33:14.556237 containerd[1621]: time="2025-12-12T17:33:14.556201780Z" level=info msg="StartContainer for \"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\" returns successfully" Dec 12 17:33:15.459860 kubelet[2861]: I1212 17:33:15.459790 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-54kkt" podStartSLOduration=1.544907683 podStartE2EDuration="3.45977553s" podCreationTimestamp="2025-12-12 17:33:12 +0000 UTC" firstStartedPulling="2025-12-12 17:33:12.569685769 +0000 UTC m=+6.233240624" lastFinishedPulling="2025-12-12 17:33:14.484553656 +0000 UTC m=+8.148108471" observedRunningTime="2025-12-12 17:33:15.459119687 +0000 UTC m=+9.122674542" watchObservedRunningTime="2025-12-12 17:33:15.45977553 +0000 UTC m=+9.123330385" Dec 12 17:33:19.985748 sudo[1893]: pam_unix(sudo:session): session closed for user root Dec 12 17:33:20.142411 sshd[1892]: Connection closed by 147.75.109.163 port 53438 Dec 12 17:33:20.145139 sshd-session[1889]: pam_unix(sshd:session): session closed for user core Dec 12 17:33:20.149981 systemd[1]: sshd@8-10.0.8.78:22-147.75.109.163:53438.service: Deactivated successfully. Dec 12 17:33:20.155607 systemd[1]: session-9.scope: Deactivated successfully. Dec 12 17:33:20.156382 systemd[1]: session-9.scope: Consumed 7.073s CPU time, 224.4M memory peak. Dec 12 17:33:20.159690 systemd-logind[1599]: Session 9 logged out. Waiting for processes to exit. Dec 12 17:33:20.163359 systemd-logind[1599]: Removed session 9. Dec 12 17:33:28.117133 systemd[1]: Created slice kubepods-besteffort-podfcfc58f7_4633_4a41_9460_9c7841d6f5e6.slice - libcontainer container kubepods-besteffort-podfcfc58f7_4633_4a41_9460_9c7841d6f5e6.slice. Dec 12 17:33:28.138227 kubelet[2861]: I1212 17:33:28.138186 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcfc58f7-4633-4a41-9460-9c7841d6f5e6-tigera-ca-bundle\") pod \"calico-typha-9698f68c5-j72nr\" (UID: \"fcfc58f7-4633-4a41-9460-9c7841d6f5e6\") " pod="calico-system/calico-typha-9698f68c5-j72nr" Dec 12 17:33:28.138572 kubelet[2861]: I1212 17:33:28.138244 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/fcfc58f7-4633-4a41-9460-9c7841d6f5e6-typha-certs\") pod \"calico-typha-9698f68c5-j72nr\" (UID: \"fcfc58f7-4633-4a41-9460-9c7841d6f5e6\") " pod="calico-system/calico-typha-9698f68c5-j72nr" Dec 12 17:33:28.138572 kubelet[2861]: I1212 17:33:28.138265 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5c89\" (UniqueName: \"kubernetes.io/projected/fcfc58f7-4633-4a41-9460-9c7841d6f5e6-kube-api-access-k5c89\") pod \"calico-typha-9698f68c5-j72nr\" (UID: \"fcfc58f7-4633-4a41-9460-9c7841d6f5e6\") " pod="calico-system/calico-typha-9698f68c5-j72nr" Dec 12 17:33:28.343459 systemd[1]: Created slice kubepods-besteffort-pod028d00ee_97d9_451e_afea_22bc3d35e81d.slice - libcontainer container kubepods-besteffort-pod028d00ee_97d9_451e_afea_22bc3d35e81d.slice. 
Dec 12 17:33:28.421400 containerd[1621]: time="2025-12-12T17:33:28.421302972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9698f68c5-j72nr,Uid:fcfc58f7-4633-4a41-9460-9c7841d6f5e6,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:28.439850 kubelet[2861]: I1212 17:33:28.439774 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-flexvol-driver-host\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.439992 kubelet[2861]: I1212 17:33:28.439876 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-policysync\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.439992 kubelet[2861]: I1212 17:33:28.439901 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-cni-net-dir\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.439992 kubelet[2861]: I1212 17:33:28.439918 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-cni-bin-dir\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.439992 kubelet[2861]: I1212 17:33:28.439936 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-cni-log-dir\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.439992 kubelet[2861]: I1212 17:33:28.439978 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/028d00ee-97d9-451e-afea-22bc3d35e81d-tigera-ca-bundle\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440101 kubelet[2861]: I1212 17:33:28.439996 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-var-run-calico\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440101 kubelet[2861]: I1212 17:33:28.440017 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/028d00ee-97d9-451e-afea-22bc3d35e81d-node-certs\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440101 kubelet[2861]: I1212 17:33:28.440030 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-xtables-lock\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440101 kubelet[2861]: I1212 17:33:28.440046 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-var-lib-calico\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440101 kubelet[2861]: I1212 17:33:28.440063 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4h6\" (UniqueName: \"kubernetes.io/projected/028d00ee-97d9-451e-afea-22bc3d35e81d-kube-api-access-vq4h6\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440240 kubelet[2861]: I1212 17:33:28.440077 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/028d00ee-97d9-451e-afea-22bc3d35e81d-lib-modules\") pod \"calico-node-kdvt8\" (UID: \"028d00ee-97d9-451e-afea-22bc3d35e81d\") " pod="calico-system/calico-node-kdvt8" Dec 12 17:33:28.440559 containerd[1621]: time="2025-12-12T17:33:28.440495069Z" level=info msg="connecting to shim 777c9179099dd70bf1cebe112cff6b4008c45b90ce6c75e09b2d833a391c74c1" address="unix:///run/containerd/s/5dc3277d578ce49a4444a71415db7a8e64aec263d1ee38a43fdad9bef3a41637" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:28.463131 systemd[1]: Started cri-containerd-777c9179099dd70bf1cebe112cff6b4008c45b90ce6c75e09b2d833a391c74c1.scope - libcontainer container 777c9179099dd70bf1cebe112cff6b4008c45b90ce6c75e09b2d833a391c74c1. Dec 12 17:33:28.495356 containerd[1621]: time="2025-12-12T17:33:28.495312828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9698f68c5-j72nr,Uid:fcfc58f7-4633-4a41-9460-9c7841d6f5e6,Namespace:calico-system,Attempt:0,} returns sandbox id \"777c9179099dd70bf1cebe112cff6b4008c45b90ce6c75e09b2d833a391c74c1\"" Dec 12 17:33:28.499189 containerd[1621]: time="2025-12-12T17:33:28.499157767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 12 17:33:28.534172 kubelet[2861]: E1212 17:33:28.533509 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:28.548708 kubelet[2861]: E1212 17:33:28.548254 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.548708 kubelet[2861]: W1212 17:33:28.548286 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.548708 kubelet[2861]: E1212 17:33:28.548310 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.561430 kubelet[2861]: E1212 17:33:28.561357 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.561430 kubelet[2861]: W1212 17:33:28.561378 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.561430 kubelet[2861]: E1212 17:33:28.561397 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.626531 kubelet[2861]: E1212 17:33:28.626414 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.626531 kubelet[2861]: W1212 17:33:28.626432 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.626531 kubelet[2861]: E1212 17:33:28.626450 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.626772 kubelet[2861]: E1212 17:33:28.626759 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.626861 kubelet[2861]: W1212 17:33:28.626818 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.627018 kubelet[2861]: E1212 17:33:28.626911 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.627638 kubelet[2861]: E1212 17:33:28.627443 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.627638 kubelet[2861]: W1212 17:33:28.627464 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.627638 kubelet[2861]: E1212 17:33:28.627475 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.627853 kubelet[2861]: E1212 17:33:28.627838 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.627922 kubelet[2861]: W1212 17:33:28.627909 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.628465 kubelet[2861]: E1212 17:33:28.628349 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.628582 kubelet[2861]: E1212 17:33:28.628570 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.628795 kubelet[2861]: W1212 17:33:28.628695 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.628795 kubelet[2861]: E1212 17:33:28.628716 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.628964 kubelet[2861]: E1212 17:33:28.628918 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.628964 kubelet[2861]: W1212 17:33:28.628933 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.629142 kubelet[2861]: E1212 17:33:28.628942 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.629323 kubelet[2861]: E1212 17:33:28.629310 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.629495 kubelet[2861]: W1212 17:33:28.629395 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.629495 kubelet[2861]: E1212 17:33:28.629411 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.629713 kubelet[2861]: E1212 17:33:28.629673 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.629713 kubelet[2861]: W1212 17:33:28.629684 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.629713 kubelet[2861]: E1212 17:33:28.629694 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.630339 kubelet[2861]: E1212 17:33:28.630201 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.630339 kubelet[2861]: W1212 17:33:28.630216 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.630339 kubelet[2861]: E1212 17:33:28.630227 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.630502 kubelet[2861]: E1212 17:33:28.630489 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.630555 kubelet[2861]: W1212 17:33:28.630545 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.630684 kubelet[2861]: E1212 17:33:28.630599 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.630781 kubelet[2861]: E1212 17:33:28.630769 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.630833 kubelet[2861]: W1212 17:33:28.630823 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.631021 kubelet[2861]: E1212 17:33:28.630882 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.631133 kubelet[2861]: E1212 17:33:28.631120 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.631189 kubelet[2861]: W1212 17:33:28.631179 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.631238 kubelet[2861]: E1212 17:33:28.631229 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.631502 kubelet[2861]: E1212 17:33:28.631415 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.631502 kubelet[2861]: W1212 17:33:28.631426 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.631502 kubelet[2861]: E1212 17:33:28.631435 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.631659 kubelet[2861]: E1212 17:33:28.631647 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.631711 kubelet[2861]: W1212 17:33:28.631701 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.631760 kubelet[2861]: E1212 17:33:28.631750 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.632033 kubelet[2861]: E1212 17:33:28.631924 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.632033 kubelet[2861]: W1212 17:33:28.631935 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.632033 kubelet[2861]: E1212 17:33:28.631961 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.632189 kubelet[2861]: E1212 17:33:28.632176 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.632242 kubelet[2861]: W1212 17:33:28.632232 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.632294 kubelet[2861]: E1212 17:33:28.632284 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.632800 kubelet[2861]: E1212 17:33:28.632782 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.632881 kubelet[2861]: W1212 17:33:28.632867 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.633103 kubelet[2861]: E1212 17:33:28.632972 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.633245 kubelet[2861]: E1212 17:33:28.633229 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.633306 kubelet[2861]: W1212 17:33:28.633295 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.633440 kubelet[2861]: E1212 17:33:28.633350 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.633802 kubelet[2861]: E1212 17:33:28.633785 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.633872 kubelet[2861]: W1212 17:33:28.633860 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.633933 kubelet[2861]: E1212 17:33:28.633922 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.634267 kubelet[2861]: E1212 17:33:28.634177 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.634267 kubelet[2861]: W1212 17:33:28.634188 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.634267 kubelet[2861]: E1212 17:33:28.634197 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.641472 kubelet[2861]: E1212 17:33:28.641452 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.641472 kubelet[2861]: W1212 17:33:28.641469 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.641612 kubelet[2861]: E1212 17:33:28.641484 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.641612 kubelet[2861]: I1212 17:33:28.641510 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d21fdd1b-5217-4220-a07c-5b154ce8fa0d-varrun\") pod \"csi-node-driver-j96bs\" (UID: \"d21fdd1b-5217-4220-a07c-5b154ce8fa0d\") " pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:28.641700 kubelet[2861]: E1212 17:33:28.641683 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.641700 kubelet[2861]: W1212 17:33:28.641692 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.641700 kubelet[2861]: E1212 17:33:28.641699 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.641903 kubelet[2861]: I1212 17:33:28.641718 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzf9v\" (UniqueName: \"kubernetes.io/projected/d21fdd1b-5217-4220-a07c-5b154ce8fa0d-kube-api-access-rzf9v\") pod \"csi-node-driver-j96bs\" (UID: \"d21fdd1b-5217-4220-a07c-5b154ce8fa0d\") " pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:28.642089 kubelet[2861]: E1212 17:33:28.642011 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.642089 kubelet[2861]: W1212 17:33:28.642033 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.642089 kubelet[2861]: E1212 17:33:28.642045 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.642315 kubelet[2861]: E1212 17:33:28.642303 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.642482 kubelet[2861]: W1212 17:33:28.642366 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.642482 kubelet[2861]: E1212 17:33:28.642381 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.642613 kubelet[2861]: E1212 17:33:28.642595 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.642748 kubelet[2861]: W1212 17:33:28.642657 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.642748 kubelet[2861]: E1212 17:33:28.642672 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.642748 kubelet[2861]: I1212 17:33:28.642699 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d21fdd1b-5217-4220-a07c-5b154ce8fa0d-socket-dir\") pod \"csi-node-driver-j96bs\" (UID: \"d21fdd1b-5217-4220-a07c-5b154ce8fa0d\") " pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:28.643044 kubelet[2861]: E1212 17:33:28.642963 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.643044 kubelet[2861]: W1212 17:33:28.642977 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.643044 kubelet[2861]: E1212 17:33:28.642986 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.643044 kubelet[2861]: I1212 17:33:28.643034 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d21fdd1b-5217-4220-a07c-5b154ce8fa0d-kubelet-dir\") pod \"csi-node-driver-j96bs\" (UID: \"d21fdd1b-5217-4220-a07c-5b154ce8fa0d\") " pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:28.643299 kubelet[2861]: E1212 17:33:28.643288 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.643425 kubelet[2861]: W1212 17:33:28.643352 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.643425 kubelet[2861]: E1212 17:33:28.643367 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.643618 kubelet[2861]: E1212 17:33:28.643607 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.643682 kubelet[2861]: W1212 17:33:28.643662 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.643740 kubelet[2861]: E1212 17:33:28.643728 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.644079 kubelet[2861]: E1212 17:33:28.643982 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.644079 kubelet[2861]: W1212 17:33:28.643994 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.644079 kubelet[2861]: E1212 17:33:28.644004 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.644079 kubelet[2861]: I1212 17:33:28.644027 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d21fdd1b-5217-4220-a07c-5b154ce8fa0d-registration-dir\") pod \"csi-node-driver-j96bs\" (UID: \"d21fdd1b-5217-4220-a07c-5b154ce8fa0d\") " pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:28.644365 kubelet[2861]: E1212 17:33:28.644352 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.644423 kubelet[2861]: W1212 17:33:28.644412 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.644474 kubelet[2861]: E1212 17:33:28.644465 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.644676 kubelet[2861]: E1212 17:33:28.644666 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.644771 kubelet[2861]: W1212 17:33:28.644724 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.644771 kubelet[2861]: E1212 17:33:28.644738 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:33:28.644985 kubelet[2861]: E1212 17:33:28.644962 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.644985 kubelet[2861]: W1212 17:33:28.644982 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.645052 kubelet[2861]: E1212 17:33:28.644995 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.645170 kubelet[2861]: E1212 17:33:28.645157 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.645170 kubelet[2861]: W1212 17:33:28.645167 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.645241 kubelet[2861]: E1212 17:33:28.645176 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.645395 kubelet[2861]: E1212 17:33:28.645381 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.645395 kubelet[2861]: W1212 17:33:28.645392 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.645453 kubelet[2861]: E1212 17:33:28.645401 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.645754 kubelet[2861]: E1212 17:33:28.645738 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.645754 kubelet[2861]: W1212 17:33:28.645753 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.645817 kubelet[2861]: E1212 17:33:28.645764 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.647489 containerd[1621]: time="2025-12-12T17:33:28.647452801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kdvt8,Uid:028d00ee-97d9-451e-afea-22bc3d35e81d,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:28.666464 containerd[1621]: time="2025-12-12T17:33:28.666373097Z" level=info msg="connecting to shim a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076" address="unix:///run/containerd/s/6684aea924c725554496e05ebde7aa3126a1d6a403966fb47b205a885f56cadb" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:28.690149 systemd[1]: Started cri-containerd-a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076.scope - libcontainer container a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076. 
Dec 12 17:33:28.715689 containerd[1621]: time="2025-12-12T17:33:28.715643827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kdvt8,Uid:028d00ee-97d9-451e-afea-22bc3d35e81d,Namespace:calico-system,Attempt:0,} returns sandbox id \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\"" Dec 12 17:33:28.746355 kubelet[2861]: E1212 17:33:28.746309 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.746355 kubelet[2861]: W1212 17:33:28.746332 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.746355 kubelet[2861]: E1212 17:33:28.746349 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:28.761466 kubelet[2861]: E1212 17:33:28.761424 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:28.761466 kubelet[2861]: W1212 17:33:28.761446 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:28.761466 kubelet[2861]: E1212 17:33:28.761463 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:33:29.921532 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3150070666.mount: Deactivated successfully. Dec 12 17:33:30.419545 kubelet[2861]: E1212 17:33:30.419427 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:32.418830 kubelet[2861]: E1212 17:33:32.418784 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:32.566220 containerd[1621]: time="2025-12-12T17:33:32.566120387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:32.567527 containerd[1621]: time="2025-12-12T17:33:32.567502034Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 12 17:33:32.568514 containerd[1621]: time="2025-12-12T17:33:32.568452999Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:32.570643 containerd[1621]: time="2025-12-12T17:33:32.570590169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
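The two "Error syncing pod" entries for csi-node-driver-j96bs above are kubelet's runtime-readiness gate at work: a pod that needs the cluster network is skipped and retried while the container runtime reports NetworkReady=false. A minimal Go sketch of that gate, assuming simplified stand-in types rather than the real CRI API definitions:

    package main

    import "fmt"

    // runtimeCondition is a simplified stand-in for the CRI RuntimeCondition
    // message (the real type lives in the cri-api Go module).
    type runtimeCondition struct {
        Type    string // e.g. "NetworkReady"
        Status  bool
        Reason  string // e.g. "NetworkPluginNotReady"
        Message string
    }

    // networkReady reproduces the shape of the check implied by the log:
    // until NetworkReady flips to true, syncing the pod fails and is retried.
    func networkReady(conds []runtimeCondition) error {
        for _, c := range conds {
            if c.Type == "NetworkReady" && !c.Status {
                return fmt.Errorf("network is not ready: container runtime network not ready: NetworkReady=false reason:%s message:%s",
                    c.Reason, c.Message)
            }
        }
        return nil
    }

    func main() {
        err := networkReady([]runtimeCondition{{
            Type:    "NetworkReady",
            Status:  false,
            Reason:  "NetworkPluginNotReady",
            Message: "Network plugin returns error: cni plugin not initialized",
        }})
        fmt.Println(err) // mirrors the err= string logged by pod_workers.go
    }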
\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 4.072024606s" Dec 12 17:33:32.571251 containerd[1621]: time="2025-12-12T17:33:32.571247213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 12 17:33:32.572271 containerd[1621]: time="2025-12-12T17:33:32.572243618Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 12 17:33:32.582166 containerd[1621]: time="2025-12-12T17:33:32.582128268Z" level=info msg="CreateContainer within sandbox \"777c9179099dd70bf1cebe112cff6b4008c45b90ce6c75e09b2d833a391c74c1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 12 17:33:32.590682 containerd[1621]: time="2025-12-12T17:33:32.589563306Z" level=info msg="Container 3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:32.599705 containerd[1621]: time="2025-12-12T17:33:32.599663517Z" level=info msg="CreateContainer within sandbox \"777c9179099dd70bf1cebe112cff6b4008c45b90ce6c75e09b2d833a391c74c1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620\"" Dec 12 17:33:32.600408 containerd[1621]: time="2025-12-12T17:33:32.600375601Z" level=info msg="StartContainer for \"3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620\"" Dec 12 17:33:32.601583 containerd[1621]: time="2025-12-12T17:33:32.601504687Z" level=info msg="connecting to shim 3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620" address="unix:///run/containerd/s/5dc3277d578ce49a4444a71415db7a8e64aec263d1ee38a43fdad9bef3a41637" protocol=ttrpc version=3 Dec 12 17:33:32.623108 systemd[1]: Started cri-containerd-3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620.scope - libcontainer container 3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620. Dec 12 17:33:32.656921 containerd[1621]: time="2025-12-12T17:33:32.656873008Z" level=info msg="StartContainer for \"3a1c1f4d3bd9b1b044ff0310877bf00baea78c26a816e8ce59079c4ecd269620\" returns successfully" Dec 12 17:33:33.513110 kubelet[2861]: I1212 17:33:33.512997 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9698f68c5-j72nr" podStartSLOduration=1.439545345 podStartE2EDuration="5.512978877s" podCreationTimestamp="2025-12-12 17:33:28 +0000 UTC" firstStartedPulling="2025-12-12 17:33:28.498712845 +0000 UTC m=+22.162267700" lastFinishedPulling="2025-12-12 17:33:32.572146377 +0000 UTC m=+26.235701232" observedRunningTime="2025-12-12 17:33:33.500886255 +0000 UTC m=+27.164441110" watchObservedRunningTime="2025-12-12 17:33:33.512978877 +0000 UTC m=+27.176533732" Dec 12 17:33:33.563619 kubelet[2861]: E1212 17:33:33.563538 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:33.563619 kubelet[2861]: W1212 17:33:33.563566 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:33.563619 kubelet[2861]: E1212 17:33:33.563588 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Dec 12 17:33:33.563619 kubelet[2861]: E1212 17:33:33.563538 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:33.563619 kubelet[2861]: W1212 17:33:33.563566 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:33.563619 kubelet[2861]: E1212 17:33:33.563588 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:33:33.588996 kubelet[2861]: E1212 17:33:33.588841 2861 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:33:33.588996 kubelet[2861]: W1212 17:33:33.588851 2861 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:33:33.588996 kubelet[2861]: E1212 17:33:33.588878 2861 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:33:33.801849 containerd[1621]: time="2025-12-12T17:33:33.801728543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:33.803373 containerd[1621]: time="2025-12-12T17:33:33.803339752Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 12 17:33:33.804314 containerd[1621]: time="2025-12-12T17:33:33.804083395Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:33.806084 containerd[1621]: time="2025-12-12T17:33:33.806050405Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:33.806726 containerd[1621]: time="2025-12-12T17:33:33.806697649Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.234423231s" Dec 12 17:33:33.806808 containerd[1621]: time="2025-12-12T17:33:33.806794289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 12 17:33:33.811009 containerd[1621]: time="2025-12-12T17:33:33.810975950Z" level=info msg="CreateContainer within sandbox \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 17:33:33.823552 containerd[1621]: time="2025-12-12T17:33:33.823485654Z" level=info msg="Container d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860:
CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:33.832970 containerd[1621]: time="2025-12-12T17:33:33.831969097Z" level=info msg="CreateContainer within sandbox \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860\"" Dec 12 17:33:33.833487 containerd[1621]: time="2025-12-12T17:33:33.833445025Z" level=info msg="StartContainer for \"d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860\"" Dec 12 17:33:33.835669 containerd[1621]: time="2025-12-12T17:33:33.835625756Z" level=info msg="connecting to shim d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860" address="unix:///run/containerd/s/6684aea924c725554496e05ebde7aa3126a1d6a403966fb47b205a885f56cadb" protocol=ttrpc version=3 Dec 12 17:33:33.861131 systemd[1]: Started cri-containerd-d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860.scope - libcontainer container d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860. Dec 12 17:33:33.937538 containerd[1621]: time="2025-12-12T17:33:33.937497393Z" level=info msg="StartContainer for \"d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860\" returns successfully" Dec 12 17:33:33.951124 systemd[1]: cri-containerd-d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860.scope: Deactivated successfully. Dec 12 17:33:33.953295 containerd[1621]: time="2025-12-12T17:33:33.953190793Z" level=info msg="received container exit event container_id:\"d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860\" id:\"d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860\" pid:3552 exited_at:{seconds:1765560813 nanos:952722630}" Dec 12 17:33:33.972987 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d2fd36a891c702d8ba70a3b8bf1ada09e329cadc970255b2f6d23494829fb860-rootfs.mount: Deactivated successfully. 
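The recurring driver-call.go/plugins.go triple in the bursts above is one failure reported from three layers: kubelet probes each FlexVolume plugin directory by executing the driver binary with the argument init and unmarshalling its stdout as JSON; the uds binary does not exist yet, so the exec fails ("executable file not found in $PATH"), stdout is empty, and encoding/json reports "unexpected end of JSON input". A minimal sketch of that probe path, not kubelet's actual code; the driverStatus fields are assumptions for illustration:

    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // driverStatus approximates the shape of a FlexVolume driver's JSON reply.
    type driverStatus struct {
        Status       string          `json:"status"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func probeDriver(path string) (*driverStatus, error) {
        // If the binary is absent, out is empty and execErr is non-nil
        // (kubelet's exec wrapper reports this as "executable file not
        // found in $PATH").
        out, execErr := exec.Command(path, "init").CombinedOutput()
        var st driverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            // With empty output this is exactly "unexpected end of JSON input".
            return nil, fmt.Errorf("failed to unmarshal output for command: init, output: %q, json error: %v, exec error: %v",
                out, err, execErr)
        }
        return &st, nil
    }

    func main() {
        if _, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"); err != nil {
            fmt.Println(err)
        }
    }

The flexvol-driver container that just exited comes from the pod2daemon-flexvol image, which installs exactly this uds driver, so the probe errors should stop recurring once calico-node's init steps complete.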
Dec 12 17:33:34.419143 kubelet[2861]: E1212 17:33:34.419095 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:34.494813 containerd[1621]: time="2025-12-12T17:33:34.494752664Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 17:33:36.420007 kubelet[2861]: E1212 17:33:36.419867 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:36.597968 containerd[1621]: time="2025-12-12T17:33:36.597650746Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:36.599970 containerd[1621]: time="2025-12-12T17:33:36.598590631Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 12 17:33:36.601701 containerd[1621]: time="2025-12-12T17:33:36.601653007Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:36.605937 containerd[1621]: time="2025-12-12T17:33:36.605873908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:36.606324 containerd[1621]: time="2025-12-12T17:33:36.606290750Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.111352765s" Dec 12 17:33:36.606324 containerd[1621]: time="2025-12-12T17:33:36.606318150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 12 17:33:36.613618 containerd[1621]: time="2025-12-12T17:33:36.613563547Z" level=info msg="CreateContainer within sandbox \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 17:33:36.628744 containerd[1621]: time="2025-12-12T17:33:36.626903095Z" level=info msg="Container 381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:36.637357 containerd[1621]: time="2025-12-12T17:33:36.637290748Z" level=info msg="CreateContainer within sandbox \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2\"" Dec 12 17:33:36.637981 containerd[1621]: time="2025-12-12T17:33:36.637759990Z" level=info msg="StartContainer for \"381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2\"" Dec 12 17:33:36.639452 
containerd[1621]: time="2025-12-12T17:33:36.639416598Z" level=info msg="connecting to shim 381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2" address="unix:///run/containerd/s/6684aea924c725554496e05ebde7aa3126a1d6a403966fb47b205a885f56cadb" protocol=ttrpc version=3 Dec 12 17:33:36.657131 systemd[1]: Started cri-containerd-381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2.scope - libcontainer container 381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2. Dec 12 17:33:36.737624 containerd[1621]: time="2025-12-12T17:33:36.737556457Z" level=info msg="StartContainer for \"381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2\" returns successfully" Dec 12 17:33:37.129817 containerd[1621]: time="2025-12-12T17:33:37.129607129Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:33:37.131344 systemd[1]: cri-containerd-381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2.scope: Deactivated successfully. Dec 12 17:33:37.131776 systemd[1]: cri-containerd-381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2.scope: Consumed 492ms CPU time, 189.3M memory peak, 165.9M written to disk. Dec 12 17:33:37.132846 containerd[1621]: time="2025-12-12T17:33:37.132809785Z" level=info msg="received container exit event container_id:\"381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2\" id:\"381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2\" pid:3614 exited_at:{seconds:1765560817 nanos:132581104}" Dec 12 17:33:37.152793 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-381b8a49a46b4f60b2e1054c506b656f66e9128aafea0a41b2e96adbd6526ff2-rootfs.mount: Deactivated successfully. Dec 12 17:33:37.173072 kubelet[2861]: I1212 17:33:37.173014 2861 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 12 17:33:37.222466 systemd[1]: Created slice kubepods-burstable-pod5e5003fc_9302_4306_b397_652f2bfea588.slice - libcontainer container kubepods-burstable-pod5e5003fc_9302_4306_b397_652f2bfea588.slice. Dec 12 17:33:37.232028 systemd[1]: Created slice kubepods-burstable-pod44903b8b_15b2_4980_b478_2c324f4d054a.slice - libcontainer container kubepods-burstable-pod44903b8b_15b2_4980_b478_2c324f4d054a.slice. Dec 12 17:33:37.241388 systemd[1]: Created slice kubepods-besteffort-pod48d6a2b5_a80f_4de9_91aa_e8709a4fec3b.slice - libcontainer container kubepods-besteffort-pod48d6a2b5_a80f_4de9_91aa_e8709a4fec3b.slice. Dec 12 17:33:37.249905 systemd[1]: Created slice kubepods-besteffort-podf0f2e696_f5c9_4490_a447_8711a361f9d0.slice - libcontainer container kubepods-besteffort-podf0f2e696_f5c9_4490_a447_8711a361f9d0.slice. Dec 12 17:33:37.256603 systemd[1]: Created slice kubepods-besteffort-podff7ea540_d740_4fa7_9163_b17e2194ee80.slice - libcontainer container kubepods-besteffort-podff7ea540_d740_4fa7_9163_b17e2194ee80.slice. Dec 12 17:33:37.263275 systemd[1]: Created slice kubepods-besteffort-pod0577e60c_0557_46e6_8e63_d6ada201b69d.slice - libcontainer container kubepods-besteffort-pod0577e60c_0557_46e6_8e63_d6ada201b69d.slice. Dec 12 17:33:37.268649 systemd[1]: Created slice kubepods-besteffort-pod764a9b76_22f0_48f8_92a8_02d3a1323d4b.slice - libcontainer container kubepods-besteffort-pod764a9b76_22f0_48f8_92a8_02d3a1323d4b.slice. 
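The "failed to reload cni configuration" entry above reflects ordering, not a new fault: install-cni writes /etc/cni/net.d/calico-kubeconfig first, the write event triggers a reload, and the reload still finds no network config file, so the plugin stays uninitialized until the actual CNI config lands. A rough sketch of that readiness condition, assuming containerd accepts *.conf, *.conflist, and *.json config files; this is not containerd's actual implementation:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
    )

    // cniConfigPresent reports whether any loadable CNI network config
    // exists in the given directory.
    func cniConfigPresent(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            ext := strings.ToLower(filepath.Ext(e.Name()))
            // calico-kubeconfig matches none of these extensions, so its
            // write event triggers a reload that still finds no config.
            if ext == ".conf" || ext == ".conflist" || ext == ".json" {
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ok, err := cniConfigPresent("/etc/cni/net.d")
        fmt.Println(ok, err)
    }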
Dec 12 17:33:37.313007 kubelet[2861]: I1212 17:33:37.312934 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r954\" (UniqueName: \"kubernetes.io/projected/0577e60c-0557-46e6-8e63-d6ada201b69d-kube-api-access-8r954\") pod \"whisker-76bf9cd594-s2nkq\" (UID: \"0577e60c-0557-46e6-8e63-d6ada201b69d\") " pod="calico-system/whisker-76bf9cd594-s2nkq" Dec 12 17:33:37.313137 kubelet[2861]: I1212 17:33:37.313084 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7ea540-d740-4fa7-9163-b17e2194ee80-tigera-ca-bundle\") pod \"calico-kube-controllers-698c665495-s6zxl\" (UID: \"ff7ea540-d740-4fa7-9163-b17e2194ee80\") " pod="calico-system/calico-kube-controllers-698c665495-s6zxl" Dec 12 17:33:37.313164 kubelet[2861]: I1212 17:33:37.313135 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/764a9b76-22f0-48f8-92a8-02d3a1323d4b-goldmane-key-pair\") pod \"goldmane-666569f655-9vgpb\" (UID: \"764a9b76-22f0-48f8-92a8-02d3a1323d4b\") " pod="calico-system/goldmane-666569f655-9vgpb" Dec 12 17:33:37.313164 kubelet[2861]: I1212 17:33:37.313156 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59pph\" (UniqueName: \"kubernetes.io/projected/764a9b76-22f0-48f8-92a8-02d3a1323d4b-kube-api-access-59pph\") pod \"goldmane-666569f655-9vgpb\" (UID: \"764a9b76-22f0-48f8-92a8-02d3a1323d4b\") " pod="calico-system/goldmane-666569f655-9vgpb" Dec 12 17:33:37.313219 kubelet[2861]: I1212 17:33:37.313174 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/764a9b76-22f0-48f8-92a8-02d3a1323d4b-goldmane-ca-bundle\") pod \"goldmane-666569f655-9vgpb\" (UID: \"764a9b76-22f0-48f8-92a8-02d3a1323d4b\") " pod="calico-system/goldmane-666569f655-9vgpb" Dec 12 17:33:37.313264 kubelet[2861]: I1212 17:33:37.313222 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-backend-key-pair\") pod \"whisker-76bf9cd594-s2nkq\" (UID: \"0577e60c-0557-46e6-8e63-d6ada201b69d\") " pod="calico-system/whisker-76bf9cd594-s2nkq" Dec 12 17:33:37.313304 kubelet[2861]: I1212 17:33:37.313289 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szn5p\" (UniqueName: \"kubernetes.io/projected/ff7ea540-d740-4fa7-9163-b17e2194ee80-kube-api-access-szn5p\") pod \"calico-kube-controllers-698c665495-s6zxl\" (UID: \"ff7ea540-d740-4fa7-9163-b17e2194ee80\") " pod="calico-system/calico-kube-controllers-698c665495-s6zxl" Dec 12 17:33:37.313333 kubelet[2861]: I1212 17:33:37.313314 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f0f2e696-f5c9-4490-a447-8711a361f9d0-calico-apiserver-certs\") pod \"calico-apiserver-6d5697548f-vqbgd\" (UID: \"f0f2e696-f5c9-4490-a447-8711a361f9d0\") " pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" Dec 12 17:33:37.313356 kubelet[2861]: I1212 17:33:37.313330 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/764a9b76-22f0-48f8-92a8-02d3a1323d4b-config\") pod \"goldmane-666569f655-9vgpb\" (UID: \"764a9b76-22f0-48f8-92a8-02d3a1323d4b\") " pod="calico-system/goldmane-666569f655-9vgpb" Dec 12 17:33:37.313356 kubelet[2861]: I1212 17:33:37.313349 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44903b8b-15b2-4980-b478-2c324f4d054a-config-volume\") pod \"coredns-674b8bbfcf-ks2sw\" (UID: \"44903b8b-15b2-4980-b478-2c324f4d054a\") " pod="kube-system/coredns-674b8bbfcf-ks2sw" Dec 12 17:33:37.313416 kubelet[2861]: I1212 17:33:37.313399 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh4dq\" (UniqueName: \"kubernetes.io/projected/48d6a2b5-a80f-4de9-91aa-e8709a4fec3b-kube-api-access-vh4dq\") pod \"calico-apiserver-6d5697548f-q7brm\" (UID: \"48d6a2b5-a80f-4de9-91aa-e8709a4fec3b\") " pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" Dec 12 17:33:37.313454 kubelet[2861]: I1212 17:33:37.313433 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvbx\" (UniqueName: \"kubernetes.io/projected/f0f2e696-f5c9-4490-a447-8711a361f9d0-kube-api-access-khvbx\") pod \"calico-apiserver-6d5697548f-vqbgd\" (UID: \"f0f2e696-f5c9-4490-a447-8711a361f9d0\") " pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" Dec 12 17:33:37.313482 kubelet[2861]: I1212 17:33:37.313472 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e5003fc-9302-4306-b397-652f2bfea588-config-volume\") pod \"coredns-674b8bbfcf-5vf47\" (UID: \"5e5003fc-9302-4306-b397-652f2bfea588\") " pod="kube-system/coredns-674b8bbfcf-5vf47" Dec 12 17:33:37.313504 kubelet[2861]: I1212 17:33:37.313496 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px744\" (UniqueName: \"kubernetes.io/projected/5e5003fc-9302-4306-b397-652f2bfea588-kube-api-access-px744\") pod \"coredns-674b8bbfcf-5vf47\" (UID: \"5e5003fc-9302-4306-b397-652f2bfea588\") " pod="kube-system/coredns-674b8bbfcf-5vf47" Dec 12 17:33:37.313527 kubelet[2861]: I1212 17:33:37.313516 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kvfx\" (UniqueName: \"kubernetes.io/projected/44903b8b-15b2-4980-b478-2c324f4d054a-kube-api-access-8kvfx\") pod \"coredns-674b8bbfcf-ks2sw\" (UID: \"44903b8b-15b2-4980-b478-2c324f4d054a\") " pod="kube-system/coredns-674b8bbfcf-ks2sw" Dec 12 17:33:37.313549 kubelet[2861]: I1212 17:33:37.313541 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48d6a2b5-a80f-4de9-91aa-e8709a4fec3b-calico-apiserver-certs\") pod \"calico-apiserver-6d5697548f-q7brm\" (UID: \"48d6a2b5-a80f-4de9-91aa-e8709a4fec3b\") " pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" Dec 12 17:33:37.313571 kubelet[2861]: I1212 17:33:37.313556 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-ca-bundle\") pod \"whisker-76bf9cd594-s2nkq\" (UID: \"0577e60c-0557-46e6-8e63-d6ada201b69d\") " 
pod="calico-system/whisker-76bf9cd594-s2nkq" Dec 12 17:33:37.502245 containerd[1621]: time="2025-12-12T17:33:37.502199021Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 17:33:37.529882 containerd[1621]: time="2025-12-12T17:33:37.529839842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5vf47,Uid:5e5003fc-9302-4306-b397-652f2bfea588,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:37.538605 containerd[1621]: time="2025-12-12T17:33:37.538567446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ks2sw,Uid:44903b8b-15b2-4980-b478-2c324f4d054a,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:37.548593 containerd[1621]: time="2025-12-12T17:33:37.548541137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-q7brm,Uid:48d6a2b5-a80f-4de9-91aa-e8709a4fec3b,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:33:37.554402 containerd[1621]: time="2025-12-12T17:33:37.554256446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-vqbgd,Uid:f0f2e696-f5c9-4490-a447-8711a361f9d0,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:33:37.561515 containerd[1621]: time="2025-12-12T17:33:37.561420562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-698c665495-s6zxl,Uid:ff7ea540-d740-4fa7-9163-b17e2194ee80,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:37.569578 containerd[1621]: time="2025-12-12T17:33:37.569419483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76bf9cd594-s2nkq,Uid:0577e60c-0557-46e6-8e63-d6ada201b69d,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:37.573159 containerd[1621]: time="2025-12-12T17:33:37.573094861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9vgpb,Uid:764a9b76-22f0-48f8-92a8-02d3a1323d4b,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:37.647028 containerd[1621]: time="2025-12-12T17:33:37.646969357Z" level=error msg="Failed to destroy network for sandbox \"72b237871b161844f0fcd0d682457f6eac5ae086b83af7ce0bf0f207e5528dec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.648806 systemd[1]: run-netns-cni\x2d14144045\x2ddf45\x2d20c0\x2d4347\x2dfa5b038f8840.mount: Deactivated successfully. 
Dec 12 17:33:37.650833 containerd[1621]: time="2025-12-12T17:33:37.650657415Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ks2sw,Uid:44903b8b-15b2-4980-b478-2c324f4d054a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b237871b161844f0fcd0d682457f6eac5ae086b83af7ce0bf0f207e5528dec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.651135 kubelet[2861]: E1212 17:33:37.651036 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b237871b161844f0fcd0d682457f6eac5ae086b83af7ce0bf0f207e5528dec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.651373 kubelet[2861]: E1212 17:33:37.651194 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b237871b161844f0fcd0d682457f6eac5ae086b83af7ce0bf0f207e5528dec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-ks2sw" Dec 12 17:33:37.651373 kubelet[2861]: E1212 17:33:37.651221 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b237871b161844f0fcd0d682457f6eac5ae086b83af7ce0bf0f207e5528dec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-ks2sw" Dec 12 17:33:37.651981 kubelet[2861]: E1212 17:33:37.651297 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-ks2sw_kube-system(44903b8b-15b2-4980-b478-2c324f4d054a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-ks2sw_kube-system(44903b8b-15b2-4980-b478-2c324f4d054a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72b237871b161844f0fcd0d682457f6eac5ae086b83af7ce0bf0f207e5528dec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-ks2sw" podUID="44903b8b-15b2-4980-b478-2c324f4d054a" Dec 12 17:33:37.656482 containerd[1621]: time="2025-12-12T17:33:37.656426045Z" level=error msg="Failed to destroy network for sandbox \"aec55645a20bb0ade1c807d0de4767162ebbe1d940858ba140fd2cbb1ffbd42d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.658500 systemd[1]: run-netns-cni\x2d56ad022b\x2db5c3\x2d11b8\x2d59e5\x2dbe415c9739c6.mount: Deactivated successfully. 
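The error chain above (log.go → kuberuntime_sandbox.go → kuberuntime_manager.go → pod_workers.go) also surfaces on the API side as events on the affected pod. A hedged client-go sketch that reads those events back for the coredns pod named above; the involvedObject.name field selector is an assumption about how you would scope the query:

```go
// Lists events for the failing pod; namespace and pod name are taken from
// the log. Error handling is kept minimal for brevity.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	events, err := client.CoreV1().Events("kube-system").List(context.TODO(),
		metav1.ListOptions{FieldSelector: "involvedObject.name=coredns-674b8bbfcf-ks2sw"})
	if err != nil {
		panic(err)
	}
	for _, e := range events.Items {
		fmt.Printf("%s\t%s\t%s\n", e.Type, e.Reason, e.Message)
	}
}
```

`kubectl describe pod -n kube-system coredns-674b8bbfcf-ks2sw` would surface the same FailedCreatePodSandbox messages.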
Dec 12 17:33:37.661617 containerd[1621]: time="2025-12-12T17:33:37.661472110Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5vf47,Uid:5e5003fc-9302-4306-b397-652f2bfea588,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec55645a20bb0ade1c807d0de4767162ebbe1d940858ba140fd2cbb1ffbd42d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.661921 kubelet[2861]: E1212 17:33:37.661882 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec55645a20bb0ade1c807d0de4767162ebbe1d940858ba140fd2cbb1ffbd42d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.662073 kubelet[2861]: E1212 17:33:37.662053 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec55645a20bb0ade1c807d0de4767162ebbe1d940858ba140fd2cbb1ffbd42d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5vf47" Dec 12 17:33:37.662222 kubelet[2861]: E1212 17:33:37.662159 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec55645a20bb0ade1c807d0de4767162ebbe1d940858ba140fd2cbb1ffbd42d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5vf47" Dec 12 17:33:37.662622 kubelet[2861]: E1212 17:33:37.662330 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-5vf47_kube-system(5e5003fc-9302-4306-b397-652f2bfea588)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-5vf47_kube-system(5e5003fc-9302-4306-b397-652f2bfea588)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aec55645a20bb0ade1c807d0de4767162ebbe1d940858ba140fd2cbb1ffbd42d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-5vf47" podUID="5e5003fc-9302-4306-b397-652f2bfea588" Dec 12 17:33:37.669886 containerd[1621]: time="2025-12-12T17:33:37.669355310Z" level=error msg="Failed to destroy network for sandbox \"d821176ac461f9c1fccec638ee23b37f3da7d00c4e72987a043c71333df6a4d3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.671273 systemd[1]: run-netns-cni\x2d94f6b004\x2d51ac\x2debc0\x2d7387\x2dfd0d51f622e8.mount: Deactivated successfully. 
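The run-netns-cni\x2d... units that systemd deactivates after each failed sandbox are mount units for the per-sandbox network namespaces under /run/netns; in unit names, systemd uses bare dashes as path separators and escapes literal dashes in the path as \x2d. A small sketch that recovers the mount point from a unit name (it handles only the escapes that appear in this log; full systemd escaping covers arbitrary \xXX bytes):

```go
package main

import (
	"fmt"
	"strings"
)

// unitToPath recovers the mount point from a systemd mount-unit name for the
// simple cases above: "-" separates path components and literal dashes are
// escaped as \x2d. Simplified sketch, not systemd's full unescape logic.
func unitToPath(unit string) string {
	name := strings.TrimSuffix(unit, ".mount")
	const marker = "\x00"                           // placeholder for escaped dashes
	name = strings.ReplaceAll(name, `\x2d`, marker) // protect literal dashes
	name = strings.ReplaceAll(name, "-", "/")       // separators become slashes
	return "/" + strings.ReplaceAll(name, marker, "-")
}

func main() {
	fmt.Println(unitToPath(`run-netns-cni\x2d56ad022b\x2db5c3\x2d11b8\x2d59e5\x2dbe415c9739c6.mount`))
	// Output: /run/netns/cni-56ad022b-b5c3-11b8-59e5-be415c9739c6
}
```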
Dec 12 17:33:37.673481 containerd[1621]: time="2025-12-12T17:33:37.673436411Z" level=error msg="Failed to destroy network for sandbox \"ab7c3dd68f4de35a7db8810611863299630acb84df587de420212d38e061a6cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.675542 systemd[1]: run-netns-cni\x2de421e594\x2d91ea\x2d1aaf\x2dac39\x2d5090a522cb88.mount: Deactivated successfully. Dec 12 17:33:37.676149 containerd[1621]: time="2025-12-12T17:33:37.676099585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-698c665495-s6zxl,Uid:ff7ea540-d740-4fa7-9163-b17e2194ee80,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d821176ac461f9c1fccec638ee23b37f3da7d00c4e72987a043c71333df6a4d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.676671 kubelet[2861]: E1212 17:33:37.676623 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d821176ac461f9c1fccec638ee23b37f3da7d00c4e72987a043c71333df6a4d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.676748 kubelet[2861]: E1212 17:33:37.676689 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d821176ac461f9c1fccec638ee23b37f3da7d00c4e72987a043c71333df6a4d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" Dec 12 17:33:37.676748 kubelet[2861]: E1212 17:33:37.676712 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d821176ac461f9c1fccec638ee23b37f3da7d00c4e72987a043c71333df6a4d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" Dec 12 17:33:37.676797 kubelet[2861]: E1212 17:33:37.676754 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d821176ac461f9c1fccec638ee23b37f3da7d00c4e72987a043c71333df6a4d3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:33:37.677557 containerd[1621]: time="2025-12-12T17:33:37.677461472Z" level=error msg="Failed to destroy network 
for sandbox \"2b210853b91b4d05a37edb7f3c1933cb1880b34ff215e46f4949ce8576e9cfc0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.679184 containerd[1621]: time="2025-12-12T17:33:37.678910279Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-q7brm,Uid:48d6a2b5-a80f-4de9-91aa-e8709a4fec3b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab7c3dd68f4de35a7db8810611863299630acb84df587de420212d38e061a6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.679440 kubelet[2861]: E1212 17:33:37.679393 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab7c3dd68f4de35a7db8810611863299630acb84df587de420212d38e061a6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.679508 kubelet[2861]: E1212 17:33:37.679459 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab7c3dd68f4de35a7db8810611863299630acb84df587de420212d38e061a6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" Dec 12 17:33:37.679508 kubelet[2861]: E1212 17:33:37.679488 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab7c3dd68f4de35a7db8810611863299630acb84df587de420212d38e061a6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" Dec 12 17:33:37.679557 kubelet[2861]: E1212 17:33:37.679537 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab7c3dd68f4de35a7db8810611863299630acb84df587de420212d38e061a6cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:33:37.680469 containerd[1621]: time="2025-12-12T17:33:37.680430727Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-vqbgd,Uid:f0f2e696-f5c9-4490-a447-8711a361f9d0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2b210853b91b4d05a37edb7f3c1933cb1880b34ff215e46f4949ce8576e9cfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.682298 containerd[1621]: time="2025-12-12T17:33:37.682091495Z" level=error msg="Failed to destroy network for sandbox \"3eaa231262ddf529c406d93e9743d4079710661b95c2cd8d2708cedcea7bdad4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.682638 kubelet[2861]: E1212 17:33:37.682509 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b210853b91b4d05a37edb7f3c1933cb1880b34ff215e46f4949ce8576e9cfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.682638 kubelet[2861]: E1212 17:33:37.682564 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b210853b91b4d05a37edb7f3c1933cb1880b34ff215e46f4949ce8576e9cfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" Dec 12 17:33:37.682638 kubelet[2861]: E1212 17:33:37.682582 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b210853b91b4d05a37edb7f3c1933cb1880b34ff215e46f4949ce8576e9cfc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" Dec 12 17:33:37.682869 kubelet[2861]: E1212 17:33:37.682822 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b210853b91b4d05a37edb7f3c1933cb1880b34ff215e46f4949ce8576e9cfc0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:33:37.684426 containerd[1621]: time="2025-12-12T17:33:37.684381907Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9vgpb,Uid:764a9b76-22f0-48f8-92a8-02d3a1323d4b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3eaa231262ddf529c406d93e9743d4079710661b95c2cd8d2708cedcea7bdad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.684777 containerd[1621]: 
time="2025-12-12T17:33:37.684413947Z" level=error msg="Failed to destroy network for sandbox \"86fcd0e80a1f5756efc2e7ba04f382ef4597cd4e68ade6baf744c4c8258fec28\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.684852 kubelet[2861]: E1212 17:33:37.684599 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3eaa231262ddf529c406d93e9743d4079710661b95c2cd8d2708cedcea7bdad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.684852 kubelet[2861]: E1212 17:33:37.684642 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3eaa231262ddf529c406d93e9743d4079710661b95c2cd8d2708cedcea7bdad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-9vgpb" Dec 12 17:33:37.684852 kubelet[2861]: E1212 17:33:37.684662 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3eaa231262ddf529c406d93e9743d4079710661b95c2cd8d2708cedcea7bdad4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-9vgpb" Dec 12 17:33:37.684995 kubelet[2861]: E1212 17:33:37.684699 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3eaa231262ddf529c406d93e9743d4079710661b95c2cd8d2708cedcea7bdad4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:33:37.685620 containerd[1621]: time="2025-12-12T17:33:37.685587633Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76bf9cd594-s2nkq,Uid:0577e60c-0557-46e6-8e63-d6ada201b69d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"86fcd0e80a1f5756efc2e7ba04f382ef4597cd4e68ade6baf744c4c8258fec28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.685994 kubelet[2861]: E1212 17:33:37.685755 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86fcd0e80a1f5756efc2e7ba04f382ef4597cd4e68ade6baf744c4c8258fec28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Dec 12 17:33:37.685994 kubelet[2861]: E1212 17:33:37.685819 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86fcd0e80a1f5756efc2e7ba04f382ef4597cd4e68ade6baf744c4c8258fec28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76bf9cd594-s2nkq" Dec 12 17:33:37.685994 kubelet[2861]: E1212 17:33:37.685836 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86fcd0e80a1f5756efc2e7ba04f382ef4597cd4e68ade6baf744c4c8258fec28\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76bf9cd594-s2nkq" Dec 12 17:33:37.686089 kubelet[2861]: E1212 17:33:37.685891 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-76bf9cd594-s2nkq_calico-system(0577e60c-0557-46e6-8e63-d6ada201b69d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76bf9cd594-s2nkq_calico-system(0577e60c-0557-46e6-8e63-d6ada201b69d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"86fcd0e80a1f5756efc2e7ba04f382ef4597cd4e68ade6baf744c4c8258fec28\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76bf9cd594-s2nkq" podUID="0577e60c-0557-46e6-8e63-d6ada201b69d" Dec 12 17:33:38.424418 systemd[1]: Created slice kubepods-besteffort-podd21fdd1b_5217_4220_a07c_5b154ce8fa0d.slice - libcontainer container kubepods-besteffort-podd21fdd1b_5217_4220_a07c_5b154ce8fa0d.slice. 
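The Created slice entry above shows the cgroup naming the kubelet's systemd driver uses for a BestEffort pod: "kubepods", the QoS class, and the pod UID with dashes mapped to underscores. A sketch of that observable convention (derived from the "Created slice" lines in this log, not kubelet's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// podSlice reproduces the slice-name convention visible in the log.
func podSlice(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	fmt.Println(podSlice("besteffort", "d21fdd1b-5217-4220-a07c-5b154ce8fa0d"))
	// Output: kubepods-besteffort-podd21fdd1b_5217_4220_a07c_5b154ce8fa0d.slice
}
```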
Dec 12 17:33:38.426287 containerd[1621]: time="2025-12-12T17:33:38.426249795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j96bs,Uid:d21fdd1b-5217-4220-a07c-5b154ce8fa0d,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:38.467555 containerd[1621]: time="2025-12-12T17:33:38.467508645Z" level=error msg="Failed to destroy network for sandbox \"15d3a5469f0832d0b958d74532d125e0ef6a1c588d5a1f6e8085b2842718f1f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:38.468952 containerd[1621]: time="2025-12-12T17:33:38.468875292Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j96bs,Uid:d21fdd1b-5217-4220-a07c-5b154ce8fa0d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d3a5469f0832d0b958d74532d125e0ef6a1c588d5a1f6e8085b2842718f1f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:38.469189 kubelet[2861]: E1212 17:33:38.469122 2861 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d3a5469f0832d0b958d74532d125e0ef6a1c588d5a1f6e8085b2842718f1f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:33:38.469288 kubelet[2861]: E1212 17:33:38.469238 2861 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d3a5469f0832d0b958d74532d125e0ef6a1c588d5a1f6e8085b2842718f1f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:38.469347 kubelet[2861]: E1212 17:33:38.469312 2861 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d3a5469f0832d0b958d74532d125e0ef6a1c588d5a1f6e8085b2842718f1f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j96bs" Dec 12 17:33:38.469452 kubelet[2861]: E1212 17:33:38.469430 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"15d3a5469f0832d0b958d74532d125e0ef6a1c588d5a1f6e8085b2842718f1f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:38.629239 systemd[1]: run-netns-cni\x2d89866e2d\x2d9baa\x2d7668\x2d40c7\x2d2087ab06b6e4.mount: Deactivated successfully. 
Dec 12 17:33:38.629333 systemd[1]: run-netns-cni\x2d66866f5c\x2d0f5c\x2d8ffb\x2de58b\x2d6b63b332e560.mount: Deactivated successfully. Dec 12 17:33:38.629378 systemd[1]: run-netns-cni\x2d81b2aebe\x2d0bd1\x2d5035\x2dca5c\x2d6d8936c4377c.mount: Deactivated successfully. Dec 12 17:33:41.021453 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2907240967.mount: Deactivated successfully. Dec 12 17:33:41.041053 containerd[1621]: time="2025-12-12T17:33:41.040980718Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:41.041978 containerd[1621]: time="2025-12-12T17:33:41.041903882Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 12 17:33:41.042968 containerd[1621]: time="2025-12-12T17:33:41.042878447Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:41.044885 containerd[1621]: time="2025-12-12T17:33:41.044829377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:33:41.045415 containerd[1621]: time="2025-12-12T17:33:41.045378260Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 3.543140039s" Dec 12 17:33:41.045415 containerd[1621]: time="2025-12-12T17:33:41.045407020Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:33:41.063473 containerd[1621]: time="2025-12-12T17:33:41.063431392Z" level=info msg="CreateContainer within sandbox \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:33:41.073993 containerd[1621]: time="2025-12-12T17:33:41.073377842Z" level=info msg="Container fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:41.083210 containerd[1621]: time="2025-12-12T17:33:41.083151892Z" level=info msg="CreateContainer within sandbox \"a3a8e98b0841a8ad94b2ff031888ac680a9d937ccd25edf1b6e518574b9c1076\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0\"" Dec 12 17:33:41.083954 containerd[1621]: time="2025-12-12T17:33:41.083876536Z" level=info msg="StartContainer for \"fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0\"" Dec 12 17:33:41.085469 containerd[1621]: time="2025-12-12T17:33:41.085443664Z" level=info msg="connecting to shim fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0" address="unix:///run/containerd/s/6684aea924c725554496e05ebde7aa3126a1d6a403966fb47b205a885f56cadb" protocol=ttrpc version=3 Dec 12 17:33:41.107134 systemd[1]: Started cri-containerd-fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0.scope - libcontainer container fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0. 
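The calico/node pull above reports 150934562 bytes read in 3.543140039s, i.e. roughly 42.6 MB/s from ghcr.io. The arithmetic, using only the logged numbers:

```go
// Pure arithmetic on the logged pull of ghcr.io/flatcar/calico/node:v3.30.4.
package main

import "fmt"

func main() {
	const bytesRead = 150934562.0 // "bytes read" from the stop-pulling entry
	const seconds = 3.543140039   // duration from the "Pulled image" entry
	fmt.Printf("effective pull rate: %.1f MB/s\n", bytesRead/seconds/1e6) // ≈ 42.6 MB/s
}
```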
Dec 12 17:33:41.182674 containerd[1621]: time="2025-12-12T17:33:41.182633957Z" level=info msg="StartContainer for \"fd0668fc45476703a91388ca84ba192f347a27f024c5aa76c2fbc18c1d6d41e0\" returns successfully" Dec 12 17:33:41.321977 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:33:41.322196 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Dec 12 17:33:41.542344 kubelet[2861]: I1212 17:33:41.542279 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-kdvt8" podStartSLOduration=1.212553751 podStartE2EDuration="13.542245904s" podCreationTimestamp="2025-12-12 17:33:28 +0000 UTC" firstStartedPulling="2025-12-12 17:33:28.716658192 +0000 UTC m=+22.380213007" lastFinishedPulling="2025-12-12 17:33:41.046350305 +0000 UTC m=+34.709905160" observedRunningTime="2025-12-12 17:33:41.541090098 +0000 UTC m=+35.204644993" watchObservedRunningTime="2025-12-12 17:33:41.542245904 +0000 UTC m=+35.205800759" Dec 12 17:33:41.546599 kubelet[2861]: I1212 17:33:41.546559 2861 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-backend-key-pair\") pod \"0577e60c-0557-46e6-8e63-d6ada201b69d\" (UID: \"0577e60c-0557-46e6-8e63-d6ada201b69d\") " Dec 12 17:33:41.546680 kubelet[2861]: I1212 17:33:41.546606 2861 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-ca-bundle\") pod \"0577e60c-0557-46e6-8e63-d6ada201b69d\" (UID: \"0577e60c-0557-46e6-8e63-d6ada201b69d\") " Dec 12 17:33:41.546680 kubelet[2861]: I1212 17:33:41.546638 2861 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r954\" (UniqueName: \"kubernetes.io/projected/0577e60c-0557-46e6-8e63-d6ada201b69d-kube-api-access-8r954\") pod \"0577e60c-0557-46e6-8e63-d6ada201b69d\" (UID: \"0577e60c-0557-46e6-8e63-d6ada201b69d\") " Dec 12 17:33:41.547997 kubelet[2861]: I1212 17:33:41.547442 2861 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "0577e60c-0557-46e6-8e63-d6ada201b69d" (UID: "0577e60c-0557-46e6-8e63-d6ada201b69d"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:33:41.550111 kubelet[2861]: I1212 17:33:41.550073 2861 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "0577e60c-0557-46e6-8e63-d6ada201b69d" (UID: "0577e60c-0557-46e6-8e63-d6ada201b69d"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:33:41.550883 kubelet[2861]: I1212 17:33:41.550827 2861 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0577e60c-0557-46e6-8e63-d6ada201b69d-kube-api-access-8r954" (OuterVolumeSpecName: "kube-api-access-8r954") pod "0577e60c-0557-46e6-8e63-d6ada201b69d" (UID: "0577e60c-0557-46e6-8e63-d6ada201b69d"). InnerVolumeSpecName "kube-api-access-8r954". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:33:41.647484 kubelet[2861]: I1212 17:33:41.647372 2861 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-ca-bundle\") on node \"ci-4459-2-2-3-c846c80ac0\" DevicePath \"\"" Dec 12 17:33:41.647484 kubelet[2861]: I1212 17:33:41.647455 2861 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8r954\" (UniqueName: \"kubernetes.io/projected/0577e60c-0557-46e6-8e63-d6ada201b69d-kube-api-access-8r954\") on node \"ci-4459-2-2-3-c846c80ac0\" DevicePath \"\"" Dec 12 17:33:41.647484 kubelet[2861]: I1212 17:33:41.647489 2861 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0577e60c-0557-46e6-8e63-d6ada201b69d-whisker-backend-key-pair\") on node \"ci-4459-2-2-3-c846c80ac0\" DevicePath \"\"" Dec 12 17:33:41.821231 systemd[1]: Removed slice kubepods-besteffort-pod0577e60c_0557_46e6_8e63_d6ada201b69d.slice - libcontainer container kubepods-besteffort-pod0577e60c_0557_46e6_8e63_d6ada201b69d.slice. Dec 12 17:33:41.875597 systemd[1]: Created slice kubepods-besteffort-podc7e92205_dc44_4937_bdc7_2e9e83a2cc4e.slice - libcontainer container kubepods-besteffort-podc7e92205_dc44_4937_bdc7_2e9e83a2cc4e.slice. Dec 12 17:33:41.949569 kubelet[2861]: I1212 17:33:41.949490 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c7e92205-dc44-4937-bdc7-2e9e83a2cc4e-whisker-backend-key-pair\") pod \"whisker-78745dcfb4-7t4w7\" (UID: \"c7e92205-dc44-4937-bdc7-2e9e83a2cc4e\") " pod="calico-system/whisker-78745dcfb4-7t4w7" Dec 12 17:33:41.949569 kubelet[2861]: I1212 17:33:41.949535 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7e92205-dc44-4937-bdc7-2e9e83a2cc4e-whisker-ca-bundle\") pod \"whisker-78745dcfb4-7t4w7\" (UID: \"c7e92205-dc44-4937-bdc7-2e9e83a2cc4e\") " pod="calico-system/whisker-78745dcfb4-7t4w7" Dec 12 17:33:41.949769 kubelet[2861]: I1212 17:33:41.949653 2861 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpnl\" (UniqueName: \"kubernetes.io/projected/c7e92205-dc44-4937-bdc7-2e9e83a2cc4e-kube-api-access-hwpnl\") pod \"whisker-78745dcfb4-7t4w7\" (UID: \"c7e92205-dc44-4937-bdc7-2e9e83a2cc4e\") " pod="calico-system/whisker-78745dcfb4-7t4w7" Dec 12 17:33:42.022571 systemd[1]: var-lib-kubelet-pods-0577e60c\x2d0557\x2d46e6\x2d8e63\x2dd6ada201b69d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8r954.mount: Deactivated successfully. Dec 12 17:33:42.022661 systemd[1]: var-lib-kubelet-pods-0577e60c\x2d0557\x2d46e6\x2d8e63\x2dd6ada201b69d-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 12 17:33:42.180167 containerd[1621]: time="2025-12-12T17:33:42.180025904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78745dcfb4-7t4w7,Uid:c7e92205-dc44-4937-bdc7-2e9e83a2cc4e,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:42.306495 systemd-networkd[1525]: calib388dd4fbe0: Link UP Dec 12 17:33:42.306679 systemd-networkd[1525]: calib388dd4fbe0: Gained carrier Dec 12 17:33:42.316151 containerd[1621]: 2025-12-12 17:33:42.200 [INFO][4010] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:33:42.316151 containerd[1621]: 2025-12-12 17:33:42.219 [INFO][4010] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0 whisker-78745dcfb4- calico-system c7e92205-dc44-4937-bdc7-2e9e83a2cc4e 915 0 2025-12-12 17:33:41 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78745dcfb4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 whisker-78745dcfb4-7t4w7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib388dd4fbe0 [] [] }} ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-" Dec 12 17:33:42.316151 containerd[1621]: 2025-12-12 17:33:42.219 [INFO][4010] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.316151 containerd[1621]: 2025-12-12 17:33:42.263 [INFO][4024] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" HandleID="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Workload="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.263 [INFO][4024] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" HandleID="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Workload="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000483470), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"whisker-78745dcfb4-7t4w7", "timestamp":"2025-12-12 17:33:42.263593408 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.263 [INFO][4024] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.263 [INFO][4024] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.264 [INFO][4024] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.274 [INFO][4024] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.280 [INFO][4024] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.284 [INFO][4024] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.286 [INFO][4024] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316384 containerd[1621]: 2025-12-12 17:33:42.288 [INFO][4024] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.288 [INFO][4024] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.289 [INFO][4024] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450 Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.293 [INFO][4024] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.298 [INFO][4024] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.65/26] block=192.168.76.64/26 handle="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.298 [INFO][4024] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.65/26] handle="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.298 [INFO][4024] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
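The IPAM trace above confirms the node's affine block 192.168.76.64/26 and claims 192.168.76.65 from it. A standard-library check that the claimed address really falls in that block, with both values copied from the log:

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	_, block, err := net.ParseCIDR("192.168.76.64/26") // node's affine block
	if err != nil {
		panic(err)
	}
	ip := net.ParseIP("192.168.76.65") // address claimed for the whisker pod
	ones, bits := block.Mask.Size()
	fmt.Printf("block %s (%d addresses) contains %s: %v\n",
		block, 1<<(bits-ones), ip, block.Contains(ip))
	// Output: block 192.168.76.64/26 (64 addresses) contains 192.168.76.65: true
}
```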
Dec 12 17:33:42.316630 containerd[1621]: 2025-12-12 17:33:42.298 [INFO][4024] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.65/26] IPv6=[] ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" HandleID="k8s-pod-network.25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Workload="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.316828 containerd[1621]: 2025-12-12 17:33:42.300 [INFO][4010] cni-plugin/k8s.go 418: Populated endpoint ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0", GenerateName:"whisker-78745dcfb4-", Namespace:"calico-system", SelfLink:"", UID:"c7e92205-dc44-4937-bdc7-2e9e83a2cc4e", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78745dcfb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"whisker-78745dcfb4-7t4w7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.76.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib388dd4fbe0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:42.316828 containerd[1621]: 2025-12-12 17:33:42.300 [INFO][4010] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.65/32] ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.316992 containerd[1621]: 2025-12-12 17:33:42.300 [INFO][4010] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib388dd4fbe0 ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.316992 containerd[1621]: 2025-12-12 17:33:42.306 [INFO][4010] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.317056 containerd[1621]: 2025-12-12 17:33:42.307 [INFO][4010] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" 
Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0", GenerateName:"whisker-78745dcfb4-", Namespace:"calico-system", SelfLink:"", UID:"c7e92205-dc44-4937-bdc7-2e9e83a2cc4e", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78745dcfb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450", Pod:"whisker-78745dcfb4-7t4w7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.76.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib388dd4fbe0", MAC:"ba:a5:e8:00:e6:05", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:42.317128 containerd[1621]: 2025-12-12 17:33:42.314 [INFO][4010] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" Namespace="calico-system" Pod="whisker-78745dcfb4-7t4w7" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-whisker--78745dcfb4--7t4w7-eth0" Dec 12 17:33:42.338295 containerd[1621]: time="2025-12-12T17:33:42.338246108Z" level=info msg="connecting to shim 25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450" address="unix:///run/containerd/s/96c569512acb40d7db28aab3b07fb3a61296fc9b44b1787b8817fcaca86ba77a" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:42.355099 systemd[1]: Started cri-containerd-25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450.scope - libcontainer container 25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450. 
Dec 12 17:33:42.388378 containerd[1621]: time="2025-12-12T17:33:42.388312722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78745dcfb4-7t4w7,Uid:c7e92205-dc44-4937-bdc7-2e9e83a2cc4e,Namespace:calico-system,Attempt:0,} returns sandbox id \"25a50bc984800075b87699392ae707aeb2ca8ba56105154eeaf7512b7dd7f450\"" Dec 12 17:33:42.389996 containerd[1621]: time="2025-12-12T17:33:42.389893930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:33:42.421535 kubelet[2861]: I1212 17:33:42.421473 2861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0577e60c-0557-46e6-8e63-d6ada201b69d" path="/var/lib/kubelet/pods/0577e60c-0557-46e6-8e63-d6ada201b69d/volumes" Dec 12 17:33:42.742244 containerd[1621]: time="2025-12-12T17:33:42.741995199Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:42.743318 containerd[1621]: time="2025-12-12T17:33:42.743276205Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:33:42.743586 containerd[1621]: time="2025-12-12T17:33:42.743338765Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:33:42.744129 kubelet[2861]: E1212 17:33:42.744081 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:33:42.744538 kubelet[2861]: E1212 17:33:42.744137 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:33:42.744599 kubelet[2861]: E1212 17:33:42.744284 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0baedf461ecb4cecaa52fed4d1b4bb46,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:42.746292 containerd[1621]: time="2025-12-12T17:33:42.746182100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:33:42.995591 systemd-networkd[1525]: vxlan.calico: Link UP Dec 12 17:33:42.995596 systemd-networkd[1525]: vxlan.calico: Gained carrier Dec 12 17:33:43.101253 containerd[1621]: time="2025-12-12T17:33:43.101130143Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:43.102760 containerd[1621]: time="2025-12-12T17:33:43.102718511Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:33:43.102819 containerd[1621]: time="2025-12-12T17:33:43.102802031Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:33:43.103015 kubelet[2861]: E1212 17:33:43.102975 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:33:43.103060 kubelet[2861]: E1212 17:33:43.103027 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:33:43.103181 kubelet[2861]: E1212 17:33:43.103143 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:43.104612 kubelet[2861]: E1212 17:33:43.104504 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:33:43.430163 systemd-networkd[1525]: calib388dd4fbe0: Gained IPv6LL Dec 12 17:33:43.523061 kubelet[2861]: E1212 
17:33:43.522985 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:33:44.710216 systemd-networkd[1525]: vxlan.calico: Gained IPv6LL Dec 12 17:33:48.419971 containerd[1621]: time="2025-12-12T17:33:48.419850561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ks2sw,Uid:44903b8b-15b2-4980-b478-2c324f4d054a,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:48.420442 containerd[1621]: time="2025-12-12T17:33:48.420019922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-698c665495-s6zxl,Uid:ff7ea540-d740-4fa7-9163-b17e2194ee80,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:48.525344 systemd-networkd[1525]: cali24b061e6ca5: Link UP Dec 12 17:33:48.525472 systemd-networkd[1525]: cali24b061e6ca5: Gained carrier Dec 12 17:33:48.540576 containerd[1621]: 2025-12-12 17:33:48.459 [INFO][4339] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0 calico-kube-controllers-698c665495- calico-system ff7ea540-d740-4fa7-9163-b17e2194ee80 855 0 2025-12-12 17:33:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:698c665495 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 calico-kube-controllers-698c665495-s6zxl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali24b061e6ca5 [] [] }} ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-" Dec 12 17:33:48.540576 containerd[1621]: 2025-12-12 17:33:48.460 [INFO][4339] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.540576 containerd[1621]: 2025-12-12 17:33:48.483 [INFO][4358] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" HandleID="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" 
Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.483 [INFO][4358] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" HandleID="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c32c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"calico-kube-controllers-698c665495-s6zxl", "timestamp":"2025-12-12 17:33:48.483402524 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.483 [INFO][4358] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.483 [INFO][4358] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.483 [INFO][4358] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.493 [INFO][4358] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.498 [INFO][4358] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.502 [INFO][4358] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.503 [INFO][4358] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540758 containerd[1621]: 2025-12-12 17:33:48.506 [INFO][4358] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.506 [INFO][4358] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.507 [INFO][4358] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74 Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.511 [INFO][4358] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.518 [INFO][4358] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.66/26] block=192.168.76.64/26 handle="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.518 [INFO][4358] ipam/ipam.go 878: 
Auto-assigned 1 out of 1 IPv4s: [192.168.76.66/26] handle="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.518 [INFO][4358] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:33:48.540981 containerd[1621]: 2025-12-12 17:33:48.518 [INFO][4358] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.66/26] IPv6=[] ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" HandleID="k8s-pod-network.6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.541113 containerd[1621]: 2025-12-12 17:33:48.520 [INFO][4339] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0", GenerateName:"calico-kube-controllers-698c665495-", Namespace:"calico-system", SelfLink:"", UID:"ff7ea540-d740-4fa7-9163-b17e2194ee80", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"698c665495", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"calico-kube-controllers-698c665495-s6zxl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.76.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali24b061e6ca5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:48.541163 containerd[1621]: 2025-12-12 17:33:48.520 [INFO][4339] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.66/32] ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.541163 containerd[1621]: 2025-12-12 17:33:48.520 [INFO][4339] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24b061e6ca5 ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.541163 containerd[1621]: 2025-12-12 17:33:48.525 
[INFO][4339] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.541232 containerd[1621]: 2025-12-12 17:33:48.527 [INFO][4339] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0", GenerateName:"calico-kube-controllers-698c665495-", Namespace:"calico-system", SelfLink:"", UID:"ff7ea540-d740-4fa7-9163-b17e2194ee80", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"698c665495", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74", Pod:"calico-kube-controllers-698c665495-s6zxl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.76.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali24b061e6ca5", MAC:"56:ae:d9:85:9b:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:48.541278 containerd[1621]: 2025-12-12 17:33:48.538 [INFO][4339] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" Namespace="calico-system" Pod="calico-kube-controllers-698c665495-s6zxl" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--kube--controllers--698c665495--s6zxl-eth0" Dec 12 17:33:48.561982 containerd[1621]: time="2025-12-12T17:33:48.561925243Z" level=info msg="connecting to shim 6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74" address="unix:///run/containerd/s/91f58f229a60140769e95303194a350497c0736fd3d07e58f3c483539f141de7" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:48.588294 systemd[1]: Started cri-containerd-6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74.scope - libcontainer container 6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74. 
Dec 12 17:33:48.632458 containerd[1621]: time="2025-12-12T17:33:48.632418721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-698c665495-s6zxl,Uid:ff7ea540-d740-4fa7-9163-b17e2194ee80,Namespace:calico-system,Attempt:0,} returns sandbox id \"6ccbe8bc5147c7f7115bd45f58ebbf67bc0594298ad923d5e15342b53996af74\"" Dec 12 17:33:48.634544 containerd[1621]: time="2025-12-12T17:33:48.634509772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:33:48.637479 systemd-networkd[1525]: cali43c4040875e: Link UP Dec 12 17:33:48.637752 systemd-networkd[1525]: cali43c4040875e: Gained carrier Dec 12 17:33:48.653491 containerd[1621]: 2025-12-12 17:33:48.462 [INFO][4333] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0 coredns-674b8bbfcf- kube-system 44903b8b-15b2-4980-b478-2c324f4d054a 853 0 2025-12-12 17:33:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 coredns-674b8bbfcf-ks2sw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali43c4040875e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-" Dec 12 17:33:48.653491 containerd[1621]: 2025-12-12 17:33:48.462 [INFO][4333] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.653491 containerd[1621]: 2025-12-12 17:33:48.488 [INFO][4364] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" HandleID="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Workload="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.488 [INFO][4364] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" HandleID="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Workload="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd5f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"coredns-674b8bbfcf-ks2sw", "timestamp":"2025-12-12 17:33:48.488728991 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.488 [INFO][4364] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.518 [INFO][4364] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.518 [INFO][4364] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.595 [INFO][4364] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.602 [INFO][4364] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.609 [INFO][4364] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.612 [INFO][4364] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653680 containerd[1621]: 2025-12-12 17:33:48.615 [INFO][4364] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.615 [INFO][4364] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.617 [INFO][4364] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92 Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.623 [INFO][4364] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.631 [INFO][4364] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.67/26] block=192.168.76.64/26 handle="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.631 [INFO][4364] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.67/26] handle="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.631 [INFO][4364] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:33:48.653905 containerd[1621]: 2025-12-12 17:33:48.631 [INFO][4364] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.67/26] IPv6=[] ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" HandleID="k8s-pod-network.7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Workload="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.654198 containerd[1621]: 2025-12-12 17:33:48.635 [INFO][4333] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"44903b8b-15b2-4980-b478-2c324f4d054a", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"coredns-674b8bbfcf-ks2sw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.76.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43c4040875e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:48.654198 containerd[1621]: 2025-12-12 17:33:48.635 [INFO][4333] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.67/32] ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.654198 containerd[1621]: 2025-12-12 17:33:48.635 [INFO][4333] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43c4040875e ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.654198 containerd[1621]: 2025-12-12 17:33:48.638 [INFO][4333] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.654198 containerd[1621]: 2025-12-12 17:33:48.639 [INFO][4333] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"44903b8b-15b2-4980-b478-2c324f4d054a", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92", Pod:"coredns-674b8bbfcf-ks2sw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.76.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali43c4040875e", MAC:"92:62:08:88:f2:3b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:48.654198 containerd[1621]: 2025-12-12 17:33:48.651 [INFO][4333] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" Namespace="kube-system" Pod="coredns-674b8bbfcf-ks2sw" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--ks2sw-eth0" Dec 12 17:33:48.678636 containerd[1621]: time="2025-12-12T17:33:48.678517435Z" level=info msg="connecting to shim 7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92" address="unix:///run/containerd/s/e3aedb236ab3fb241e55e70bfcfb471e8bdb183276bc30f8419e496c6551087c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:48.706112 systemd[1]: Started cri-containerd-7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92.scope - libcontainer container 7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92. 
Dec 12 17:33:48.737332 containerd[1621]: time="2025-12-12T17:33:48.737285014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ks2sw,Uid:44903b8b-15b2-4980-b478-2c324f4d054a,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92\"" Dec 12 17:33:48.743574 containerd[1621]: time="2025-12-12T17:33:48.743540326Z" level=info msg="CreateContainer within sandbox \"7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:33:48.751356 containerd[1621]: time="2025-12-12T17:33:48.751278685Z" level=info msg="Container 3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:48.757731 containerd[1621]: time="2025-12-12T17:33:48.757666238Z" level=info msg="CreateContainer within sandbox \"7f47483cb7f286689b995d8e1f9123f69841ffbb8a86e3ee4bae30d4cd7c8d92\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a\"" Dec 12 17:33:48.758510 containerd[1621]: time="2025-12-12T17:33:48.758467762Z" level=info msg="StartContainer for \"3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a\"" Dec 12 17:33:48.760602 containerd[1621]: time="2025-12-12T17:33:48.760564092Z" level=info msg="connecting to shim 3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a" address="unix:///run/containerd/s/e3aedb236ab3fb241e55e70bfcfb471e8bdb183276bc30f8419e496c6551087c" protocol=ttrpc version=3 Dec 12 17:33:48.779104 systemd[1]: Started cri-containerd-3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a.scope - libcontainer container 3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a. 
Dec 12 17:33:48.803536 containerd[1621]: time="2025-12-12T17:33:48.803497790Z" level=info msg="StartContainer for \"3402bf0317030dfc88a1802687b30fcb784838eef7722554e15f55364354cf9a\" returns successfully" Dec 12 17:33:48.987450 containerd[1621]: time="2025-12-12T17:33:48.987391245Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:48.988913 containerd[1621]: time="2025-12-12T17:33:48.988832652Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:33:48.989061 containerd[1621]: time="2025-12-12T17:33:48.988842252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:33:48.989151 kubelet[2861]: E1212 17:33:48.989088 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:33:48.989151 kubelet[2861]: E1212 17:33:48.989146 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:33:48.989477 kubelet[2861]: E1212 17:33:48.989279 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szn5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:48.990688 kubelet[2861]: E1212 17:33:48.990640 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:33:49.420111 containerd[1621]: time="2025-12-12T17:33:49.419712321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j96bs,Uid:d21fdd1b-5217-4220-a07c-5b154ce8fa0d,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:49.516699 systemd-networkd[1525]: calid85458cd99d: Link UP Dec 12 17:33:49.517354 systemd-networkd[1525]: calid85458cd99d: Gained carrier Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.456 [INFO][4521] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0 csi-node-driver- calico-system d21fdd1b-5217-4220-a07c-5b154ce8fa0d 756 0 2025-12-12 17:33:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 csi-node-driver-j96bs eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid85458cd99d [] [] }} ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.456 
[INFO][4521] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.476 [INFO][4535] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" HandleID="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Workload="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.476 [INFO][4535] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" HandleID="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Workload="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003ab390), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"csi-node-driver-j96bs", "timestamp":"2025-12-12 17:33:49.47663993 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.476 [INFO][4535] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.476 [INFO][4535] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.476 [INFO][4535] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.486 [INFO][4535] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.492 [INFO][4535] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.496 [INFO][4535] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.498 [INFO][4535] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.500 [INFO][4535] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.500 [INFO][4535] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.501 [INFO][4535] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25 Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.505 [INFO][4535] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.511 [INFO][4535] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.68/26] block=192.168.76.64/26 handle="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.512 [INFO][4535] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.68/26] handle="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.512 [INFO][4535] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:33:49.536399 containerd[1621]: 2025-12-12 17:33:49.512 [INFO][4535] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.68/26] IPv6=[] ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" HandleID="k8s-pod-network.d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Workload="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.538353 containerd[1621]: 2025-12-12 17:33:49.514 [INFO][4521] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d21fdd1b-5217-4220-a07c-5b154ce8fa0d", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"csi-node-driver-j96bs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.76.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid85458cd99d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:49.538353 containerd[1621]: 2025-12-12 17:33:49.514 [INFO][4521] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.68/32] ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.538353 containerd[1621]: 2025-12-12 17:33:49.514 [INFO][4521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid85458cd99d ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.538353 containerd[1621]: 2025-12-12 17:33:49.517 [INFO][4521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.538353 containerd[1621]: 2025-12-12 17:33:49.518 [INFO][4521] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d21fdd1b-5217-4220-a07c-5b154ce8fa0d", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25", Pod:"csi-node-driver-j96bs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.76.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid85458cd99d", MAC:"a6:61:48:7e:78:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:49.538353 containerd[1621]: 2025-12-12 17:33:49.532 [INFO][4521] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" Namespace="calico-system" Pod="csi-node-driver-j96bs" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-csi--node--driver--j96bs-eth0" Dec 12 17:33:49.540260 kubelet[2861]: E1212 17:33:49.540126 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:33:49.565511 containerd[1621]: time="2025-12-12T17:33:49.565460581Z" level=info msg="connecting to shim d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25" address="unix:///run/containerd/s/295dbcdc769f4f0090b1926297fe0248e1af368f88e0752376e664112ae47857" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:49.569053 kubelet[2861]: I1212 17:33:49.568970 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-ks2sw" podStartSLOduration=37.568920719 podStartE2EDuration="37.568920719s" podCreationTimestamp="2025-12-12 17:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:33:49.550492345 +0000 UTC m=+43.214047200" watchObservedRunningTime="2025-12-12 17:33:49.568920719 +0000 UTC m=+43.232475574" Dec 12 17:33:49.604848 systemd[1]: Started cri-containerd-d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25.scope - libcontainer container d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25. Dec 12 17:33:49.632031 containerd[1621]: time="2025-12-12T17:33:49.631974919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j96bs,Uid:d21fdd1b-5217-4220-a07c-5b154ce8fa0d,Namespace:calico-system,Attempt:0,} returns sandbox id \"d3fd6d0d028e8aa25a63702ae32fddcc4483fae628fa12c04cb699ed4040aa25\"" Dec 12 17:33:49.633430 containerd[1621]: time="2025-12-12T17:33:49.633402926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:33:49.702294 systemd-networkd[1525]: cali43c4040875e: Gained IPv6LL Dec 12 17:33:49.766810 systemd-networkd[1525]: cali24b061e6ca5: Gained IPv6LL Dec 12 17:33:49.962570 containerd[1621]: time="2025-12-12T17:33:49.962330877Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:49.963994 containerd[1621]: time="2025-12-12T17:33:49.963933485Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:33:49.964049 containerd[1621]: time="2025-12-12T17:33:49.963973485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:33:49.964227 kubelet[2861]: E1212 17:33:49.964170 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:33:49.964227 kubelet[2861]: E1212 17:33:49.964222 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:33:49.964387 kubelet[2861]: E1212 17:33:49.964338 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:49.966319 containerd[1621]: time="2025-12-12T17:33:49.966289497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:33:50.312064 containerd[1621]: time="2025-12-12T17:33:50.311815212Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:50.313393 containerd[1621]: time="2025-12-12T17:33:50.313319860Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:33:50.313393 containerd[1621]: time="2025-12-12T17:33:50.313364700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:33:50.313592 kubelet[2861]: E1212 17:33:50.313532 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:33:50.313592 kubelet[2861]: E1212 17:33:50.313583 2861 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:33:50.313872 kubelet[2861]: E1212 17:33:50.313738 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:50.315133 kubelet[2861]: E1212 17:33:50.315083 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:50.542473 kubelet[2861]: E1212 17:33:50.542426 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:33:50.544325 kubelet[2861]: E1212 17:33:50.544252 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:50.919287 systemd-networkd[1525]: calid85458cd99d: Gained IPv6LL Dec 12 17:33:51.419299 containerd[1621]: time="2025-12-12T17:33:51.419252598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5vf47,Uid:5e5003fc-9302-4306-b397-652f2bfea588,Namespace:kube-system,Attempt:0,}" Dec 12 17:33:51.513919 systemd-networkd[1525]: cali0140077d18f: Link UP Dec 12 17:33:51.514319 systemd-networkd[1525]: cali0140077d18f: Gained carrier Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.452 [INFO][4602] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0 coredns-674b8bbfcf- kube-system 5e5003fc-9302-4306-b397-652f2bfea588 848 0 2025-12-12 17:33:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 coredns-674b8bbfcf-5vf47 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0140077d18f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.453 [INFO][4602] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" 
WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.474 [INFO][4616] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" HandleID="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Workload="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.474 [INFO][4616] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" HandleID="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Workload="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004dd20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"coredns-674b8bbfcf-5vf47", "timestamp":"2025-12-12 17:33:51.47477308 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.475 [INFO][4616] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.475 [INFO][4616] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.475 [INFO][4616] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.485 [INFO][4616] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.489 [INFO][4616] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.493 [INFO][4616] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.495 [INFO][4616] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.497 [INFO][4616] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.497 [INFO][4616] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.499 [INFO][4616] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.503 [INFO][4616] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 
2025-12-12 17:33:51.508 [INFO][4616] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.69/26] block=192.168.76.64/26 handle="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.508 [INFO][4616] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.69/26] handle="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.508 [INFO][4616] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:33:51.526701 containerd[1621]: 2025-12-12 17:33:51.508 [INFO][4616] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.69/26] IPv6=[] ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" HandleID="k8s-pod-network.3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Workload="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.527256 containerd[1621]: 2025-12-12 17:33:51.510 [INFO][4602] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5e5003fc-9302-4306-b397-652f2bfea588", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"coredns-674b8bbfcf-5vf47", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.76.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0140077d18f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:51.527256 containerd[1621]: 2025-12-12 17:33:51.511 [INFO][4602] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.69/32] ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" 
WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.527256 containerd[1621]: 2025-12-12 17:33:51.511 [INFO][4602] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0140077d18f ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.527256 containerd[1621]: 2025-12-12 17:33:51.514 [INFO][4602] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.527256 containerd[1621]: 2025-12-12 17:33:51.515 [INFO][4602] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"5e5003fc-9302-4306-b397-652f2bfea588", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f", Pod:"coredns-674b8bbfcf-5vf47", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.76.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0140077d18f", MAC:"da:9d:1b:22:52:c3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:51.527256 containerd[1621]: 2025-12-12 17:33:51.524 [INFO][4602] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5vf47" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-coredns--674b8bbfcf--5vf47-eth0" Dec 12 17:33:51.547199 kubelet[2861]: E1212 17:33:51.547149 2861 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:33:51.555289 containerd[1621]: time="2025-12-12T17:33:51.555231769Z" level=info msg="connecting to shim 3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f" address="unix:///run/containerd/s/bafa46a4e86000f002b2532e12d0962ed0a4d68a381ab244fcd2566111cdd903" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:51.577112 systemd[1]: Started cri-containerd-3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f.scope - libcontainer container 3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f. Dec 12 17:33:51.606933 containerd[1621]: time="2025-12-12T17:33:51.606896631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5vf47,Uid:5e5003fc-9302-4306-b397-652f2bfea588,Namespace:kube-system,Attempt:0,} returns sandbox id \"3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f\"" Dec 12 17:33:51.611992 containerd[1621]: time="2025-12-12T17:33:51.611524975Z" level=info msg="CreateContainer within sandbox \"3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:33:51.625628 containerd[1621]: time="2025-12-12T17:33:51.625538726Z" level=info msg="Container a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:51.632843 containerd[1621]: time="2025-12-12T17:33:51.632795443Z" level=info msg="CreateContainer within sandbox \"3f71c4e739af87e709446711845af7e5c64163489ed62990bb5c9f1103e68c3f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631\"" Dec 12 17:33:51.633645 containerd[1621]: time="2025-12-12T17:33:51.633596527Z" level=info msg="StartContainer for \"a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631\"" Dec 12 17:33:51.634810 containerd[1621]: time="2025-12-12T17:33:51.634763573Z" level=info msg="connecting to shim a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631" address="unix:///run/containerd/s/bafa46a4e86000f002b2532e12d0962ed0a4d68a381ab244fcd2566111cdd903" protocol=ttrpc version=3 Dec 12 17:33:51.660377 systemd[1]: Started cri-containerd-a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631.scope - libcontainer container a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631. 
Dec 12 17:33:51.693372 containerd[1621]: time="2025-12-12T17:33:51.693312990Z" level=info msg="StartContainer for \"a312321d8039cec3907eb8d925b6f71f178a74d4cee1fb9f37200614bc6c1631\" returns successfully" Dec 12 17:33:52.419519 containerd[1621]: time="2025-12-12T17:33:52.419454119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9vgpb,Uid:764a9b76-22f0-48f8-92a8-02d3a1323d4b,Namespace:calico-system,Attempt:0,}" Dec 12 17:33:52.420839 containerd[1621]: time="2025-12-12T17:33:52.419454879Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-vqbgd,Uid:f0f2e696-f5c9-4490-a447-8711a361f9d0,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:33:52.545936 systemd-networkd[1525]: caliacfa78d5d81: Link UP Dec 12 17:33:52.546799 systemd-networkd[1525]: caliacfa78d5d81: Gained carrier Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.466 [INFO][4714] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0 calico-apiserver-6d5697548f- calico-apiserver f0f2e696-f5c9-4490-a447-8711a361f9d0 856 0 2025-12-12 17:33:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6d5697548f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 calico-apiserver-6d5697548f-vqbgd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliacfa78d5d81 [] [] }} ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.466 [INFO][4714] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.495 [INFO][4749] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" HandleID="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.496 [INFO][4749] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" HandleID="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd590), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"calico-apiserver-6d5697548f-vqbgd", "timestamp":"2025-12-12 17:33:52.495933228 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.496 [INFO][4749] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.496 [INFO][4749] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.496 [INFO][4749] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.507 [INFO][4749] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.512 [INFO][4749] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.516 [INFO][4749] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.518 [INFO][4749] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.521 [INFO][4749] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.521 [INFO][4749] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.523 [INFO][4749] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952 Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.531 [INFO][4749] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.538 [INFO][4749] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.70/26] block=192.168.76.64/26 handle="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.539 [INFO][4749] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.70/26] handle="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.539 [INFO][4749] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:33:52.566811 containerd[1621]: 2025-12-12 17:33:52.539 [INFO][4749] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.70/26] IPv6=[] ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" HandleID="k8s-pod-network.e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.567447 containerd[1621]: 2025-12-12 17:33:52.542 [INFO][4714] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0", GenerateName:"calico-apiserver-6d5697548f-", Namespace:"calico-apiserver", SelfLink:"", UID:"f0f2e696-f5c9-4490-a447-8711a361f9d0", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d5697548f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"calico-apiserver-6d5697548f-vqbgd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.76.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliacfa78d5d81", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:52.567447 containerd[1621]: 2025-12-12 17:33:52.543 [INFO][4714] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.70/32] ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.567447 containerd[1621]: 2025-12-12 17:33:52.543 [INFO][4714] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliacfa78d5d81 ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.567447 containerd[1621]: 2025-12-12 17:33:52.548 [INFO][4714] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.567447 containerd[1621]: 2025-12-12 
17:33:52.550 [INFO][4714] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0", GenerateName:"calico-apiserver-6d5697548f-", Namespace:"calico-apiserver", SelfLink:"", UID:"f0f2e696-f5c9-4490-a447-8711a361f9d0", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d5697548f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952", Pod:"calico-apiserver-6d5697548f-vqbgd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.76.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliacfa78d5d81", MAC:"fe:5d:ee:b2:70:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:52.567447 containerd[1621]: 2025-12-12 17:33:52.562 [INFO][4714] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-vqbgd" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--vqbgd-eth0" Dec 12 17:33:52.575982 kubelet[2861]: I1212 17:33:52.574177 2861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-5vf47" podStartSLOduration=40.574159585 podStartE2EDuration="40.574159585s" podCreationTimestamp="2025-12-12 17:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:33:52.57313166 +0000 UTC m=+46.236686515" watchObservedRunningTime="2025-12-12 17:33:52.574159585 +0000 UTC m=+46.237714560" Dec 12 17:33:52.620304 containerd[1621]: time="2025-12-12T17:33:52.619708856Z" level=info msg="connecting to shim e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952" address="unix:///run/containerd/s/6237afe94aaaf612ff796c01226b637ccbea072d65561a68118044d105bb1bc6" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:52.647355 systemd-networkd[1525]: cali0140077d18f: Gained IPv6LL Dec 12 17:33:52.658554 systemd[1]: Started cri-containerd-e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952.scope - libcontainer container e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952. 
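The pod_startup_latency_tracker entry above reports podStartSLOduration=40.574159585s for the coredns pod. Both pull timestamps are the zero time (the image needed no pull), so the reported duration appears to be simply watchObservedRunningTime minus podCreationTimestamp; a quick check of that arithmetic:

```python
# Quick check of the pod_startup_latency_tracker arithmetic above: with no
# image pull involved (both pulling timestamps are the zero time), the SLO
# duration is effectively observed-running time minus pod creation time.
from datetime import datetime, timezone

created = datetime(2025, 12, 12, 17, 33, 12, tzinfo=timezone.utc)
# watchObservedRunningTime 17:33:52.574159585, rounded to microseconds:
running = datetime(2025, 12, 12, 17, 33, 52, 574160, tzinfo=timezone.utc)
print((running - created).total_seconds())  # ~40.574, matching podStartSLOduration
```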
Dec 12 17:33:52.664248 systemd-networkd[1525]: calic7881e84fd5: Link UP Dec 12 17:33:52.669328 systemd-networkd[1525]: calic7881e84fd5: Gained carrier Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.466 [INFO][4725] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0 goldmane-666569f655- calico-system 764a9b76-22f0-48f8-92a8-02d3a1323d4b 858 0 2025-12-12 17:33:25 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 goldmane-666569f655-9vgpb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic7881e84fd5 [] [] }} ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.466 [INFO][4725] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.503 [INFO][4743] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" HandleID="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Workload="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.504 [INFO][4743] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" HandleID="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Workload="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3580), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"goldmane-666569f655-9vgpb", "timestamp":"2025-12-12 17:33:52.503814028 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.504 [INFO][4743] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.539 [INFO][4743] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.539 [INFO][4743] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.607 [INFO][4743] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.615 [INFO][4743] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.622 [INFO][4743] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.624 [INFO][4743] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.626 [INFO][4743] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.627 [INFO][4743] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.629 [INFO][4743] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410 Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.636 [INFO][4743] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.645 [INFO][4743] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.71/26] block=192.168.76.64/26 handle="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.645 [INFO][4743] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.71/26] handle="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.645 [INFO][4743] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
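Each CNI invocation in these traces takes the host-wide IPAM lock, confirms the node's affinity to the block 192.168.76.64/26, and hands out the next free address (.69, .70, .71, .72 across this section). The block geometry can be checked with the standard library:

```python
# The block geometry behind the IPAM traces above: a /26 affine to this node
# holds 64 addresses, and every IP handed out in this section falls inside it.
import ipaddress

block = ipaddress.ip_network("192.168.76.64/26")
print(block.num_addresses)  # 64
for ip in ("192.168.76.69", "192.168.76.70", "192.168.76.71", "192.168.76.72"):
    assert ipaddress.ip_address(ip) in block
print(block.network_address, "-", block.broadcast_address)  # 192.168.76.64 - 192.168.76.127
```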
Dec 12 17:33:52.693620 containerd[1621]: 2025-12-12 17:33:52.646 [INFO][4743] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.71/26] IPv6=[] ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" HandleID="k8s-pod-network.2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Workload="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.694364 containerd[1621]: 2025-12-12 17:33:52.659 [INFO][4725] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"764a9b76-22f0-48f8-92a8-02d3a1323d4b", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"goldmane-666569f655-9vgpb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.76.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic7881e84fd5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:52.694364 containerd[1621]: 2025-12-12 17:33:52.659 [INFO][4725] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.71/32] ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.694364 containerd[1621]: 2025-12-12 17:33:52.659 [INFO][4725] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic7881e84fd5 ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.694364 containerd[1621]: 2025-12-12 17:33:52.670 [INFO][4725] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.694364 containerd[1621]: 2025-12-12 17:33:52.670 [INFO][4725] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" 
Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"764a9b76-22f0-48f8-92a8-02d3a1323d4b", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410", Pod:"goldmane-666569f655-9vgpb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.76.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic7881e84fd5", MAC:"66:52:5b:d1:d1:35", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:52.694364 containerd[1621]: 2025-12-12 17:33:52.691 [INFO][4725] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" Namespace="calico-system" Pod="goldmane-666569f655-9vgpb" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-goldmane--666569f655--9vgpb-eth0" Dec 12 17:33:52.725943 containerd[1621]: time="2025-12-12T17:33:52.725898516Z" level=info msg="connecting to shim 2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410" address="unix:///run/containerd/s/1b119a0eb2f6fd5b4b6a0decdccf3f99b11953435035a33156a7213173084110" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:52.731656 containerd[1621]: time="2025-12-12T17:33:52.731615585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-vqbgd,Uid:f0f2e696-f5c9-4490-a447-8711a361f9d0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e9d67fc715038fbe06f85b7623ce13bda1450ac9e1fede8cf7585fb92cc2d952\"" Dec 12 17:33:52.733427 containerd[1621]: time="2025-12-12T17:33:52.733335433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:33:52.759292 systemd[1]: Started cri-containerd-2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410.scope - libcontainer container 2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410. 
Dec 12 17:33:52.791428 containerd[1621]: time="2025-12-12T17:33:52.791383168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-9vgpb,Uid:764a9b76-22f0-48f8-92a8-02d3a1323d4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"2cd5c99415500158d3523251448a2a755161c98fa910a3fc575fde1541fd0410\"" Dec 12 17:33:53.074644 containerd[1621]: time="2025-12-12T17:33:53.074507647Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:53.076345 containerd[1621]: time="2025-12-12T17:33:53.076294496Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:33:53.076551 kubelet[2861]: E1212 17:33:53.076508 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:33:53.076659 kubelet[2861]: E1212 17:33:53.076554 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:33:53.076840 containerd[1621]: time="2025-12-12T17:33:53.076328736Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:33:53.076840 containerd[1621]: time="2025-12-12T17:33:53.076824738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:33:53.077284 kubelet[2861]: E1212 17:33:53.077207 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khvbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:53.078477 kubelet[2861]: E1212 17:33:53.078435 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:33:53.396423 containerd[1621]: time="2025-12-12T17:33:53.396229401Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:53.397538 containerd[1621]: time="2025-12-12T17:33:53.397463727Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:33:53.397538 containerd[1621]: time="2025-12-12T17:33:53.397523207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:33:53.397722 kubelet[2861]: E1212 17:33:53.397675 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:33:53.397780 kubelet[2861]: E1212 17:33:53.397731 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:33:53.397956 kubelet[2861]: E1212 17:33:53.397869 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59pph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:53.399034 kubelet[2861]: E1212 17:33:53.398997 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:33:53.420118 containerd[1621]: 
time="2025-12-12T17:33:53.419962241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-q7brm,Uid:48d6a2b5-a80f-4de9-91aa-e8709a4fec3b,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:33:53.523433 systemd-networkd[1525]: cali5f473979161: Link UP Dec 12 17:33:53.524175 systemd-networkd[1525]: cali5f473979161: Gained carrier Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.463 [INFO][4887] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0 calico-apiserver-6d5697548f- calico-apiserver 48d6a2b5-a80f-4de9-91aa-e8709a4fec3b 854 0 2025-12-12 17:33:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6d5697548f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-3-c846c80ac0 calico-apiserver-6d5697548f-q7brm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5f473979161 [] [] }} ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.463 [INFO][4887] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.485 [INFO][4902] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" HandleID="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.485 [INFO][4902] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" HandleID="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b5c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-3-c846c80ac0", "pod":"calico-apiserver-6d5697548f-q7brm", "timestamp":"2025-12-12 17:33:53.485050612 +0000 UTC"}, Hostname:"ci-4459-2-2-3-c846c80ac0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.485 [INFO][4902] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.485 [INFO][4902] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.485 [INFO][4902] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-3-c846c80ac0' Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.494 [INFO][4902] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.498 [INFO][4902] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.502 [INFO][4902] ipam/ipam.go 511: Trying affinity for 192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.504 [INFO][4902] ipam/ipam.go 158: Attempting to load block cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.506 [INFO][4902] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.76.64/26 host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.506 [INFO][4902] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.76.64/26 handle="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.507 [INFO][4902] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497 Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.512 [INFO][4902] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.76.64/26 handle="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.519 [INFO][4902] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.76.72/26] block=192.168.76.64/26 handle="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.519 [INFO][4902] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.76.72/26] handle="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" host="ci-4459-2-2-3-c846c80ac0" Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.519 [INFO][4902] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:33:53.535873 containerd[1621]: 2025-12-12 17:33:53.519 [INFO][4902] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.76.72/26] IPv6=[] ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" HandleID="k8s-pod-network.5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Workload="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.536520 containerd[1621]: 2025-12-12 17:33:53.521 [INFO][4887] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0", GenerateName:"calico-apiserver-6d5697548f-", Namespace:"calico-apiserver", SelfLink:"", UID:"48d6a2b5-a80f-4de9-91aa-e8709a4fec3b", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d5697548f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"", Pod:"calico-apiserver-6d5697548f-q7brm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.76.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5f473979161", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:53.536520 containerd[1621]: 2025-12-12 17:33:53.521 [INFO][4887] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.76.72/32] ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.536520 containerd[1621]: 2025-12-12 17:33:53.521 [INFO][4887] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5f473979161 ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.536520 containerd[1621]: 2025-12-12 17:33:53.524 [INFO][4887] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.536520 containerd[1621]: 2025-12-12 
17:33:53.524 [INFO][4887] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0", GenerateName:"calico-apiserver-6d5697548f-", Namespace:"calico-apiserver", SelfLink:"", UID:"48d6a2b5-a80f-4de9-91aa-e8709a4fec3b", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 33, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d5697548f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-3-c846c80ac0", ContainerID:"5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497", Pod:"calico-apiserver-6d5697548f-q7brm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.76.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5f473979161", MAC:"5a:22:d2:f1:8c:4f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:33:53.536520 containerd[1621]: 2025-12-12 17:33:53.532 [INFO][4887] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" Namespace="calico-apiserver" Pod="calico-apiserver-6d5697548f-q7brm" WorkloadEndpoint="ci--4459--2--2--3--c846c80ac0-k8s-calico--apiserver--6d5697548f--q7brm-eth0" Dec 12 17:33:53.563126 kubelet[2861]: E1212 17:33:53.562867 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:33:53.563640 kubelet[2861]: E1212 17:33:53.563553 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" 
podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:33:53.570366 containerd[1621]: time="2025-12-12T17:33:53.570325045Z" level=info msg="connecting to shim 5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497" address="unix:///run/containerd/s/3ca7e81eb03be92503845f712de153e8d583158aed4ec4d8363aee49351c4c78" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:33:53.606252 systemd[1]: Started cri-containerd-5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497.scope - libcontainer container 5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497. Dec 12 17:33:53.639012 containerd[1621]: time="2025-12-12T17:33:53.638973754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d5697548f-q7brm,Uid:48d6a2b5-a80f-4de9-91aa-e8709a4fec3b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5596788c9ba84f93eaf61b2ff8d0d238131840fed58f791368ea1d8425f0d497\"" Dec 12 17:33:53.640595 containerd[1621]: time="2025-12-12T17:33:53.640569482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:33:53.959973 containerd[1621]: time="2025-12-12T17:33:53.959861264Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:53.961841 containerd[1621]: time="2025-12-12T17:33:53.961794794Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:33:53.961995 containerd[1621]: time="2025-12-12T17:33:53.961968955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:33:53.962197 kubelet[2861]: E1212 17:33:53.962158 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:33:53.962524 kubelet[2861]: E1212 17:33:53.962207 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:33:53.962524 kubelet[2861]: E1212 17:33:53.962362 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vh4dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:53.963620 kubelet[2861]: E1212 17:33:53.963569 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:33:54.310194 systemd-networkd[1525]: caliacfa78d5d81: Gained IPv6LL Dec 12 17:33:54.438550 systemd-networkd[1525]: calic7881e84fd5: Gained IPv6LL Dec 12 17:33:54.567767 kubelet[2861]: E1212 17:33:54.567293 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" 
podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:33:54.567767 kubelet[2861]: E1212 17:33:54.567624 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:33:54.568384 kubelet[2861]: E1212 17:33:54.567999 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:33:54.886307 systemd-networkd[1525]: cali5f473979161: Gained IPv6LL Dec 12 17:33:55.568841 kubelet[2861]: E1212 17:33:55.568775 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:33:58.422941 containerd[1621]: time="2025-12-12T17:33:58.422878016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:33:58.849803 containerd[1621]: time="2025-12-12T17:33:58.849745264Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:58.850956 containerd[1621]: time="2025-12-12T17:33:58.850905510Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:33:58.851022 containerd[1621]: time="2025-12-12T17:33:58.850964790Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:33:58.851174 kubelet[2861]: E1212 17:33:58.851138 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:33:58.851432 kubelet[2861]: E1212 17:33:58.851187 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:33:58.851432 kubelet[2861]: E1212 17:33:58.851297 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0baedf461ecb4cecaa52fed4d1b4bb46,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:58.853739 containerd[1621]: time="2025-12-12T17:33:58.853708364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:33:59.198433 containerd[1621]: time="2025-12-12T17:33:59.198306275Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:33:59.200348 containerd[1621]: time="2025-12-12T17:33:59.200299965Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:33:59.200422 containerd[1621]: time="2025-12-12T17:33:59.200394325Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:33:59.200933 kubelet[2861]: E1212 17:33:59.200647 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:33:59.200933 kubelet[2861]: E1212 17:33:59.200699 2861 kuberuntime_image.go:42] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:33:59.200933 kubelet[2861]: E1212 17:33:59.200868 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:33:59.202423 kubelet[2861]: E1212 17:33:59.202024 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:34:04.420153 
containerd[1621]: time="2025-12-12T17:34:04.420107721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:34:04.771918 containerd[1621]: time="2025-12-12T17:34:04.771842508Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:04.773319 containerd[1621]: time="2025-12-12T17:34:04.773274915Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:34:04.773372 containerd[1621]: time="2025-12-12T17:34:04.773323435Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:34:04.773582 kubelet[2861]: E1212 17:34:04.773535 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:34:04.773916 kubelet[2861]: E1212 17:34:04.773595 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:34:04.773916 kubelet[2861]: E1212 17:34:04.773805 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:04.774289 containerd[1621]: time="2025-12-12T17:34:04.774258840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:34:05.115523 containerd[1621]: time="2025-12-12T17:34:05.115300932Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:05.117118 containerd[1621]: time="2025-12-12T17:34:05.117071941Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:34:05.117184 containerd[1621]: time="2025-12-12T17:34:05.117122942Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:34:05.117346 kubelet[2861]: E1212 17:34:05.117302 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:34:05.117399 kubelet[2861]: E1212 17:34:05.117356 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:34:05.117988 kubelet[2861]: E1212 17:34:05.117598 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szn5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:05.118116 containerd[1621]: time="2025-12-12T17:34:05.117830465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:34:05.119125 kubelet[2861]: E1212 17:34:05.119057 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:34:05.466408 containerd[1621]: time="2025-12-12T17:34:05.466232715Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:05.467636 containerd[1621]: time="2025-12-12T17:34:05.467595562Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:34:05.467745 containerd[1621]: time="2025-12-12T17:34:05.467638002Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:34:05.467884 kubelet[2861]: E1212 17:34:05.467830 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:34:05.467935 kubelet[2861]: E1212 17:34:05.467885 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:34:05.468061 kubelet[2861]: E1212 17:34:05.468017 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:05.469637 kubelet[2861]: E1212 17:34:05.469562 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:34:07.419860 containerd[1621]: time="2025-12-12T17:34:07.419770599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:34:07.768309 containerd[1621]: time="2025-12-12T17:34:07.768191369Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:07.769913 containerd[1621]: time="2025-12-12T17:34:07.769863017Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:34:07.769989 containerd[1621]: time="2025-12-12T17:34:07.769973618Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:34:07.770292 kubelet[2861]: E1212 17:34:07.770118 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:34:07.770292 kubelet[2861]: E1212 17:34:07.770168 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:34:07.771015 kubelet[2861]: E1212 17:34:07.770425 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59pph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:07.771149 containerd[1621]: time="2025-12-12T17:34:07.770563901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:34:07.771709 kubelet[2861]: E1212 17:34:07.771665 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:34:08.110067 containerd[1621]: time="2025-12-12T17:34:08.109795264Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:08.111354 containerd[1621]: time="2025-12-12T17:34:08.111317632Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:34:08.111425 containerd[1621]: time="2025-12-12T17:34:08.111350232Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:34:08.111798 kubelet[2861]: E1212 17:34:08.111560 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:08.111798 kubelet[2861]: E1212 17:34:08.111606 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:08.111798 kubelet[2861]: E1212 17:34:08.111744 2861 kuberuntime_manager.go:1358] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vh4dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:08.113000 kubelet[2861]: E1212 17:34:08.112956 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:34:10.421112 containerd[1621]: time="2025-12-12T17:34:10.420962685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:34:10.422645 kubelet[2861]: E1212 17:34:10.422527 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:34:10.781593 containerd[1621]: time="2025-12-12T17:34:10.781521996Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:10.782871 containerd[1621]: time="2025-12-12T17:34:10.782834723Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:34:10.782987 containerd[1621]: time="2025-12-12T17:34:10.782912123Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:34:10.783100 kubelet[2861]: E1212 17:34:10.783064 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:10.783163 kubelet[2861]: E1212 17:34:10.783112 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:10.783280 kubelet[2861]: E1212 17:34:10.783238 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khvbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:10.784437 kubelet[2861]: E1212 17:34:10.784393 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:34:18.420847 kubelet[2861]: E1212 17:34:18.420802 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:34:18.422043 kubelet[2861]: E1212 17:34:18.422011 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:34:19.420754 kubelet[2861]: E1212 17:34:19.420680 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:34:22.420328 kubelet[2861]: E1212 17:34:22.420278 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:34:24.419970 kubelet[2861]: E1212 17:34:24.419861 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:34:24.421277 containerd[1621]: time="2025-12-12T17:34:24.421227764Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:34:24.761047 containerd[1621]: time="2025-12-12T17:34:24.760999810Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:24.763343 containerd[1621]: time="2025-12-12T17:34:24.763294862Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:34:24.763458 containerd[1621]: time="2025-12-12T17:34:24.763390422Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:34:24.763606 kubelet[2861]: E1212 17:34:24.763564 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:34:24.763660 kubelet[2861]: E1212 17:34:24.763629 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:34:24.763905 kubelet[2861]: E1212 17:34:24.763842 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0baedf461ecb4cecaa52fed4d1b4bb46,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:24.765858 containerd[1621]: time="2025-12-12T17:34:24.765784355Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:34:25.101796 containerd[1621]: time="2025-12-12T17:34:25.101466940Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:25.102777 containerd[1621]: time="2025-12-12T17:34:25.102739306Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:34:25.102880 containerd[1621]: time="2025-12-12T17:34:25.102826947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:34:25.103086 kubelet[2861]: E1212 17:34:25.103024 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:34:25.103136 kubelet[2861]: E1212 17:34:25.103100 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:34:25.103260 kubelet[2861]: E1212 17:34:25.103222 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:25.104399 kubelet[2861]: E1212 17:34:25.104354 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:34:31.421073 containerd[1621]: time="2025-12-12T17:34:31.420234238Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:34:31.752977 containerd[1621]: time="2025-12-12T17:34:31.752798488Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:31.754463 containerd[1621]: time="2025-12-12T17:34:31.754387776Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:34:31.754546 containerd[1621]: time="2025-12-12T17:34:31.754429896Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:34:31.754866 kubelet[2861]: E1212 17:34:31.754812 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:34:31.756204 kubelet[2861]: E1212 17:34:31.754875 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:34:31.756204 kubelet[2861]: E1212 17:34:31.755590 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:31.757314 containerd[1621]: time="2025-12-12T17:34:31.755447941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:34:32.097155 containerd[1621]: time="2025-12-12T17:34:32.097018076Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:32.099073 containerd[1621]: time="2025-12-12T17:34:32.099028607Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:34:32.099262 containerd[1621]: time="2025-12-12T17:34:32.099071767Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:34:32.099296 kubelet[2861]: E1212 17:34:32.099261 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:34:32.099339 kubelet[2861]: E1212 17:34:32.099308 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:34:32.099708 kubelet[2861]: E1212 17:34:32.099647 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59pph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:32.100313 containerd[1621]: time="2025-12-12T17:34:32.100283373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:34:32.101764 kubelet[2861]: E1212 17:34:32.101692 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:34:32.415151 containerd[1621]: time="2025-12-12T17:34:32.414898051Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:32.416310 containerd[1621]: time="2025-12-12T17:34:32.416267378Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:34:32.416389 containerd[1621]: time="2025-12-12T17:34:32.416349379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:34:32.416585 kubelet[2861]: E1212 17:34:32.416498 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:34:32.416631 kubelet[2861]: E1212 17:34:32.416599 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:34:32.417210 kubelet[2861]: E1212 17:34:32.416724 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:32.418767 kubelet[2861]: E1212 17:34:32.418721 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:34:32.420643 containerd[1621]: time="2025-12-12T17:34:32.420608880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:34:32.767966 containerd[1621]: time="2025-12-12T17:34:32.767869204Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:32.769538 containerd[1621]: time="2025-12-12T17:34:32.769500733Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:34:32.769610 containerd[1621]: time="2025-12-12T17:34:32.769576813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:34:32.769776 kubelet[2861]: E1212 17:34:32.769722 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:34:32.769776 kubelet[2861]: E1212 17:34:32.769774 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:34:32.770050 kubelet[2861]: E1212 17:34:32.769907 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szn5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:32.771153 kubelet[2861]: E1212 17:34:32.771045 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:34:33.419532 containerd[1621]: time="2025-12-12T17:34:33.419490714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:34:33.785312 containerd[1621]: time="2025-12-12T17:34:33.785225572Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:33.786723 containerd[1621]: time="2025-12-12T17:34:33.786675460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:34:33.786807 containerd[1621]: time="2025-12-12T17:34:33.786723380Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:34:33.786976 kubelet[2861]: E1212 17:34:33.786877 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:33.787385 kubelet[2861]: E1212 17:34:33.786988 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:33.787385 kubelet[2861]: 
E1212 17:34:33.787118 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vh4dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:33.788322 kubelet[2861]: E1212 17:34:33.788265 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:34:38.421542 kubelet[2861]: E1212 17:34:38.421442 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", 
failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:34:38.422831 containerd[1621]: time="2025-12-12T17:34:38.421584724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:34:38.784858 containerd[1621]: time="2025-12-12T17:34:38.784815650Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:34:38.786296 containerd[1621]: time="2025-12-12T17:34:38.786257617Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:34:38.786375 containerd[1621]: time="2025-12-12T17:34:38.786356617Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:34:38.786545 kubelet[2861]: E1212 17:34:38.786505 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:38.786596 kubelet[2861]: E1212 17:34:38.786560 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:34:38.786795 kubelet[2861]: E1212 17:34:38.786686 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khvbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:34:38.788153 kubelet[2861]: E1212 17:34:38.788101 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:34:43.420022 kubelet[2861]: E1212 17:34:43.419792 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:34:45.419634 kubelet[2861]: E1212 17:34:45.419589 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:34:46.420885 kubelet[2861]: E1212 17:34:46.420840 
2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:34:48.419976 kubelet[2861]: E1212 17:34:48.419844 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:34:52.423042 kubelet[2861]: E1212 17:34:52.422986 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:34:53.420623 kubelet[2861]: E1212 17:34:53.420567 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:34:55.421422 kubelet[2861]: E1212 17:34:55.421369 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:34:57.419629 kubelet[2861]: E1212 17:34:57.419577 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:34:59.419697 kubelet[2861]: E1212 17:34:59.419631 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:35:02.420519 kubelet[2861]: E1212 17:35:02.420449 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:35:05.419424 kubelet[2861]: E1212 17:35:05.419365 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:35:07.420741 kubelet[2861]: E1212 17:35:07.420673 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to 
\"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:35:07.421181 containerd[1621]: time="2025-12-12T17:35:07.420703756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:35:07.771232 containerd[1621]: time="2025-12-12T17:35:07.771052736Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:07.773084 containerd[1621]: time="2025-12-12T17:35:07.773041866Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:35:07.773084 containerd[1621]: time="2025-12-12T17:35:07.773044026Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:35:07.773930 kubelet[2861]: E1212 17:35:07.773294 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:35:07.773930 kubelet[2861]: E1212 17:35:07.773339 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:35:07.773930 kubelet[2861]: E1212 17:35:07.773473 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0baedf461ecb4cecaa52fed4d1b4bb46,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:07.776208 containerd[1621]: time="2025-12-12T17:35:07.776179082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:35:08.145522 containerd[1621]: time="2025-12-12T17:35:08.144809555Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:08.146615 containerd[1621]: time="2025-12-12T17:35:08.146538963Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:35:08.146687 containerd[1621]: time="2025-12-12T17:35:08.146616884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:35:08.146820 kubelet[2861]: E1212 17:35:08.146749 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:35:08.146820 kubelet[2861]: E1212 17:35:08.146814 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:35:08.147898 kubelet[2861]: E1212 17:35:08.146927 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:08.148414 kubelet[2861]: E1212 17:35:08.148373 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:35:08.420744 kubelet[2861]: E1212 17:35:08.420160 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:35:11.419675 kubelet[2861]: E1212 17:35:11.419594 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:35:13.420324 containerd[1621]: time="2025-12-12T17:35:13.420290873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:35:13.449690 update_engine[1601]: I20251212 17:35:13.449594 1601 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 12 17:35:13.450043 update_engine[1601]: I20251212 17:35:13.449694 1601 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 12 17:35:13.450043 update_engine[1601]: I20251212 17:35:13.450026 1601 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450407 1601 omaha_request_params.cc:62] Current group set to stable Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450505 1601 update_attempter.cc:499] Already updated boot flags. Skipping. Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450515 1601 update_attempter.cc:643] Scheduling an action processor start. Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450531 1601 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450554 1601 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450599 1601 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 12 17:35:13.450603 update_engine[1601]: I20251212 17:35:13.450606 1601 omaha_request_action.cc:272] Request: Dec 12 17:35:13.450603 update_engine[1601]: [Omaha request XML body not captured] Dec 12 17:35:13.450880 update_engine[1601]: I20251212 17:35:13.450613 1601 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:35:13.451544 locksmithd[1646]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 12 17:35:13.452408 update_engine[1601]: I20251212 17:35:13.452348 1601 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:35:13.453306 update_engine[1601]: I20251212 17:35:13.453260 1601 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
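The Omaha check above is posted to the literal host name "disabled" (typically the SERVER value written to /etc/flatcar/update.conf when update checks are switched off), so the "Could not resolve host: disabled" retry that follows is expected behavior rather than a network fault. The surrounding Calico pull failures are a different matter: before pulling, containerd resolves each reference by requesting the tag's manifest from the registry's OCI distribution endpoint, and ghcr.io answers 404 Not Found because no v3.30.4 tag exists under ghcr.io/flatcar/calico/*. The sketch below reproduces that resolution step outside the node; it is a minimal illustration in Python, assuming anonymous pull access to ghcr.io, with the repository names and tag taken verbatim from the errors in this log.

    import json
    import urllib.error
    import urllib.request

    # Repositories and tag exactly as they appear in the kubelet/containerd errors.
    REPOS = [
        "flatcar/calico/apiserver",
        "flatcar/calico/whisker",
        "flatcar/calico/whisker-backend",
        "flatcar/calico/csi",
        "flatcar/calico/node-driver-registrar",
        "flatcar/calico/goldmane",
        "flatcar/calico/kube-controllers",
    ]
    TAG = "v3.30.4"

    def tag_exists(repo: str, tag: str) -> bool:
        # ghcr.io hands out anonymous bearer tokens for public pulls.
        token_url = "https://ghcr.io/token?scope=repository:%s:pull" % repo
        with urllib.request.urlopen(token_url) as resp:
            token = json.load(resp)["token"]
        # A HEAD on the manifest is the same "resolve reference" step containerd performs.
        req = urllib.request.Request(
            "https://ghcr.io/v2/%s/manifests/%s" % (repo, tag),
            headers={
                "Authorization": "Bearer " + token,
                "Accept": ", ".join([
                    "application/vnd.oci.image.index.v1+json",
                    "application/vnd.docker.distribution.manifest.list.v2+json",
                    "application/vnd.oci.image.manifest.v1+json",
                    "application/vnd.docker.distribution.manifest.v2+json",
                ]),
            },
            method="HEAD",
        )
        try:
            urllib.request.urlopen(req)
            return True
        except urllib.error.HTTPError as err:
            if err.code == 404:  # the registry-side "not found" seen in the log
                return False
            raise

    for repo in REPOS:
        status = "resolves" if tag_exists(repo, TAG) else "missing tag " + TAG
        print(repo + ": " + status)

Until that tag is published to those repositories (or the pods are pointed at a tag that exists), kubelet will keep alternating between ErrImagePull on each fresh attempt and ImagePullBackOff while it waits out the growing backoff window, which is exactly the cycle the rest of this log records.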
Dec 12 17:35:13.459273 update_engine[1601]: E20251212 17:35:13.459208 1601 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:35:13.459357 update_engine[1601]: I20251212 17:35:13.459311 1601 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 12 17:35:13.758333 containerd[1621]: time="2025-12-12T17:35:13.758255910Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:13.760659 containerd[1621]: time="2025-12-12T17:35:13.760601442Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:35:13.760725 containerd[1621]: time="2025-12-12T17:35:13.760703763Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:35:13.760894 kubelet[2861]: E1212 17:35:13.760834 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:35:13.761621 kubelet[2861]: E1212 17:35:13.760901 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:35:13.761621 kubelet[2861]: E1212 17:35:13.761098 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szn5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:13.762290 kubelet[2861]: E1212 17:35:13.762254 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:35:19.420771 containerd[1621]: time="2025-12-12T17:35:19.420721315Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:35:19.751837 containerd[1621]: time="2025-12-12T17:35:19.751792637Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:19.753979 containerd[1621]: time="2025-12-12T17:35:19.753545566Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:35:19.753979 containerd[1621]: time="2025-12-12T17:35:19.753639606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:35:19.754111 kubelet[2861]: E1212 17:35:19.753806 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:35:19.754111 kubelet[2861]: E1212 17:35:19.753890 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:35:19.754414 kubelet[2861]: E1212 17:35:19.754169 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khvbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:19.755384 kubelet[2861]: E1212 17:35:19.755333 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:35:21.420132 kubelet[2861]: E1212 17:35:21.420076 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:35:22.422343 containerd[1621]: time="2025-12-12T17:35:22.422308323Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:35:23.017648 containerd[1621]: time="2025-12-12T17:35:23.017475226Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:23.019139 containerd[1621]: time="2025-12-12T17:35:23.019038834Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:35:23.019139 containerd[1621]: time="2025-12-12T17:35:23.019116874Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:35:23.019293 kubelet[2861]: E1212 17:35:23.019245 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:35:23.019534 kubelet[2861]: E1212 17:35:23.019297 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:35:23.019534 kubelet[2861]: E1212 17:35:23.019426 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:23.021629 containerd[1621]: time="2025-12-12T17:35:23.021601567Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:35:23.354636 containerd[1621]: time="2025-12-12T17:35:23.354529818Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:23.356618 containerd[1621]: time="2025-12-12T17:35:23.356562228Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:35:23.356678 containerd[1621]: time="2025-12-12T17:35:23.356655429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:35:23.357127 kubelet[2861]: E1212 17:35:23.357084 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:35:23.357186 kubelet[2861]: E1212 17:35:23.357140 2861 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:35:23.357300 kubelet[2861]: E1212 17:35:23.357263 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:23.358619 kubelet[2861]: E1212 17:35:23.358575 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:35:23.419704 containerd[1621]: time="2025-12-12T17:35:23.419655349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:35:23.441550 update_engine[1601]: I20251212 17:35:23.441478 1601 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:35:23.441870 update_engine[1601]: I20251212 17:35:23.441567 1601 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:35:23.442966 update_engine[1601]: I20251212 17:35:23.442142 1601 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 12 17:35:23.448626 update_engine[1601]: E20251212 17:35:23.448566 1601 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:35:23.448708 update_engine[1601]: I20251212 17:35:23.448657 1601 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 12 17:35:23.745261 containerd[1621]: time="2025-12-12T17:35:23.745174843Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:23.746590 containerd[1621]: time="2025-12-12T17:35:23.746542849Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:35:23.746632 containerd[1621]: time="2025-12-12T17:35:23.746621130Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:35:23.746867 kubelet[2861]: E1212 17:35:23.746760 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:35:23.746867 kubelet[2861]: E1212 17:35:23.746820 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:35:23.747168 kubelet[2861]: E1212 17:35:23.747103 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vh4dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:23.748309 kubelet[2861]: E1212 17:35:23.748275 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:35:24.419629 kubelet[2861]: E1212 17:35:24.419571 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:35:26.421071 containerd[1621]: time="2025-12-12T17:35:26.421011715Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:35:26.768879 containerd[1621]: time="2025-12-12T17:35:26.768711162Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:35:27.044807 containerd[1621]: time="2025-12-12T17:35:27.044677924Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:35:27.044807 containerd[1621]: time="2025-12-12T17:35:27.044745724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:35:27.044967 kubelet[2861]: E1212 17:35:27.044890 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:35:27.045367 kubelet[2861]: E1212 17:35:27.044966 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:35:27.045367 kubelet[2861]: E1212 17:35:27.045094 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59pph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:35:27.046587 kubelet[2861]: E1212 17:35:27.046521 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:35:29.353824 systemd[1]: Started sshd@9-10.0.8.78:22-147.75.109.163:60486.service - OpenSSH per-connection server daemon (147.75.109.163:60486). Dec 12 17:35:30.342432 sshd[5116]: Accepted publickey for core from 147.75.109.163 port 60486 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:35:30.343733 sshd-session[5116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:35:30.348572 systemd-logind[1599]: New session 10 of user core. Dec 12 17:35:30.356151 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 12 17:35:31.108045 sshd[5119]: Connection closed by 147.75.109.163 port 60486 Dec 12 17:35:31.108398 sshd-session[5116]: pam_unix(sshd:session): session closed for user core Dec 12 17:35:31.111996 systemd[1]: sshd@9-10.0.8.78:22-147.75.109.163:60486.service: Deactivated successfully. Dec 12 17:35:31.115968 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 17:35:31.117014 systemd-logind[1599]: Session 10 logged out. Waiting for processes to exit. Dec 12 17:35:31.118242 systemd-logind[1599]: Removed session 10. Dec 12 17:35:33.441055 update_engine[1601]: I20251212 17:35:33.440984 1601 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:35:33.441444 update_engine[1601]: I20251212 17:35:33.441078 1601 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:35:33.441444 update_engine[1601]: I20251212 17:35:33.441413 1601 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
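All of the pull failures in this section share one root cause: ghcr.io answers 404 Not Found for the v3.30.4 tag of every ghcr.io/flatcar/calico/* image, i.e. that tag is simply not published under the mirror namespace. One way to confirm this outside the cluster is a manifest request against the standard OCI distribution API; the sketch below assumes ghcr.io's Docker-v2-style anonymous token endpoint for public images:

import json
import urllib.error
import urllib.request

def tag_exists(repo: str, tag: str) -> bool:
    """Check repo:tag on ghcr.io via the OCI distribution API;
    a 404 here is the same answer containerd is getting above."""
    # Anonymous pull token (Docker registry v2 token flow).
    with urllib.request.urlopen(
            f"https://ghcr.io/token?scope=repository:{repo}:pull") as resp:
        token = json.load(resp)["token"]
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}", method="HEAD",
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/vnd.oci.image.index.v1+json"})
    try:
        urllib.request.urlopen(req).close()
        return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

print(tag_exists("flatcar/calico/kube-controllers", "v3.30.4"))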
Dec 12 17:35:33.447591 update_engine[1601]: E20251212 17:35:33.447533 1601 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:35:33.447697 update_engine[1601]: I20251212 17:35:33.447636 1601 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 12 17:35:34.420512 kubelet[2861]: E1212 17:35:34.420050 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:35:34.422496 kubelet[2861]: E1212 17:35:34.422438 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:35:35.420509 kubelet[2861]: E1212 17:35:35.420438 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:35:35.420509 kubelet[2861]: E1212 17:35:35.420456 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:35:36.298583 systemd[1]: Started sshd@10-10.0.8.78:22-147.75.109.163:57036.service - OpenSSH per-connection server daemon (147.75.109.163:57036). 
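By this point kubelet has shifted every affected container from ErrImagePull (a pull was actually attempted and got the 404) to ImagePullBackOff (kubelet is waiting out a delay before the next attempt), which is why the same "Error syncing pod, skipping" lines recur first seconds and then minutes apart. A sketch of that escalation, assuming the documented kubelet policy of doubling the delay per failed pull up to a five-minute cap:

def backoff_delays(initial: float = 10.0, cap: float = 300.0):
    """Yield image-pull back-off delays: doubling per failed pull,
    capped at five minutes (the documented kubelet behaviour)."""
    delay = initial
    while True:
        yield min(delay, cap)
        delay *= 2

waits = backoff_delays()
print([next(waits) for _ in range(6)])  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0]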
Dec 12 17:35:37.379440 sshd[5135]: Accepted publickey for core from 147.75.109.163 port 57036 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:35:37.380727 sshd-session[5135]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:35:37.384567 systemd-logind[1599]: New session 11 of user core. Dec 12 17:35:37.394135 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 12 17:35:37.422551 kubelet[2861]: E1212 17:35:37.422500 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:35:38.171984 sshd[5138]: Connection closed by 147.75.109.163 port 57036 Dec 12 17:35:38.171858 sshd-session[5135]: pam_unix(sshd:session): session closed for user core Dec 12 17:35:38.175493 systemd-logind[1599]: Session 11 logged out. Waiting for processes to exit. Dec 12 17:35:38.175610 systemd[1]: sshd@10-10.0.8.78:22-147.75.109.163:57036.service: Deactivated successfully. Dec 12 17:35:38.178465 systemd[1]: session-11.scope: Deactivated successfully. Dec 12 17:35:38.179751 systemd-logind[1599]: Removed session 11. Dec 12 17:35:38.421635 kubelet[2861]: E1212 17:35:38.420939 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:35:43.345480 systemd[1]: Started sshd@11-10.0.8.78:22-147.75.109.163:56510.service - OpenSSH per-connection server daemon (147.75.109.163:56510). Dec 12 17:35:43.442989 update_engine[1601]: I20251212 17:35:43.442382 1601 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:35:43.442989 update_engine[1601]: I20251212 17:35:43.442525 1601 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:35:43.442989 update_engine[1601]: I20251212 17:35:43.442901 1601 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 12 17:35:43.449420 update_engine[1601]: E20251212 17:35:43.449388 1601 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:35:43.449576 update_engine[1601]: I20251212 17:35:43.449556 1601 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449620 1601 omaha_request_action.cc:617] Omaha request response: Dec 12 17:35:43.450190 update_engine[1601]: E20251212 17:35:43.449699 1601 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449714 1601 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449719 1601 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449723 1601 update_attempter.cc:306] Processing Done. Dec 12 17:35:43.450190 update_engine[1601]: E20251212 17:35:43.449736 1601 update_attempter.cc:619] Update failed. Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449740 1601 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449744 1601 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449750 1601 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449824 1601 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449846 1601 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449851 1601 omaha_request_action.cc:272] Request: Dec 12 17:35:43.450190 update_engine[1601]: [Omaha request XML body not captured in this extract] Dec 12 17:35:43.450190 update_engine[1601]: I20251212 17:35:43.449857 1601 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:35:43.450595 update_engine[1601]: I20251212 17:35:43.449872 1601 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:35:43.450595 update_engine[1601]: I20251212 17:35:43.450150 1601 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Dec 12 17:35:43.450635 locksmithd[1646]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 12 17:35:43.457620 update_engine[1601]: E20251212 17:35:43.457533 1601 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457628 1601 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457638 1601 omaha_request_action.cc:617] Omaha request response: Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457643 1601 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457648 1601 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457652 1601 update_attempter.cc:306] Processing Done. Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457657 1601 update_attempter.cc:310] Error event sent. Dec 12 17:35:43.457701 update_engine[1601]: I20251212 17:35:43.457666 1601 update_check_scheduler.cc:74] Next update check in 44m6s Dec 12 17:35:43.458159 locksmithd[1646]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 12 17:35:44.359969 sshd[5180]: Accepted publickey for core from 147.75.109.163 port 56510 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:35:44.361775 sshd-session[5180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:35:44.368052 systemd-logind[1599]: New session 12 of user core. Dec 12 17:35:44.376142 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 12 17:35:45.114833 sshd[5183]: Connection closed by 147.75.109.163 port 56510 Dec 12 17:35:45.115167 sshd-session[5180]: pam_unix(sshd:session): session closed for user core Dec 12 17:35:45.119844 systemd[1]: sshd@11-10.0.8.78:22-147.75.109.163:56510.service: Deactivated successfully. Dec 12 17:35:45.122122 systemd[1]: session-12.scope: Deactivated successfully. Dec 12 17:35:45.123456 systemd-logind[1599]: Session 12 logged out. Waiting for processes to exit. Dec 12 17:35:45.127129 systemd-logind[1599]: Removed session 12. Dec 12 17:35:45.278336 systemd[1]: Started sshd@12-10.0.8.78:22-147.75.109.163:56518.service - OpenSSH per-connection server daemon (147.75.109.163:56518). Dec 12 17:35:46.248055 sshd[5197]: Accepted publickey for core from 147.75.109.163 port 56518 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:35:46.249392 sshd-session[5197]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:35:46.255864 systemd-logind[1599]: New session 13 of user core. Dec 12 17:35:46.265267 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 17:35:47.029168 sshd[5200]: Connection closed by 147.75.109.163 port 56518 Dec 12 17:35:47.029627 sshd-session[5197]: pam_unix(sshd:session): session closed for user core Dec 12 17:35:47.034367 systemd[1]: sshd@12-10.0.8.78:22-147.75.109.163:56518.service: Deactivated successfully. Dec 12 17:35:47.036566 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 17:35:47.038307 systemd-logind[1599]: Session 13 logged out. Waiting for processes to exit. 
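The update_engine excerpt above completes one failed check cycle: an initial fetch plus three retries roughly ten seconds apart ("No HTTP response, retry 1..3"), then the transfer is abandoned with 0 bytes downloaded, the failure is mapped to error code 37 (kActionCodeOmahaErrorInHTTPResponse), an error event is posted to the same unresolvable host, and the next check is scheduled 44m6s out. A schematic reconstruction of that control flow, not update_engine's actual code:

import time

def omaha_check(fetch, max_retries: int = 3, retry_delay: float = 10.0):
    """Mirror the logged flow: an initial fetch plus up to three
    retries ~10 s apart, then give up and report the error event."""
    for attempt in range(max_retries + 1):
        try:
            return fetch()
        except OSError as err:  # e.g. DNS failure on host "disabled"
            if attempt < max_retries:
                print(f"No HTTP response, retry {attempt + 1}: {err}")
                time.sleep(retry_delay)
    print("Transfer resulted in an error (0), 0 bytes downloaded; error event sent")
    return None  # caller then reschedules: "Next update check in 44m6s"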
Dec 12 17:35:47.039403 systemd-logind[1599]: Removed session 13. Dec 12 17:35:47.196055 systemd[1]: Started sshd@13-10.0.8.78:22-147.75.109.163:56534.service - OpenSSH per-connection server daemon (147.75.109.163:56534). Dec 12 17:35:47.420826 kubelet[2861]: E1212 17:35:47.420701 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:35:48.154814 sshd[5212]: Accepted publickey for core from 147.75.109.163 port 56534 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:35:48.156059 sshd-session[5212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:35:48.160245 systemd-logind[1599]: New session 14 of user core. Dec 12 17:35:48.166486 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 12 17:35:48.422455 kubelet[2861]: E1212 17:35:48.422289 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:35:48.423621 kubelet[2861]: E1212 17:35:48.423173 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:35:48.876955 sshd[5215]: Connection closed by 147.75.109.163 port 56534 Dec 12 17:35:48.877518 sshd-session[5212]: 
pam_unix(sshd:session): session closed for user core Dec 12 17:35:48.881169 systemd[1]: sshd@13-10.0.8.78:22-147.75.109.163:56534.service: Deactivated successfully. Dec 12 17:35:48.885239 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 17:35:48.885941 systemd-logind[1599]: Session 14 logged out. Waiting for processes to exit. Dec 12 17:35:48.887234 systemd-logind[1599]: Removed session 14. Dec 12 17:35:49.420140 kubelet[2861]: E1212 17:35:49.419807 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:35:49.420140 kubelet[2861]: E1212 17:35:49.420083 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:35:53.420049 kubelet[2861]: E1212 17:35:53.419931 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:35:54.049456 systemd[1]: Started sshd@14-10.0.8.78:22-147.75.109.163:59982.service - OpenSSH per-connection server daemon (147.75.109.163:59982). Dec 12 17:35:55.034258 sshd[5232]: Accepted publickey for core from 147.75.109.163 port 59982 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:35:55.035219 sshd-session[5232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:35:55.044854 systemd-logind[1599]: New session 15 of user core. Dec 12 17:35:55.053224 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 12 17:35:55.787849 sshd[5235]: Connection closed by 147.75.109.163 port 59982 Dec 12 17:35:55.788446 sshd-session[5232]: pam_unix(sshd:session): session closed for user core Dec 12 17:35:55.792166 systemd[1]: sshd@14-10.0.8.78:22-147.75.109.163:59982.service: Deactivated successfully. Dec 12 17:35:55.793812 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 17:35:55.794585 systemd-logind[1599]: Session 15 logged out. Waiting for processes to exit. Dec 12 17:35:55.795778 systemd-logind[1599]: Removed session 15. 
Dec 12 17:35:58.420584 kubelet[2861]: E1212 17:35:58.420495 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" Dec 12 17:36:00.972584 systemd[1]: Started sshd@15-10.0.8.78:22-147.75.109.163:59988.service - OpenSSH per-connection server daemon (147.75.109.163:59988). Dec 12 17:36:01.420792 kubelet[2861]: E1212 17:36:01.420668 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e" Dec 12 17:36:02.016734 sshd[5248]: Accepted publickey for core from 147.75.109.163 port 59988 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:36:02.018032 sshd-session[5248]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:36:02.021647 systemd-logind[1599]: New session 16 of user core. Dec 12 17:36:02.034150 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 17:36:02.809660 sshd[5252]: Connection closed by 147.75.109.163 port 59988 Dec 12 17:36:02.809931 sshd-session[5248]: pam_unix(sshd:session): session closed for user core Dec 12 17:36:02.816040 systemd[1]: sshd@15-10.0.8.78:22-147.75.109.163:59988.service: Deactivated successfully. Dec 12 17:36:02.817832 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 17:36:02.818979 systemd-logind[1599]: Session 16 logged out. Waiting for processes to exit. Dec 12 17:36:02.821568 systemd-logind[1599]: Removed session 16. 
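Because kubelet re-logs every back-off, the journal reads noisier than the underlying fault, which is a single bad version pin repeated across seven images. Deduplicating the image references makes that obvious; a short sketch over text like this section (pipe the journal output into it):

import re
import sys

# Collect the distinct image references that failed to pull.
pattern = re.compile(r"ghcr\.io/flatcar/calico/[\w-]+:v[\d.]+")
for ref in sorted(set(pattern.findall(sys.stdin.read()))):
    print(ref)
# Against this section: apiserver, csi, goldmane, kube-controllers,
# node-driver-registrar, whisker and whisker-backend, all at v3.30.4.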
Dec 12 17:36:03.420985 kubelet[2861]: E1212 17:36:03.420544 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80" Dec 12 17:36:03.421593 kubelet[2861]: E1212 17:36:03.421086 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d" Dec 12 17:36:04.420811 kubelet[2861]: E1212 17:36:04.420456 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b" Dec 12 17:36:04.423258 kubelet[2861]: E1212 17:36:04.423195 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b" Dec 12 17:36:07.975217 systemd[1]: Started sshd@16-10.0.8.78:22-147.75.109.163:47642.service - OpenSSH per-connection server daemon (147.75.109.163:47642). Dec 12 17:36:08.947303 sshd[5268]: Accepted publickey for core from 147.75.109.163 port 47642 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:36:08.948626 sshd-session[5268]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:36:08.952965 systemd-logind[1599]: New session 17 of user core. 
Dec 12 17:36:08.960116 systemd[1]: Started session-17.scope - Session 17 of User core.
Dec 12 17:36:09.693395 sshd[5271]: Connection closed by 147.75.109.163 port 47642
Dec 12 17:36:09.693804 sshd-session[5268]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:09.698831 systemd[1]: sshd@16-10.0.8.78:22-147.75.109.163:47642.service: Deactivated successfully.
Dec 12 17:36:09.701102 systemd[1]: session-17.scope: Deactivated successfully.
Dec 12 17:36:09.701827 systemd-logind[1599]: Session 17 logged out. Waiting for processes to exit.
Dec 12 17:36:09.703406 systemd-logind[1599]: Removed session 17.
Dec 12 17:36:09.858324 systemd[1]: Started sshd@17-10.0.8.78:22-147.75.109.163:47646.service - OpenSSH per-connection server daemon (147.75.109.163:47646).
Dec 12 17:36:10.422656 kubelet[2861]: E1212 17:36:10.422613 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"
Dec 12 17:36:10.823306 sshd[5285]: Accepted publickey for core from 147.75.109.163 port 47646 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:10.825155 sshd-session[5285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:10.832375 systemd-logind[1599]: New session 18 of user core.
Dec 12 17:36:10.839137 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 12 17:36:11.615309 sshd[5288]: Connection closed by 147.75.109.163 port 47646
Dec 12 17:36:11.615686 sshd-session[5285]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:11.619217 systemd[1]: sshd@17-10.0.8.78:22-147.75.109.163:47646.service: Deactivated successfully.
Dec 12 17:36:11.620958 systemd[1]: session-18.scope: Deactivated successfully.
Dec 12 17:36:11.622600 systemd-logind[1599]: Session 18 logged out. Waiting for processes to exit.
Dec 12 17:36:11.623704 systemd-logind[1599]: Removed session 18.
Dec 12 17:36:11.781534 systemd[1]: Started sshd@18-10.0.8.78:22-147.75.109.163:47658.service - OpenSSH per-connection server daemon (147.75.109.163:47658).
Dec 12 17:36:12.754028 sshd[5300]: Accepted publickey for core from 147.75.109.163 port 47658 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:12.755367 sshd-session[5300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:12.760025 systemd-logind[1599]: New session 19 of user core.
Dec 12 17:36:12.767163 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 12 17:36:13.420403 kubelet[2861]: E1212 17:36:13.420353 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e"
Dec 12 17:36:14.001686 sshd[5334]: Connection closed by 147.75.109.163 port 47658
Dec 12 17:36:14.001543 sshd-session[5300]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:14.009272 systemd[1]: sshd@18-10.0.8.78:22-147.75.109.163:47658.service: Deactivated successfully.
Dec 12 17:36:14.010969 systemd[1]: session-19.scope: Deactivated successfully.
Dec 12 17:36:14.016419 systemd-logind[1599]: Session 19 logged out. Waiting for processes to exit.
Dec 12 17:36:14.017849 systemd-logind[1599]: Removed session 19.
Dec 12 17:36:14.177689 systemd[1]: Started sshd@19-10.0.8.78:22-147.75.109.163:56880.service - OpenSSH per-connection server daemon (147.75.109.163:56880).
Dec 12 17:36:15.165707 sshd[5353]: Accepted publickey for core from 147.75.109.163 port 56880 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:15.172308 sshd-session[5353]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:15.178740 systemd-logind[1599]: New session 20 of user core.
Dec 12 17:36:15.186159 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 12 17:36:16.033199 sshd[5356]: Connection closed by 147.75.109.163 port 56880
Dec 12 17:36:16.034144 sshd-session[5353]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:16.039571 systemd-logind[1599]: Session 20 logged out. Waiting for processes to exit.
Dec 12 17:36:16.039702 systemd[1]: sshd@19-10.0.8.78:22-147.75.109.163:56880.service: Deactivated successfully.
Dec 12 17:36:16.041448 systemd[1]: session-20.scope: Deactivated successfully.
Dec 12 17:36:16.046898 systemd-logind[1599]: Removed session 20.
Dec 12 17:36:16.192229 systemd[1]: Started sshd@20-10.0.8.78:22-147.75.109.163:56896.service - OpenSSH per-connection server daemon (147.75.109.163:56896).
Dec 12 17:36:16.422797 kubelet[2861]: E1212 17:36:16.422668 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80"
Dec 12 17:36:17.160972 sshd[5367]: Accepted publickey for core from 147.75.109.163 port 56896 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:17.162205 sshd-session[5367]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:17.168389 systemd-logind[1599]: New session 21 of user core.
Dec 12 17:36:17.175452 systemd[1]: Started session-21.scope - Session 21 of User core.
Dec 12 17:36:17.421104 kubelet[2861]: E1212 17:36:17.420770 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d"
Dec 12 17:36:17.908008 sshd[5370]: Connection closed by 147.75.109.163 port 56896
Dec 12 17:36:17.908372 sshd-session[5367]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:17.914004 systemd-logind[1599]: Session 21 logged out. Waiting for processes to exit.
Dec 12 17:36:17.915581 systemd[1]: sshd@20-10.0.8.78:22-147.75.109.163:56896.service: Deactivated successfully.
Dec 12 17:36:17.917938 systemd[1]: session-21.scope: Deactivated successfully.
Dec 12 17:36:17.919402 systemd-logind[1599]: Removed session 21.
Dec 12 17:36:18.420348 kubelet[2861]: E1212 17:36:18.420305 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b"
Dec 12 17:36:19.419528 kubelet[2861]: E1212 17:36:19.419438 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b"
Dec 12 17:36:23.079453 systemd[1]: Started sshd@21-10.0.8.78:22-147.75.109.163:38520.service - OpenSSH per-connection server daemon (147.75.109.163:38520).
Dec 12 17:36:23.421182 kubelet[2861]: E1212 17:36:23.420471 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"
Dec 12 17:36:24.059939 sshd[5386]: Accepted publickey for core from 147.75.109.163 port 38520 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:24.061421 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:24.065596 systemd-logind[1599]: New session 22 of user core.
Dec 12 17:36:24.078270 systemd[1]: Started session-22.scope - Session 22 of User core.
Dec 12 17:36:25.052042 sshd[5395]: Connection closed by 147.75.109.163 port 38520
Dec 12 17:36:25.052794 sshd-session[5386]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:25.056296 systemd[1]: sshd@21-10.0.8.78:22-147.75.109.163:38520.service: Deactivated successfully.
Dec 12 17:36:25.058608 systemd[1]: session-22.scope: Deactivated successfully.
Dec 12 17:36:25.059364 systemd-logind[1599]: Session 22 logged out. Waiting for processes to exit.
Dec 12 17:36:25.060667 systemd-logind[1599]: Removed session 22.
Dec 12 17:36:28.420766 containerd[1621]: time="2025-12-12T17:36:28.420537786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 17:36:28.739755 containerd[1621]: time="2025-12-12T17:36:28.739689567Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:28.742023 containerd[1621]: time="2025-12-12T17:36:28.741956259Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 17:36:28.742023 containerd[1621]: time="2025-12-12T17:36:28.741995819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 17:36:28.742255 kubelet[2861]: E1212 17:36:28.742193 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:36:28.742721 kubelet[2861]: E1212 17:36:28.742261 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:36:28.742721 kubelet[2861]: E1212 17:36:28.742397 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0baedf461ecb4cecaa52fed4d1b4bb46,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:28.744393 containerd[1621]: time="2025-12-12T17:36:28.744365831Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 12 17:36:29.091109 containerd[1621]: time="2025-12-12T17:36:29.090832751Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:29.092799 containerd[1621]: time="2025-12-12T17:36:29.092742440Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 12 17:36:29.092799 containerd[1621]: time="2025-12-12T17:36:29.092835561Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:36:29.093131 kubelet[2861]: E1212 17:36:29.093037 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:36:29.093174 kubelet[2861]: E1212 17:36:29.093121 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:36:29.093338 kubelet[2861]: E1212 17:36:29.093289 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwpnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78745dcfb4-7t4w7_calico-system(c7e92205-dc44-4937-bdc7-2e9e83a2cc4e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:29.094491 kubelet[2861]: E1212 17:36:29.094439 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e"
Dec 12 17:36:30.218784 systemd[1]: Started sshd@22-10.0.8.78:22-147.75.109.163:38522.service - OpenSSH per-connection server daemon (147.75.109.163:38522).
Dec 12 17:36:30.420105 kubelet[2861]: E1212 17:36:30.420038 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b"
Dec 12 17:36:31.196978 sshd[5408]: Accepted publickey for core from 147.75.109.163 port 38522 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:31.199148 sshd-session[5408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:31.205329 systemd-logind[1599]: New session 23 of user core.
Dec 12 17:36:31.215119 systemd[1]: Started session-23.scope - Session 23 of User core.
Dec 12 17:36:31.420136 kubelet[2861]: E1212 17:36:31.420064 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80"
Dec 12 17:36:31.938465 sshd[5411]: Connection closed by 147.75.109.163 port 38522
Dec 12 17:36:31.938893 sshd-session[5408]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:31.943201 systemd[1]: sshd@22-10.0.8.78:22-147.75.109.163:38522.service: Deactivated successfully.
Dec 12 17:36:31.945188 systemd[1]: session-23.scope: Deactivated successfully.
Dec 12 17:36:31.946054 systemd-logind[1599]: Session 23 logged out. Waiting for processes to exit.
Dec 12 17:36:31.947731 systemd-logind[1599]: Removed session 23.
Dec 12 17:36:32.420267 kubelet[2861]: E1212 17:36:32.420143 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d"
Dec 12 17:36:32.420642 kubelet[2861]: E1212 17:36:32.420283 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b"
Dec 12 17:36:37.137532 systemd[1]: Started sshd@23-10.0.8.78:22-147.75.109.163:48684.service - OpenSSH per-connection server daemon (147.75.109.163:48684).
Dec 12 17:36:37.419891 kubelet[2861]: E1212 17:36:37.419749 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"
Dec 12 17:36:38.197446 sshd[5428]: Accepted publickey for core from 147.75.109.163 port 48684 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:38.199130 sshd-session[5428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:38.205285 systemd-logind[1599]: New session 24 of user core.
Dec 12 17:36:38.214145 systemd[1]: Started session-24.scope - Session 24 of User core.
Dec 12 17:36:38.984456 sshd[5431]: Connection closed by 147.75.109.163 port 48684
Dec 12 17:36:38.984818 sshd-session[5428]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:38.988425 systemd[1]: sshd@23-10.0.8.78:22-147.75.109.163:48684.service: Deactivated successfully.
Dec 12 17:36:38.990235 systemd[1]: session-24.scope: Deactivated successfully.
Dec 12 17:36:38.990906 systemd-logind[1599]: Session 24 logged out. Waiting for processes to exit.
Dec 12 17:36:38.992024 systemd-logind[1599]: Removed session 24.
Dec 12 17:36:39.422064 kubelet[2861]: E1212 17:36:39.421888 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e"
Dec 12 17:36:44.142603 systemd[1]: Started sshd@24-10.0.8.78:22-147.75.109.163:40006.service - OpenSSH per-connection server daemon (147.75.109.163:40006).
Dec 12 17:36:45.132541 sshd[5472]: Accepted publickey for core from 147.75.109.163 port 40006 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:36:45.133888 sshd-session[5472]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:36:45.138538 systemd-logind[1599]: New session 25 of user core.
Dec 12 17:36:45.148102 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 12 17:36:45.419763 containerd[1621]: time="2025-12-12T17:36:45.419637179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 17:36:45.759920 containerd[1621]: time="2025-12-12T17:36:45.759872467Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:45.763385 containerd[1621]: time="2025-12-12T17:36:45.763326645Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 17:36:45.763483 containerd[1621]: time="2025-12-12T17:36:45.763394205Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:36:45.763611 kubelet[2861]: E1212 17:36:45.763560 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:36:45.763931 kubelet[2861]: E1212 17:36:45.763616 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:36:45.763931 kubelet[2861]: E1212 17:36:45.763836 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vh4dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-q7brm_calico-apiserver(48d6a2b5-a80f-4de9-91aa-e8709a4fec3b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:45.764155 containerd[1621]: time="2025-12-12T17:36:45.763915208Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 17:36:45.765401 kubelet[2861]: E1212 17:36:45.765360 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b"
Dec 12 17:36:45.861456 sshd[5475]: Connection closed by 147.75.109.163 port 40006
Dec 12 17:36:45.861280 sshd-session[5472]: pam_unix(sshd:session): session closed for user core
Dec 12 17:36:45.865187 systemd[1]: sshd@24-10.0.8.78:22-147.75.109.163:40006.service: Deactivated successfully.
Dec 12 17:36:45.866868 systemd[1]: session-25.scope: Deactivated successfully.
Dec 12 17:36:45.867682 systemd-logind[1599]: Session 25 logged out. Waiting for processes to exit.
Dec 12 17:36:45.869810 systemd-logind[1599]: Removed session 25.
Dec 12 17:36:46.093117 containerd[1621]: time="2025-12-12T17:36:46.092844159Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:46.098615 containerd[1621]: time="2025-12-12T17:36:46.098462707Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 17:36:46.098615 containerd[1621]: time="2025-12-12T17:36:46.098575788Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:36:46.098981 kubelet[2861]: E1212 17:36:46.098733 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:36:46.098981 kubelet[2861]: E1212 17:36:46.098784 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:36:46.099151 kubelet[2861]: E1212 17:36:46.099063 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szn5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-698c665495-s6zxl_calico-system(ff7ea540-d740-4fa7-9163-b17e2194ee80): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:46.100409 kubelet[2861]: E1212 17:36:46.100369 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80"
Dec 12 17:36:46.421278 containerd[1621]: time="2025-12-12T17:36:46.421175187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Dec 12 17:36:46.767133 containerd[1621]: time="2025-12-12T17:36:46.767050504Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:46.769577 containerd[1621]: time="2025-12-12T17:36:46.769498196Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Dec 12 17:36:46.769675 containerd[1621]: time="2025-12-12T17:36:46.769505356Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69"
Dec 12 17:36:46.769855 kubelet[2861]: E1212 17:36:46.769819 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 17:36:46.770149 kubelet[2861]: E1212 17:36:46.769868 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 17:36:46.770149 kubelet[2861]: E1212 17:36:46.769998 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:46.772513 containerd[1621]: time="2025-12-12T17:36:46.772224490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Dec 12 17:36:47.118563 containerd[1621]: time="2025-12-12T17:36:47.118399449Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:47.119746 containerd[1621]: time="2025-12-12T17:36:47.119691375Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Dec 12 17:36:47.119810 containerd[1621]: time="2025-12-12T17:36:47.119766295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Dec 12 17:36:47.119954 kubelet[2861]: E1212 17:36:47.119901 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 17:36:47.120006 kubelet[2861]: E1212 17:36:47.119966 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 17:36:47.120124 kubelet[2861]: E1212 17:36:47.120089 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzf9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-j96bs_calico-system(d21fdd1b-5217-4220-a07c-5b154ce8fa0d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:47.121303 kubelet[2861]: E1212 17:36:47.121244 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d"
Dec 12 17:36:47.421042 containerd[1621]: time="2025-12-12T17:36:47.420420063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Dec 12 17:36:47.754146 containerd[1621]: time="2025-12-12T17:36:47.754091478Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:47.758736 containerd[1621]: time="2025-12-12T17:36:47.758649861Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 12 17:36:47.758866 containerd[1621]: time="2025-12-12T17:36:47.758759381Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:36:47.758960 kubelet[2861]: E1212 17:36:47.758911 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:36:47.759012 kubelet[2861]: E1212 17:36:47.758975 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:36:47.759462 kubelet[2861]: E1212 17:36:47.759117 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-59pph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-9vgpb_calico-system(764a9b76-22f0-48f8-92a8-02d3a1323d4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:47.760268 kubelet[2861]: E1212 17:36:47.760232 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b"
Dec 12 17:36:49.420158 containerd[1621]: time="2025-12-12T17:36:49.420114221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 17:36:49.747074 containerd[1621]: time="2025-12-12T17:36:49.747030962Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:36:49.748720 containerd[1621]: time="2025-12-12T17:36:49.748676370Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 17:36:49.749012 containerd[1621]: time="2025-12-12T17:36:49.748740730Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:36:49.749122 kubelet[2861]: E1212 17:36:49.749003 2861 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:36:49.749122 kubelet[2861]: E1212 17:36:49.749049 2861 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:36:49.749631 kubelet[2861]: E1212 17:36:49.749519 2861 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-khvbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6d5697548f-vqbgd_calico-apiserver(f0f2e696-f5c9-4490-a447-8711a361f9d0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:36:49.751130 kubelet[2861]: E1212 17:36:49.751093 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"
Dec 12 17:36:52.422586 kubelet[2861]: E1212 17:36:52.422144 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e"
Dec 12 17:36:59.420097 kubelet[2861]: E1212 17:36:59.420047 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b"
Dec 12 17:36:59.420558 kubelet[2861]: E1212 17:36:59.420456 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d"
Dec 12 17:37:00.420512 kubelet[2861]: E1212 17:37:00.420454 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80"
Dec 12 17:37:00.420972 kubelet[2861]: E1212 17:37:00.420698 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b"
Dec 12 17:37:03.420343 kubelet[2861]: E1212 17:37:03.420240 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e"
Dec 12 17:37:05.420217 kubelet[2861]: E1212 17:37:05.420151 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"
Dec 12 17:37:12.266645 systemd[1]: cri-containerd-bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901.scope: Deactivated successfully.
Dec 12 17:37:12.267056 systemd[1]: cri-containerd-bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901.scope: Consumed 41.946s CPU time, 108.4M memory peak.
Dec 12 17:37:12.269053 containerd[1621]: time="2025-12-12T17:37:12.268993596Z" level=info msg="received container exit event container_id:\"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\" id:\"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\" pid:3186 exit_status:1 exited_at:{seconds:1765561032 nanos:268663955}"
Dec 12 17:37:12.275252 systemd[1]: cri-containerd-203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4.scope: Deactivated successfully.
Dec 12 17:37:12.275518 systemd[1]: cri-containerd-203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4.scope: Consumed 5.092s CPU time, 63.3M memory peak.
Dec 12 17:37:12.277041 containerd[1621]: time="2025-12-12T17:37:12.276939317Z" level=info msg="received container exit event container_id:\"203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4\" id:\"203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4\" pid:2707 exit_status:1 exited_at:{seconds:1765561032 nanos:276633795}"
Dec 12 17:37:12.292738 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901-rootfs.mount: Deactivated successfully.
Dec 12 17:37:12.304419 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4-rootfs.mount: Deactivated successfully.
Dec 12 17:37:12.420297 kubelet[2861]: E1212 17:37:12.420225 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b"
Dec 12 17:37:12.520476 kubelet[2861]: E1212 17:37:12.520266 2861 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.78:32798->10.0.8.20:2379: read: connection timed out"
Dec 12 17:37:12.524165 systemd[1]: cri-containerd-f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0.scope: Deactivated successfully.
Dec 12 17:37:12.524452 systemd[1]: cri-containerd-f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0.scope: Consumed 4.401s CPU time, 22.6M memory peak.
Dec 12 17:37:12.527090 containerd[1621]: time="2025-12-12T17:37:12.526942703Z" level=info msg="received container exit event container_id:\"f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0\" id:\"f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0\" pid:2716 exit_status:1 exited_at:{seconds:1765561032 nanos:526365140}"
Dec 12 17:37:12.550376 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0-rootfs.mount: Deactivated successfully.
Dec 12 17:37:12.985206 kubelet[2861]: I1212 17:37:12.985169 2861 scope.go:117] "RemoveContainer" containerID="bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901"
Dec 12 17:37:12.986899 containerd[1621]: time="2025-12-12T17:37:12.986861513Z" level=info msg="CreateContainer within sandbox \"e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 12 17:37:12.987650 kubelet[2861]: I1212 17:37:12.987630 2861 scope.go:117] "RemoveContainer" containerID="203b2898e75a178fdf010cabdf25f4fdf994b901fbd2c19357778d5247a7eff4"
Dec 12 17:37:12.989124 containerd[1621]: time="2025-12-12T17:37:12.989091444Z" level=info msg="CreateContainer within sandbox \"7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 12 17:37:12.990321 kubelet[2861]: I1212 17:37:12.990099 2861 scope.go:117] "RemoveContainer" containerID="f91df6b525c9b82b7ac3cadf1f527d5ed3f9dc8c78d41ae435e266c282bcf7c0"
Dec 12 17:37:12.991743 containerd[1621]: time="2025-12-12T17:37:12.991715218Z" level=info msg="CreateContainer within sandbox \"abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 12 17:37:13.000238 containerd[1621]: time="2025-12-12T17:37:13.000208461Z" level=info msg="Container c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:37:13.006359 containerd[1621]: time="2025-12-12T17:37:13.006328092Z" level=info msg="Container eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:37:13.010325 containerd[1621]: time="2025-12-12T17:37:13.010293992Z" level=info msg="Container e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:37:13.013569 containerd[1621]: time="2025-12-12T17:37:13.013534528Z" level=info msg="CreateContainer within sandbox \"e2b5b21d99b6adc24615784701877194a8b4c6ab522d6da4b47a8f5e4bf70641\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144\""
Dec 12 17:37:13.014050 containerd[1621]: time="2025-12-12T17:37:13.014013891Z" level=info msg="StartContainer for \"c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144\""
Dec 12 17:37:13.014832 containerd[1621]: time="2025-12-12T17:37:13.014806055Z" level=info msg="connecting to shim c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144" address="unix:///run/containerd/s/6a00247a1588c3021098e7167c16463db4f8da5e24074137242a7517d3460dba" protocol=ttrpc version=3
Dec 12 17:37:13.020101 containerd[1621]: time="2025-12-12T17:37:13.020062201Z" level=info msg="CreateContainer within sandbox \"7f0f9c9bdbe434b709f610be50c04a4f27af0127ae4f0dd7f5a83f7f588bb90c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb\""
Dec 12 17:37:13.021850 containerd[1621]: time="2025-12-12T17:37:13.021802890Z" level=info msg="StartContainer for \"eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb\""
Dec 12 17:37:13.025598 containerd[1621]: time="2025-12-12T17:37:13.025110667Z" level=info msg="CreateContainer within sandbox \"abded2355fbd0938da046c31b3d92c6a37bba332a34ce974e58acfbf7938118d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc\""
Dec 12 17:37:13.026737 containerd[1621]: time="2025-12-12T17:37:13.026589754Z" level=info msg="connecting to shim eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb" address="unix:///run/containerd/s/e2c53c410e026fd3f5cc3108996155643c442317bd06fd82bcc2b03e93271bf4" protocol=ttrpc version=3
Dec 12 17:37:13.028262 containerd[1621]: time="2025-12-12T17:37:13.028228123Z" level=info msg="StartContainer for \"e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc\""
Dec 12 17:37:13.029323 containerd[1621]: time="2025-12-12T17:37:13.029286168Z" level=info msg="connecting to shim e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc" address="unix:///run/containerd/s/c7dd64f998353b9179b47ec62c5cbb78f56d9b0b32b9613932a3126ee76c3baa" protocol=ttrpc version=3
Dec 12 17:37:13.032177 systemd[1]: Started cri-containerd-c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144.scope - libcontainer container c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144.
Dec 12 17:37:13.050121 systemd[1]: Started cri-containerd-e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc.scope - libcontainer container e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc.
Dec 12 17:37:13.053358 systemd[1]: Started cri-containerd-eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb.scope - libcontainer container eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb.
Dec 12 17:37:13.088545 containerd[1621]: time="2025-12-12T17:37:13.088483028Z" level=info msg="StartContainer for \"c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144\" returns successfully"
Dec 12 17:37:13.106317 containerd[1621]: time="2025-12-12T17:37:13.106277398Z" level=info msg="StartContainer for \"eb9981270cc61f90f051d638c4017f2b43be62edb2a13e552609a63dbb5872cb\" returns successfully"
Dec 12 17:37:13.110255 containerd[1621]: time="2025-12-12T17:37:13.110052297Z" level=info msg="StartContainer for \"e10b39eec8f4a36df1ed2334308a8bf6736ba1174a6a219ebbd31db97473c0dc\" returns successfully"
Dec 12 17:37:13.419734 kubelet[2861]: E1212 17:37:13.419585 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b"
Dec 12 17:37:14.421822 kubelet[2861]: E1212 17:37:14.421653 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d"
Dec 12 17:37:15.419822 kubelet[2861]: E1212 17:37:15.419770 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80"
Dec 12 17:37:15.848816 kubelet[2861]: I1212 17:37:15.848759 2861 status_manager.go:895] "Failed to get status for pod" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.78:60944->10.0.8.20:2379: read: connection timed out"
Dec 12 17:37:15.849306 kubelet[2861]: E1212 17:37:15.849068 2861 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.78:60840->10.0.8.20:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-6d5697548f-vqbgd.1880883949e53a0b calico-apiserver 1793 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-6d5697548f-vqbgd,UID:f0f2e696-f5c9-4490-a447-8711a361f9d0,APIVersion:v1,ResourceVersion:840,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4459-2-2-3-c846c80ac0,},FirstTimestamp:2025-12-12 17:33:53 +0000 UTC,LastTimestamp:2025-12-12 17:37:05.420095355 +0000 UTC m=+239.083650210,Count:12,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-3-c846c80ac0,}"
Dec 12 17:37:18.422746 kubelet[2861]: E1212 17:37:18.422633 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-78745dcfb4-7t4w7" podUID="c7e92205-dc44-4937-bdc7-2e9e83a2cc4e"
Dec 12 17:37:19.419696 kubelet[2861]: E1212 17:37:19.419643 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"
Dec 12 17:37:22.521118 kubelet[2861]: E1212 17:37:22.521065 2861 controller.go:195] "Failed to update lease" err="Put \"https://10.0.8.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-3-c846c80ac0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 12 17:37:24.301005 systemd[1]: cri-containerd-c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144.scope: Deactivated successfully.
Dec 12 17:37:24.302213 containerd[1621]: time="2025-12-12T17:37:24.302179734Z" level=info msg="received container exit event container_id:\"c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144\" id:\"c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144\" pid:5608 exit_status:1 exited_at:{seconds:1765561044 nanos:301757091}"
Dec 12 17:37:24.319396 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144-rootfs.mount: Deactivated successfully.
Dec 12 17:37:24.419980 kubelet[2861]: E1212 17:37:24.419917 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-q7brm" podUID="48d6a2b5-a80f-4de9-91aa-e8709a4fec3b"
Dec 12 17:37:25.023380 kubelet[2861]: I1212 17:37:25.023325 2861 scope.go:117] "RemoveContainer" containerID="bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901"
Dec 12 17:37:25.023851 kubelet[2861]: I1212 17:37:25.023811 2861 scope.go:117] "RemoveContainer" containerID="c4c25735386723f6a33821cae014b44cba19637325a0f4b40b0bde47e1cab144"
Dec 12 17:37:25.024157 kubelet[2861]: E1212 17:37:25.024121 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-54kkt_tigera-operator(829cd46b-689a-4cd7-b99f-90e6da1e4b08)\"" pod="tigera-operator/tigera-operator-7dcd859c48-54kkt" podUID="829cd46b-689a-4cd7-b99f-90e6da1e4b08"
Dec 12 17:37:25.025522 containerd[1621]: time="2025-12-12T17:37:25.025493848Z" level=info msg="RemoveContainer for \"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\""
Dec 12 17:37:25.030476 containerd[1621]: time="2025-12-12T17:37:25.030435713Z" level=info msg="RemoveContainer for \"bccfba69c3357fca773694d2c19f176a0c05a20d86a77c0ddfc9e6be8ea64901\" returns successfully"
Dec 12 17:37:26.990990 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec
Dec 12 17:37:27.420165 kubelet[2861]: E1212 17:37:27.420039 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-9vgpb" podUID="764a9b76-22f0-48f8-92a8-02d3a1323d4b"
Dec 12 17:37:28.420008 kubelet[2861]: E1212 17:37:28.419942 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-698c665495-s6zxl" podUID="ff7ea540-d740-4fa7-9163-b17e2194ee80"
Dec 12 17:37:29.420597 kubelet[2861]: E1212 17:37:29.420537 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-j96bs" podUID="d21fdd1b-5217-4220-a07c-5b154ce8fa0d"
Dec 12 17:37:31.419917 kubelet[2861]: E1212 17:37:31.419846 2861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6d5697548f-vqbgd" podUID="f0f2e696-f5c9-4490-a447-8711a361f9d0"