Dec 12 17:25:15.794363 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Dec 12 17:25:15.794385 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025 Dec 12 17:25:15.794412 kernel: KASLR enabled Dec 12 17:25:15.794420 kernel: efi: EFI v2.7 by EDK II Dec 12 17:25:15.794425 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218 Dec 12 17:25:15.794431 kernel: random: crng init done Dec 12 17:25:15.794437 kernel: secureboot: Secure boot disabled Dec 12 17:25:15.794443 kernel: ACPI: Early table checksum verification disabled Dec 12 17:25:15.794448 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS ) Dec 12 17:25:15.794454 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013) Dec 12 17:25:15.794462 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794467 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794473 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794479 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794486 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794492 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794499 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794505 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794512 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794518 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS 
BXPC 00000001 BXPC 00000001) Dec 12 17:25:15.794524 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013) Dec 12 17:25:15.794530 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Dec 12 17:25:15.794535 kernel: ACPI: Use ACPI SPCR as default console: Yes Dec 12 17:25:15.794542 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff] Dec 12 17:25:15.794548 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff] Dec 12 17:25:15.794553 kernel: Zone ranges: Dec 12 17:25:15.794561 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Dec 12 17:25:15.794567 kernel: DMA32 empty Dec 12 17:25:15.794573 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff] Dec 12 17:25:15.794578 kernel: Device empty Dec 12 17:25:15.794584 kernel: Movable zone start for each node Dec 12 17:25:15.794590 kernel: Early memory node ranges Dec 12 17:25:15.794596 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff] Dec 12 17:25:15.794602 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff] Dec 12 17:25:15.794608 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff] Dec 12 17:25:15.794614 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff] Dec 12 17:25:15.794620 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff] Dec 12 17:25:15.794626 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff] Dec 12 17:25:15.794634 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Dec 12 17:25:15.794640 kernel: psci: probing for conduit method from ACPI. Dec 12 17:25:15.794649 kernel: psci: PSCIv1.3 detected in firmware. 
Dec 12 17:25:15.794655 kernel: psci: Using standard PSCI v0.2 function IDs Dec 12 17:25:15.794661 kernel: psci: Trusted OS migration not required Dec 12 17:25:15.794669 kernel: psci: SMC Calling Convention v1.1 Dec 12 17:25:15.794675 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Dec 12 17:25:15.794682 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Dec 12 17:25:15.794688 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Dec 12 17:25:15.794694 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0 Dec 12 17:25:15.794700 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0 Dec 12 17:25:15.794707 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Dec 12 17:25:15.794713 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Dec 12 17:25:15.794720 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Dec 12 17:25:15.794726 kernel: Detected PIPT I-cache on CPU0 Dec 12 17:25:15.794732 kernel: CPU features: detected: GIC system register CPU interface Dec 12 17:25:15.794739 kernel: CPU features: detected: Spectre-v4 Dec 12 17:25:15.794746 kernel: CPU features: detected: Spectre-BHB Dec 12 17:25:15.794753 kernel: CPU features: kernel page table isolation forced ON by KASLR Dec 12 17:25:15.794759 kernel: CPU features: detected: Kernel page table isolation (KPTI) Dec 12 17:25:15.794765 kernel: CPU features: detected: ARM erratum 1418040 Dec 12 17:25:15.794771 kernel: CPU features: detected: SSBS not fully self-synchronizing Dec 12 17:25:15.794778 kernel: alternatives: applying boot alternatives Dec 12 17:25:15.794785 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52 Dec 12 
17:25:15.794792 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Dec 12 17:25:15.794798 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Dec 12 17:25:15.794804 kernel: Fallback order for Node 0: 0 Dec 12 17:25:15.794812 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304 Dec 12 17:25:15.794819 kernel: Policy zone: Normal Dec 12 17:25:15.794825 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 12 17:25:15.794831 kernel: software IO TLB: area num 4. Dec 12 17:25:15.794837 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Dec 12 17:25:15.794844 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Dec 12 17:25:15.794850 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 12 17:25:15.794857 kernel: rcu: RCU event tracing is enabled. Dec 12 17:25:15.794863 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Dec 12 17:25:15.794870 kernel: Trampoline variant of Tasks RCU enabled. Dec 12 17:25:15.794876 kernel: Tracing variant of Tasks RCU enabled. Dec 12 17:25:15.794882 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 12 17:25:15.794890 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Dec 12 17:25:15.794897 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Dec 12 17:25:15.794903 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Dec 12 17:25:15.794909 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Dec 12 17:25:15.794916 kernel: GICv3: 256 SPIs implemented Dec 12 17:25:15.794922 kernel: GICv3: 0 Extended SPIs implemented Dec 12 17:25:15.794933 kernel: Root IRQ handler: gic_handle_irq Dec 12 17:25:15.794939 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Dec 12 17:25:15.794945 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Dec 12 17:25:15.794952 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Dec 12 17:25:15.794958 kernel: ITS [mem 0x08080000-0x0809ffff] Dec 12 17:25:15.794965 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1) Dec 12 17:25:15.794973 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1) Dec 12 17:25:15.794980 kernel: GICv3: using LPI property table @0x0000000100130000 Dec 12 17:25:15.794987 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000 Dec 12 17:25:15.794993 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 12 17:25:15.795000 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 12 17:25:15.795006 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Dec 12 17:25:15.795013 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Dec 12 17:25:15.795019 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Dec 12 17:25:15.795026 kernel: arm-pv: using stolen time PV Dec 12 17:25:15.795033 kernel: Console: colour dummy device 80x25 Dec 12 17:25:15.795041 kernel: ACPI: Core revision 20240827 Dec 12 17:25:15.795048 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 
50.00 BogoMIPS (lpj=25000) Dec 12 17:25:15.795055 kernel: pid_max: default: 32768 minimum: 301 Dec 12 17:25:15.795062 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 12 17:25:15.795068 kernel: landlock: Up and running. Dec 12 17:25:15.795075 kernel: SELinux: Initializing. Dec 12 17:25:15.795082 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 12 17:25:15.795088 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 12 17:25:15.795095 kernel: rcu: Hierarchical SRCU implementation. Dec 12 17:25:15.795102 kernel: rcu: Max phase no-delay instances is 400. Dec 12 17:25:15.795110 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 12 17:25:15.795117 kernel: Remapping and enabling EFI services. Dec 12 17:25:15.795124 kernel: smp: Bringing up secondary CPUs ... Dec 12 17:25:15.795130 kernel: Detected PIPT I-cache on CPU1 Dec 12 17:25:15.795137 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Dec 12 17:25:15.795144 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000 Dec 12 17:25:15.795150 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 12 17:25:15.795157 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Dec 12 17:25:15.795163 kernel: Detected PIPT I-cache on CPU2 Dec 12 17:25:15.795176 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Dec 12 17:25:15.795183 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000 Dec 12 17:25:15.795189 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 12 17:25:15.795197 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Dec 12 17:25:15.795204 kernel: Detected PIPT I-cache on CPU3 Dec 12 17:25:15.795211 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Dec 12 17:25:15.795218 kernel: GICv3: CPU3: using allocated LPI pending table 
@0x0000000100170000 Dec 12 17:25:15.795225 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 12 17:25:15.795233 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Dec 12 17:25:15.795239 kernel: smp: Brought up 1 node, 4 CPUs Dec 12 17:25:15.795246 kernel: SMP: Total of 4 processors activated. Dec 12 17:25:15.795253 kernel: CPU: All CPU(s) started at EL1 Dec 12 17:25:15.795260 kernel: CPU features: detected: 32-bit EL0 Support Dec 12 17:25:15.795267 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Dec 12 17:25:15.795274 kernel: CPU features: detected: Common not Private translations Dec 12 17:25:15.795281 kernel: CPU features: detected: CRC32 instructions Dec 12 17:25:15.795288 kernel: CPU features: detected: Enhanced Virtualization Traps Dec 12 17:25:15.795296 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Dec 12 17:25:15.795303 kernel: CPU features: detected: LSE atomic instructions Dec 12 17:25:15.795309 kernel: CPU features: detected: Privileged Access Never Dec 12 17:25:15.795316 kernel: CPU features: detected: RAS Extension Support Dec 12 17:25:15.795323 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Dec 12 17:25:15.795330 kernel: alternatives: applying system-wide alternatives Dec 12 17:25:15.795337 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Dec 12 17:25:15.795345 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved) Dec 12 17:25:15.795352 kernel: devtmpfs: initialized Dec 12 17:25:15.795360 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 12 17:25:15.795367 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Dec 12 17:25:15.795374 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Dec 12 17:25:15.795381 kernel: 0 pages in 
range for non-PLT usage Dec 12 17:25:15.795387 kernel: 508400 pages in range for PLT usage Dec 12 17:25:15.795394 kernel: pinctrl core: initialized pinctrl subsystem Dec 12 17:25:15.795415 kernel: SMBIOS 3.0.0 present. Dec 12 17:25:15.795423 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015 Dec 12 17:25:15.795429 kernel: DMI: Memory slots populated: 1/1 Dec 12 17:25:15.795438 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 12 17:25:15.795445 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations Dec 12 17:25:15.795452 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Dec 12 17:25:15.795459 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Dec 12 17:25:15.795466 kernel: audit: initializing netlink subsys (disabled) Dec 12 17:25:15.795473 kernel: audit: type=2000 audit(0.041:1): state=initialized audit_enabled=0 res=1 Dec 12 17:25:15.795479 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 12 17:25:15.795486 kernel: cpuidle: using governor menu Dec 12 17:25:15.795493 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Dec 12 17:25:15.795501 kernel: ASID allocator initialised with 32768 entries Dec 12 17:25:15.795508 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 12 17:25:15.795515 kernel: Serial: AMBA PL011 UART driver Dec 12 17:25:15.795522 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 12 17:25:15.795529 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Dec 12 17:25:15.795535 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Dec 12 17:25:15.795542 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Dec 12 17:25:15.795549 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 12 17:25:15.795556 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Dec 12 17:25:15.795564 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Dec 12 17:25:15.795571 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Dec 12 17:25:15.795579 kernel: ACPI: Added _OSI(Module Device) Dec 12 17:25:15.795588 kernel: ACPI: Added _OSI(Processor Device) Dec 12 17:25:15.795595 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 12 17:25:15.795603 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 12 17:25:15.795610 kernel: ACPI: Interpreter enabled Dec 12 17:25:15.795618 kernel: ACPI: Using GIC for interrupt routing Dec 12 17:25:15.795625 kernel: ACPI: MCFG table detected, 1 entries Dec 12 17:25:15.795634 kernel: ACPI: CPU0 has been hot-added Dec 12 17:25:15.795641 kernel: ACPI: CPU1 has been hot-added Dec 12 17:25:15.795647 kernel: ACPI: CPU2 has been hot-added Dec 12 17:25:15.795654 kernel: ACPI: CPU3 has been hot-added Dec 12 17:25:15.795661 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Dec 12 17:25:15.795668 kernel: printk: legacy console [ttyAMA0] enabled Dec 12 17:25:15.795675 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 12 17:25:15.795812 kernel: acpi PNP0A08:00: _OSC: 
OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 12 17:25:15.795881 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Dec 12 17:25:15.795942 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Dec 12 17:25:15.795999 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Dec 12 17:25:15.796055 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Dec 12 17:25:15.796065 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Dec 12 17:25:15.796072 kernel: PCI host bridge to bus 0000:00 Dec 12 17:25:15.796137 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Dec 12 17:25:15.796194 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Dec 12 17:25:15.796246 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Dec 12 17:25:15.796297 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 12 17:25:15.796380 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Dec 12 17:25:15.796561 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.796629 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff] Dec 12 17:25:15.796690 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Dec 12 17:25:15.796775 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff] Dec 12 17:25:15.796839 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Dec 12 17:25:15.796908 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.796968 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff] Dec 12 17:25:15.797026 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Dec 12 17:25:15.797084 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff] Dec 12 17:25:15.797149 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 
0x060400 PCIe Root Port Dec 12 17:25:15.797211 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff] Dec 12 17:25:15.797270 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Dec 12 17:25:15.797328 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff] Dec 12 17:25:15.797385 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Dec 12 17:25:15.797473 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.797536 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff] Dec 12 17:25:15.797598 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Dec 12 17:25:15.797656 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Dec 12 17:25:15.797722 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.797781 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff] Dec 12 17:25:15.797839 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Dec 12 17:25:15.797897 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff] Dec 12 17:25:15.797955 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Dec 12 17:25:15.798023 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.798084 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff] Dec 12 17:25:15.798142 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Dec 12 17:25:15.798200 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff] Dec 12 17:25:15.798257 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Dec 12 17:25:15.798322 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.798382 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff] Dec 12 17:25:15.798454 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Dec 12 17:25:15.798523 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 
17:25:15.798582 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff] Dec 12 17:25:15.798640 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Dec 12 17:25:15.798713 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.798773 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff] Dec 12 17:25:15.798831 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Dec 12 17:25:15.798896 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.798958 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff] Dec 12 17:25:15.799016 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Dec 12 17:25:15.799080 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.799139 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff] Dec 12 17:25:15.799199 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Dec 12 17:25:15.799266 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.799325 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff] Dec 12 17:25:15.799383 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Dec 12 17:25:15.799465 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.799527 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff] Dec 12 17:25:15.799585 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Dec 12 17:25:15.799651 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.799713 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff] Dec 12 17:25:15.799772 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Dec 12 17:25:15.799839 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.799898 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff] Dec 12 17:25:15.799956 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Dec 12 17:25:15.800021 kernel: pci 0000:00:02.7: [1b36:000c] type 01 
class 0x060400 PCIe Root Port Dec 12 17:25:15.800079 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff] Dec 12 17:25:15.800139 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Dec 12 17:25:15.800206 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.800265 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff] Dec 12 17:25:15.800323 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Dec 12 17:25:15.800393 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.800469 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff] Dec 12 17:25:15.800530 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Dec 12 17:25:15.800588 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff] Dec 12 17:25:15.800647 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff] Dec 12 17:25:15.800712 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.800789 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff] Dec 12 17:25:15.800851 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 12 17:25:15.800909 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff] Dec 12 17:25:15.800966 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff] Dec 12 17:25:15.801035 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.801094 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff] Dec 12 17:25:15.801152 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 12 17:25:15.801209 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff] Dec 12 17:25:15.801270 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff] Dec 12 17:25:15.801336 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.801411 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff] Dec 12 17:25:15.801485 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 12 17:25:15.801559 
kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff] Dec 12 17:25:15.801624 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff] Dec 12 17:25:15.801700 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.801764 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff] Dec 12 17:25:15.801825 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 12 17:25:15.801915 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff] Dec 12 17:25:15.801980 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff] Dec 12 17:25:15.802045 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.802104 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff] Dec 12 17:25:15.802162 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 12 17:25:15.802221 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff] Dec 12 17:25:15.802278 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff] Dec 12 17:25:15.802343 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.802419 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff] Dec 12 17:25:15.802484 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 12 17:25:15.802543 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff] Dec 12 17:25:15.802601 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff] Dec 12 17:25:15.802669 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.802729 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff] Dec 12 17:25:15.802788 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 12 17:25:15.802845 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff] Dec 12 17:25:15.802907 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff] Dec 12 17:25:15.802972 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.803031 kernel: pci 
0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff] Dec 12 17:25:15.803089 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 12 17:25:15.803147 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff] Dec 12 17:25:15.803205 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff] Dec 12 17:25:15.803275 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.803344 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff] Dec 12 17:25:15.803414 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 12 17:25:15.803475 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff] Dec 12 17:25:15.803535 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff] Dec 12 17:25:15.803600 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.803659 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff] Dec 12 17:25:15.803718 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 12 17:25:15.803778 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff] Dec 12 17:25:15.803835 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff] Dec 12 17:25:15.803899 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.803958 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff] Dec 12 17:25:15.804017 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 12 17:25:15.804076 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff] Dec 12 17:25:15.804134 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff] Dec 12 17:25:15.804199 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.804259 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff] Dec 12 17:25:15.804318 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 12 17:25:15.804382 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff] Dec 12 17:25:15.804474 kernel: pci 0000:00:04.5: bridge window [mem 
0x10600000-0x107fffff] Dec 12 17:25:15.804553 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.804615 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff] Dec 12 17:25:15.804676 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 12 17:25:15.804762 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff] Dec 12 17:25:15.804838 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff] Dec 12 17:25:15.804911 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.804976 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff] Dec 12 17:25:15.805035 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 12 17:25:15.805093 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff] Dec 12 17:25:15.805151 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff] Dec 12 17:25:15.805216 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 12 17:25:15.805276 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff] Dec 12 17:25:15.805342 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 12 17:25:15.805421 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff] Dec 12 17:25:15.805490 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff] Dec 12 17:25:15.805563 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 12 17:25:15.805632 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff] Dec 12 17:25:15.805696 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Dec 12 17:25:15.805757 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 12 17:25:15.805832 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Dec 12 17:25:15.805894 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit] Dec 12 17:25:15.805967 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Dec 12 17:25:15.806030 kernel: pci 
0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 12 17:25:15.806092 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 12 17:25:15.806163 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:25:15.806229 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 12 17:25:15.806312 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:25:15.806386 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 12 17:25:15.806468 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 12 17:25:15.806539 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 12 17:25:15.806600 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 12 17:25:15.806661 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 12 17:25:15.806724 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 12 17:25:15.806787 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:25:15.806846 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:25:15.806910 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 12 17:25:15.806969 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 12 17:25:15.807031 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 12 17:25:15.807098 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 17:25:15.807159 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:25:15.807218 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:25:15.807297 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 17:25:15.807360 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 12 17:25:15.807433 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 12 17:25:15.807499 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 17:25:15.807563 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:25:15.807622 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:25:15.807685 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 17:25:15.807744 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:25:15.807802 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:25:15.807867 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 17:25:15.807925 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 12 17:25:15.807985 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 12 17:25:15.808049 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 17:25:15.808107 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:25:15.808165 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:25:15.808228 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 17:25:15.808293 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:25:15.808353 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:25:15.808428 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 12 17:25:15.808490 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 12 17:25:15.808550 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 12 17:25:15.808618 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 12 17:25:15.808677 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 12 17:25:15.808754 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 12 17:25:15.808829 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 12 17:25:15.808891 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 12 17:25:15.808950 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 12 17:25:15.809014 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 12 17:25:15.809074 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 12 17:25:15.809132 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 12 17:25:15.809195 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 12 17:25:15.809256 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000
Dec 12 17:25:15.809315 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000
Dec 12 17:25:15.809379 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 12 17:25:15.809454 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000
Dec 12 17:25:15.809517 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000
Dec 12 17:25:15.809582 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Dec 12 17:25:15.809642 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000
Dec 12 17:25:15.809704 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000
Dec 12 17:25:15.809766 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Dec 12 17:25:15.809832 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000
Dec 12 17:25:15.809892 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000
Dec 12 17:25:15.809957 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Dec 12 17:25:15.810021 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000
Dec 12 17:25:15.810083 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000
Dec 12 17:25:15.810147 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Dec 12 17:25:15.810206 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000
Dec 12 17:25:15.810263 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000
Dec 12 17:25:15.810325 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Dec 12 17:25:15.810385 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000
Dec 12 17:25:15.810486 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000
Dec 12 17:25:15.810554 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Dec 12 17:25:15.810616 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000
Dec 12 17:25:15.810675 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000
Dec 12 17:25:15.810738 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Dec 12 17:25:15.810798 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000
Dec 12 17:25:15.810857 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000
Dec 12 17:25:15.810919 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Dec 12 17:25:15.810981 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000
Dec 12 17:25:15.811039 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000
Dec 12 17:25:15.811103 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Dec 12 17:25:15.811161 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000
Dec 12 17:25:15.811219 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000
Dec 12 17:25:15.811281 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Dec 12 17:25:15.811342 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000
Dec 12 17:25:15.811411 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000
Dec 12 17:25:15.811477 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Dec 12 17:25:15.811536 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000
Dec 12 17:25:15.811595 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000
Dec 12 17:25:15.811657 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Dec 12 17:25:15.811717 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000
Dec 12 17:25:15.811777 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000
Dec 12 17:25:15.811841 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Dec 12 17:25:15.811901 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000
Dec 12 17:25:15.811960 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000
Dec 12 17:25:15.812024 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Dec 12 17:25:15.812086 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000
Dec 12 17:25:15.812143 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000
Dec 12 17:25:15.812207 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Dec 12 17:25:15.812266 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000
Dec 12 17:25:15.812323 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000
Dec 12 17:25:15.812384 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Dec 12 17:25:15.812463 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000
Dec 12 17:25:15.812523 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000
Dec 12 17:25:15.812585 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Dec 12 17:25:15.812650 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000
Dec 12 17:25:15.812708 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000
Dec 12 17:25:15.812789 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Dec 12 17:25:15.812851 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000
Dec 12 17:25:15.812910 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000
Dec 12 17:25:15.812971 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Dec 12 17:25:15.813034 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Dec 12 17:25:15.813097 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Dec 12 17:25:15.813155 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Dec 12 17:25:15.813216 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Dec 12 17:25:15.813275 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Dec 12 17:25:15.813337 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Dec 12 17:25:15.813408 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Dec 12 17:25:15.813478 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Dec 12 17:25:15.813543 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Dec 12 17:25:15.813605 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Dec 12 17:25:15.813663 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Dec 12 17:25:15.813725 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Dec 12 17:25:15.813783 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Dec 12 17:25:15.813844 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Dec 12 17:25:15.813903 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Dec 12 17:25:15.813963 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Dec 12 17:25:15.814025 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Dec 12 17:25:15.814087 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned
Dec 12 17:25:15.814146 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned
Dec 12 17:25:15.814206 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned
Dec 12 17:25:15.814264 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned
Dec 12 17:25:15.814324 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned
Dec 12 17:25:15.814382 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned
Dec 12 17:25:15.814457 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned
Dec 12 17:25:15.814520 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned
Dec 12 17:25:15.814581 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned
Dec 12 17:25:15.814640 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned
Dec 12 17:25:15.814701 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned
Dec 12 17:25:15.814760 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned
Dec 12 17:25:15.814821 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned
Dec 12 17:25:15.814880 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned
Dec 12 17:25:15.814942 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned
Dec 12 17:25:15.815001 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned
Dec 12 17:25:15.815061 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned
Dec 12 17:25:15.815120 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned
Dec 12 17:25:15.815181 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned
Dec 12 17:25:15.815241 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned
Dec 12 17:25:15.815303 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned
Dec 12 17:25:15.815362 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned
Dec 12 17:25:15.815451 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned
Dec 12 17:25:15.815513 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned
Dec 12 17:25:15.815573 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned
Dec 12 17:25:15.815632 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned
Dec 12 17:25:15.815692 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned
Dec 12 17:25:15.815753 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned
Dec 12 17:25:15.815813 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned
Dec 12 17:25:15.815871 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned
Dec 12 17:25:15.815932 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned
Dec 12 17:25:15.815991 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned
Dec 12 17:25:15.816051 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned
Dec 12 17:25:15.816109 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned
Dec 12 17:25:15.816169 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned
Dec 12 17:25:15.816228 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned
Dec 12 17:25:15.816288 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned
Dec 12 17:25:15.816346 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned
Dec 12 17:25:15.816426 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned
Dec 12 17:25:15.816489 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned
Dec 12 17:25:15.816551 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned
Dec 12 17:25:15.816610 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned
Dec 12 17:25:15.816670 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned
Dec 12 17:25:15.816729 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned
Dec 12 17:25:15.816809 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned
Dec 12 17:25:15.816871 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned
Dec 12 17:25:15.816935 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned
Dec 12 17:25:15.816993 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned
Dec 12 17:25:15.817054 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned
Dec 12 17:25:15.817112 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 17:25:15.817171 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned
Dec 12 17:25:15.817229 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 17:25:15.817289 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned
Dec 12 17:25:15.817350 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 17:25:15.817425 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned
Dec 12 17:25:15.817488 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 17:25:15.817548 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned
Dec 12 17:25:15.817607 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 17:25:15.817667 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned
Dec 12 17:25:15.817725 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 17:25:15.817785 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned
Dec 12 17:25:15.817847 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 17:25:15.817907 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned
Dec 12 17:25:15.817965 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 17:25:15.818025 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned
Dec 12 17:25:15.818084 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 17:25:15.818144 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned
Dec 12 17:25:15.818203 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned
Dec 12 17:25:15.818263 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned
Dec 12 17:25:15.818323 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned
Dec 12 17:25:15.818383 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned
Dec 12 17:25:15.818454 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned
Dec 12 17:25:15.818516 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned
Dec 12 17:25:15.818574 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned
Dec 12 17:25:15.818633 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned
Dec 12 17:25:15.818692 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned
Dec 12 17:25:15.818752 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned
Dec 12 17:25:15.818813 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned
Dec 12 17:25:15.818873 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned
Dec 12 17:25:15.818932 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.818990 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.819050 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned
Dec 12 17:25:15.819110 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.819168 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.819228 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned
Dec 12 17:25:15.819287 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.819345 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.819419 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned
Dec 12 17:25:15.819480 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.819539 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.819602 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned
Dec 12 17:25:15.819662 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.819722 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.819783 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned
Dec 12 17:25:15.819842 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.819901 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.819961 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned
Dec 12 17:25:15.820020 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.820080 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.820140 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned
Dec 12 17:25:15.820199 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.820257 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.820317 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned
Dec 12 17:25:15.820375 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.820446 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.820507 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned
Dec 12 17:25:15.820569 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.820628 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.820689 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned
Dec 12 17:25:15.820766 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.820829 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.820890 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned
Dec 12 17:25:15.820949 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.821008 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.821072 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned
Dec 12 17:25:15.821130 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.821189 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.821248 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned
Dec 12 17:25:15.821307 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.821366 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.821443 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned
Dec 12 17:25:15.821506 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.821568 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.821630 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned
Dec 12 17:25:15.821689 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.821747 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.821808 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned
Dec 12 17:25:15.821867 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.821925 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.821985 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned
Dec 12 17:25:15.822047 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.822105 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.822164 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 17:25:15.822225 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 17:25:15.822283 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 17:25:15.822344 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 17:25:15.822418 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 17:25:15.822483 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 17:25:15.822546 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 17:25:15.822609 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 17:25:15.822673 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 17:25:15.822736 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned
Dec 12 17:25:15.822796 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned
Dec 12 17:25:15.822856 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned
Dec 12 17:25:15.822920 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned
Dec 12 17:25:15.822983 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned
Dec 12 17:25:15.823045 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned
Dec 12 17:25:15.823106 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823168 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823228 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823288 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823349 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823424 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823488 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823549 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823610 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823674 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823735 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823794 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823855 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.823916 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.823977 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824038 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824099 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824178 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824238 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824299 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824359 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824437 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824500 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824560 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824621 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824683 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824757 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824820 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.824882 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.824941 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.825001 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.825063 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.825124 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.825190 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.825252 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 17:25:15.825313 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign
Dec 12 17:25:15.825384 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Dec 12 17:25:15.825462 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Dec 12 17:25:15.825526 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Dec 12 17:25:15.825587 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 12 17:25:15.825647 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:25:15.825709 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 12 17:25:15.825775 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Dec 12 17:25:15.825834 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 12 17:25:15.825909 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:25:15.825969 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 12 17:25:15.826036 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Dec 12 17:25:15.826099 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Dec 12 17:25:15.826160 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 12 17:25:15.826219 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:25:15.826281 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 12 17:25:15.826349 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Dec 12 17:25:15.826437 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 12 17:25:15.826499 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:25:15.826559 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 12 17:25:15.826626 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Dec 12 17:25:15.826690 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Dec 12 17:25:15.826750 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 12 17:25:15.826809 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:25:15.826879 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 12 17:25:15.826947 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Dec 12 17:25:15.827011 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Dec 12 17:25:15.827074 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 12 17:25:15.827135 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:25:15.827208 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 12 17:25:15.827269 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 12 17:25:15.827329 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:25:15.827391 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 12 17:25:15.827462 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 12 17:25:15.827522 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:25:15.827580 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 12 17:25:15.827641 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 12 17:25:15.827700 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Dec 12 17:25:15.827761 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Dec 12 17:25:15.827822 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 12 17:25:15.827881 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]
Dec 12 17:25:15.827940 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]
Dec 12 17:25:15.828000 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 12 17:25:15.828058 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]
Dec 12 17:25:15.828123 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]
Dec 12 17:25:15.828188 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 12 17:25:15.828249 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]
Dec 12 17:25:15.828310 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]
Dec 12 17:25:15.828370 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 12 17:25:15.828447 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]
Dec 12 17:25:15.828510 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]
Dec 12 17:25:15.828586 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 12 17:25:15.828647 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]
Dec 12 17:25:15.828708 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]
Dec 12 17:25:15.828783 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 12 17:25:15.828846 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]
Dec 12 17:25:15.828918 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]
Dec 12 17:25:15.828989 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 12 17:25:15.829050 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]
Dec 12 17:25:15.829108 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]
Dec 12 17:25:15.829168 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 12 17:25:15.829230 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]
Dec 12 17:25:15.829288 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]
Dec 12 17:25:15.829353 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 12 17:25:15.829433 kernel: pci 0000:00:03.1: bridge window [mem
0x12200000-0x123fffff] Dec 12 17:25:15.829495 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Dec 12 17:25:15.829560 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 12 17:25:15.829620 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Dec 12 17:25:15.829680 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Dec 12 17:25:15.829751 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Dec 12 17:25:15.829814 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 12 17:25:15.829873 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Dec 12 17:25:15.829936 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Dec 12 17:25:15.829996 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Dec 12 17:25:15.830058 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 12 17:25:15.830119 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Dec 12 17:25:15.830181 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Dec 12 17:25:15.830242 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Dec 12 17:25:15.830305 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 12 17:25:15.830365 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Dec 12 17:25:15.830473 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Dec 12 17:25:15.830539 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 12 17:25:15.830606 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 12 17:25:15.830670 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Dec 12 17:25:15.830732 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Dec 12 17:25:15.830796 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 12 17:25:15.830862 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 12 17:25:15.830939 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Dec 12 17:25:15.831003 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Dec 12 17:25:15.831080 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 12 17:25:15.831146 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 12 17:25:15.831211 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Dec 12 17:25:15.831269 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Dec 12 17:25:15.831337 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Dec 12 17:25:15.831416 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 12 17:25:15.831480 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Dec 12 17:25:15.831539 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Dec 12 17:25:15.831600 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Dec 12 17:25:15.831663 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 12 17:25:15.831723 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Dec 12 17:25:15.831783 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Dec 12 17:25:15.831842 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Dec 12 17:25:15.831906 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 12 17:25:15.831973 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Dec 12 17:25:15.832033 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Dec 12 17:25:15.832094 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Dec 12 17:25:15.832157 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 12 17:25:15.832216 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Dec 12 17:25:15.832275 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Dec 12 17:25:15.832337 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Dec 12 17:25:15.832408 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 12 17:25:15.832471 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Dec 12 17:25:15.832532 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Dec 12 17:25:15.832595 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 12 17:25:15.832658 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 12 17:25:15.832718 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Dec 12 17:25:15.832800 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Dec 12 17:25:15.832863 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 12 17:25:15.832926 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 12 17:25:15.832986 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Dec 12 17:25:15.833045 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Dec 12 17:25:15.833105 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 12 17:25:15.833173 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 12 17:25:15.833233 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Dec 12 17:25:15.833294 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Dec 12 17:25:15.833354 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Dec 12 17:25:15.833515 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 12 17:25:15.833580 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 12 17:25:15.833635 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 12 17:25:15.833706 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 12 17:25:15.833764 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 12 17:25:15.833830 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 12 17:25:15.833888 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 12 17:25:15.833953 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 12 17:25:15.834012 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 12 17:25:15.834082 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 12 17:25:15.834145 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 12 17:25:15.834219 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 12 17:25:15.834292 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 12 17:25:15.834355 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 12 17:25:15.834432 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 12 17:25:15.834500 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 12 17:25:15.834555 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 12 17:25:15.834617 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 12 17:25:15.834672 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 12 17:25:15.834734 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 12 17:25:15.834792 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:25:15.834857 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Dec 12 17:25:15.834932 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Dec 12 17:25:15.835003 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Dec 12 17:25:15.835059 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Dec 12 17:25:15.835125 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Dec 12 17:25:15.835182 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Dec 12 17:25:15.835245 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Dec 12 17:25:15.835304 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Dec 12 17:25:15.835372 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Dec 12 17:25:15.835469 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 12 17:25:15.835536 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Dec 12 17:25:15.835592 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 12 17:25:15.835657 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Dec 12 17:25:15.835718 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 12 17:25:15.835783 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Dec 12 17:25:15.835840 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Dec 12 17:25:15.835907 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Dec 12 17:25:15.835962 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Dec 12 17:25:15.836027 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Dec 12 17:25:15.836085 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Dec 12 17:25:15.836140 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Dec 12 17:25:15.836202 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Dec 12 17:25:15.836258 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Dec 12 17:25:15.836313 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Dec 12 17:25:15.836376 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Dec 12 17:25:15.836450 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Dec 12 17:25:15.836508 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Dec 12 17:25:15.836578 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Dec 12 17:25:15.836634 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Dec 12 17:25:15.836689 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 12 17:25:15.836774 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Dec 12 17:25:15.836834 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Dec 12 17:25:15.836895 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 12 17:25:15.836961 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Dec 12 17:25:15.837017 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Dec 12 17:25:15.837071 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 12 17:25:15.837136 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Dec 12 17:25:15.837193 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Dec 12 17:25:15.837249 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Dec 12 17:25:15.837312 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Dec 12 17:25:15.837367 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Dec 12 17:25:15.837442 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Dec 12 17:25:15.837510 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 12 17:25:15.837568 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Dec 12 17:25:15.837624 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Dec 12 17:25:15.837691 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Dec 12 17:25:15.837748 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Dec 12 17:25:15.837804 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Dec 12 17:25:15.837868 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Dec 12 17:25:15.837924 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Dec 12 17:25:15.837978 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Dec 12 17:25:15.838042 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Dec 12 17:25:15.838101 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Dec 12 17:25:15.838155 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 12 17:25:15.838221 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Dec 12 17:25:15.838279 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Dec 12 17:25:15.838335 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 12 17:25:15.838410 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Dec 12 17:25:15.838470 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Dec 12 17:25:15.838528 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 12 17:25:15.838591 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Dec 12 17:25:15.838648 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Dec 12 17:25:15.838706 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Dec 12 17:25:15.838716 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 12 17:25:15.838723 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 12 17:25:15.838731 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 12 17:25:15.838740 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 12 17:25:15.838747 kernel: iommu: Default domain type: Translated Dec 12 17:25:15.838755 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 12 17:25:15.838762 kernel: efivars: Registered efivars operations Dec 12 17:25:15.838769 kernel: vgaarb: loaded Dec 12 17:25:15.838777 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 12 17:25:15.838784 kernel: VFS: Disk quotas dquot_6.6.0 Dec 12 17:25:15.838792 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 12 17:25:15.838799 kernel: pnp: PnP ACPI 
init Dec 12 17:25:15.838870 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 12 17:25:15.838880 kernel: pnp: PnP ACPI: found 1 devices Dec 12 17:25:15.838888 kernel: NET: Registered PF_INET protocol family Dec 12 17:25:15.838895 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 12 17:25:15.838902 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 12 17:25:15.838910 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 12 17:25:15.838917 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 12 17:25:15.838925 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 12 17:25:15.838934 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 12 17:25:15.838942 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 12 17:25:15.838949 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 12 17:25:15.838957 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 12 17:25:15.839025 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 12 17:25:15.839037 kernel: PCI: CLS 0 bytes, default 64 Dec 12 17:25:15.839045 kernel: kvm [1]: HYP mode not available Dec 12 17:25:15.839052 kernel: Initialise system trusted keyrings Dec 12 17:25:15.839059 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 12 17:25:15.839069 kernel: Key type asymmetric registered Dec 12 17:25:15.839076 kernel: Asymmetric key parser 'x509' registered Dec 12 17:25:15.839083 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 12 17:25:15.839091 kernel: io scheduler mq-deadline registered Dec 12 17:25:15.839098 kernel: io scheduler kyber registered Dec 12 17:25:15.839105 kernel: io scheduler bfq registered Dec 12 17:25:15.839113 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 12 
17:25:15.839175 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Dec 12 17:25:15.839235 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Dec 12 17:25:15.839297 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.839359 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Dec 12 17:25:15.839438 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Dec 12 17:25:15.839499 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.839562 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Dec 12 17:25:15.839623 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Dec 12 17:25:15.839682 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.839746 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Dec 12 17:25:15.839809 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Dec 12 17:25:15.839869 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.839931 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Dec 12 17:25:15.839992 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Dec 12 17:25:15.840051 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.840114 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Dec 12 17:25:15.840174 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Dec 12 17:25:15.840233 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.840298 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Dec 12 17:25:15.840359 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Dec 12 17:25:15.840429 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.840494 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Dec 12 17:25:15.840553 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Dec 12 17:25:15.840612 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.840622 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 12 17:25:15.840682 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Dec 12 17:25:15.840771 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Dec 12 17:25:15.840837 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.840901 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Dec 12 17:25:15.840961 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Dec 12 17:25:15.841021 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.841086 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Dec 12 17:25:15.841145 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Dec 12 17:25:15.841207 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.841270 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Dec 12 17:25:15.841330 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Dec 12 17:25:15.841388 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.841475 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Dec 12 17:25:15.841537 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Dec 12 17:25:15.841596 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.841662 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Dec 12 17:25:15.841726 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Dec 12 17:25:15.841786 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.841848 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Dec 12 17:25:15.841909 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Dec 12 17:25:15.841968 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.842030 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Dec 12 17:25:15.842090 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Dec 12 17:25:15.842148 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.842161 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 12 17:25:15.842221 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Dec 12 17:25:15.842281 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Dec 12 17:25:15.842340 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.842421 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Dec 12 17:25:15.842483 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Dec 12 17:25:15.842542 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.842606 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Dec 12 17:25:15.842670 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Dec 12 17:25:15.842729 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.842792 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Dec 12 17:25:15.842852 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Dec 12 17:25:15.842911 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.842973 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Dec 12 17:25:15.843032 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Dec 12 17:25:15.843093 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.843155 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Dec 12 17:25:15.843215 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Dec 12 17:25:15.843275 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.843338 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Dec 12 17:25:15.843406 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Dec 12 17:25:15.843470 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.843536 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Dec 12 17:25:15.843600 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Dec 12 17:25:15.843660 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.843669 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 12 17:25:15.843731 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Dec 12 17:25:15.843790 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Dec 12 17:25:15.843850 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.843913 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Dec 12 17:25:15.843974 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Dec 12 17:25:15.844035 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.844099 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Dec 12 17:25:15.844159 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Dec 12 17:25:15.844219 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.844282 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Dec 12 17:25:15.844342 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Dec 12 17:25:15.844409 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.844474 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Dec 12 17:25:15.844536 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Dec 12 17:25:15.844595 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.844658 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Dec 12 17:25:15.844718 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Dec 12 17:25:15.844793 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.844862 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Dec 12 17:25:15.844925 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Dec 12 17:25:15.844989 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.845054 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Dec 12 17:25:15.845115 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Dec 12 17:25:15.845191 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.845253 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Dec 12 17:25:15.845316 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Dec 12 17:25:15.845376 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:25:15.845386 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 12 17:25:15.845413 kernel: ACPI: button: Power Button [PWRB] Dec 12 17:25:15.845490 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Dec 12 17:25:15.845560 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 12 17:25:15.845570 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 12 17:25:15.845578 kernel: thunder_xcv, ver 1.0 Dec 12 17:25:15.845585 kernel: thunder_bgx, ver 1.0 Dec 12 17:25:15.845592 kernel: nicpf, ver 1.0 Dec 12 17:25:15.845600 kernel: nicvf, ver 1.0 Dec 12 17:25:15.845675 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 12 17:25:15.845736 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:25:15 UTC (1765560315) Dec 12 17:25:15.845746 kernel: hid: raw HID events driver (C) Jiri 
Kosina Dec 12 17:25:15.845754 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 12 17:25:15.845762 kernel: watchdog: NMI not fully supported Dec 12 17:25:15.845769 kernel: watchdog: Hard watchdog permanently disabled Dec 12 17:25:15.845776 kernel: NET: Registered PF_INET6 protocol family Dec 12 17:25:15.845784 kernel: Segment Routing with IPv6 Dec 12 17:25:15.845791 kernel: In-situ OAM (IOAM) with IPv6 Dec 12 17:25:15.845800 kernel: NET: Registered PF_PACKET protocol family Dec 12 17:25:15.845807 kernel: Key type dns_resolver registered Dec 12 17:25:15.845814 kernel: registered taskstats version 1 Dec 12 17:25:15.845822 kernel: Loading compiled-in X.509 certificates Dec 12 17:25:15.845829 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a' Dec 12 17:25:15.845836 kernel: Demotion targets for Node 0: null Dec 12 17:25:15.845844 kernel: Key type .fscrypt registered Dec 12 17:25:15.845851 kernel: Key type fscrypt-provisioning registered Dec 12 17:25:15.845858 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 12 17:25:15.845866 kernel: ima: Allocated hash algorithm: sha1 Dec 12 17:25:15.845874 kernel: ima: No architecture policies found Dec 12 17:25:15.845881 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 12 17:25:15.845889 kernel: clk: Disabling unused clocks Dec 12 17:25:15.845896 kernel: PM: genpd: Disabling unused power domains Dec 12 17:25:15.845903 kernel: Warning: unable to open an initial console. Dec 12 17:25:15.845911 kernel: Freeing unused kernel memory: 39552K Dec 12 17:25:15.845918 kernel: Run /init as init process Dec 12 17:25:15.845925 kernel: with arguments: Dec 12 17:25:15.845934 kernel: /init Dec 12 17:25:15.845941 kernel: with environment: Dec 12 17:25:15.845948 kernel: HOME=/ Dec 12 17:25:15.845956 kernel: TERM=linux Dec 12 17:25:15.845964 systemd[1]: Successfully made /usr/ read-only. 
Dec 12 17:25:15.845975 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:25:15.845983 systemd[1]: Detected virtualization kvm.
Dec 12 17:25:15.845991 systemd[1]: Detected architecture arm64.
Dec 12 17:25:15.846000 systemd[1]: Running in initrd.
Dec 12 17:25:15.846007 systemd[1]: No hostname configured, using default hostname.
Dec 12 17:25:15.846015 systemd[1]: Hostname set to .
Dec 12 17:25:15.846023 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 17:25:15.846030 systemd[1]: Queued start job for default target initrd.target.
Dec 12 17:25:15.846038 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:25:15.846054 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:25:15.846065 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 12 17:25:15.846073 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:25:15.846081 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 12 17:25:15.846091 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 12 17:25:15.846101 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 12 17:25:15.846109 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 12 17:25:15.846117 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:25:15.846125 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:25:15.846133 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:25:15.846141 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:25:15.846150 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:25:15.846158 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:25:15.846166 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:25:15.846174 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:25:15.846182 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 17:25:15.846190 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 17:25:15.846198 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:25:15.846212 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:25:15.846221 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:25:15.846231 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:25:15.846239 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 17:25:15.846249 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:25:15.846257 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 17:25:15.846266 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 17:25:15.846276 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 17:25:15.846284 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:25:15.846294 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:25:15.846302 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:25:15.846310 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 17:25:15.846319 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:25:15.846329 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 17:25:15.846338 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:25:15.846370 systemd-journald[313]: Collecting audit messages is disabled.
Dec 12 17:25:15.846391 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:25:15.846419 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 17:25:15.846428 kernel: Bridge firewalling registered
Dec 12 17:25:15.846437 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 17:25:15.846445 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:25:15.846454 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:25:15.846462 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:25:15.846471 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:25:15.846479 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:25:15.846489 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:25:15.846497 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:25:15.846505 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 17:25:15.846514 systemd-journald[313]: Journal started
Dec 12 17:25:15.846533 systemd-journald[313]: Runtime Journal (/run/log/journal/e7ecadd67cea4e62908c6b554711c28b) is 8M, max 319.5M, 311.5M free.
Dec 12 17:25:15.790096 systemd-modules-load[314]: Inserted module 'overlay'
Dec 12 17:25:15.848069 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:25:15.805351 systemd-modules-load[314]: Inserted module 'br_netfilter'
Dec 12 17:25:15.859176 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:25:15.867492 systemd-tmpfiles[350]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 17:25:15.870419 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:25:15.871433 dracut-cmdline[347]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:25:15.877075 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:25:15.912172 systemd-resolved[380]: Positive Trust Anchors:
Dec 12 17:25:15.912196 systemd-resolved[380]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:25:15.912227 systemd-resolved[380]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:25:15.917934 systemd-resolved[380]: Defaulting to hostname 'linux'.
Dec 12 17:25:15.919358 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:25:15.920326 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:25:15.953412 kernel: SCSI subsystem initialized
Dec 12 17:25:15.957424 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 17:25:15.965442 kernel: iscsi: registered transport (tcp)
Dec 12 17:25:15.978429 kernel: iscsi: registered transport (qla4xxx)
Dec 12 17:25:15.978494 kernel: QLogic iSCSI HBA Driver
Dec 12 17:25:15.996377 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:25:16.013417 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:25:16.014788 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:25:16.061588 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:25:16.063885 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 17:25:16.128432 kernel: raid6: neonx8 gen() 15804 MB/s
Dec 12 17:25:16.145416 kernel: raid6: neonx4 gen() 15817 MB/s
Dec 12 17:25:16.162409 kernel: raid6: neonx2 gen() 13217 MB/s
Dec 12 17:25:16.179413 kernel: raid6: neonx1 gen() 10469 MB/s
Dec 12 17:25:16.196436 kernel: raid6: int64x8 gen() 6906 MB/s
Dec 12 17:25:16.213413 kernel: raid6: int64x4 gen() 7350 MB/s
Dec 12 17:25:16.230410 kernel: raid6: int64x2 gen() 6106 MB/s
Dec 12 17:25:16.247425 kernel: raid6: int64x1 gen() 4943 MB/s
Dec 12 17:25:16.247447 kernel: raid6: using algorithm neonx4 gen() 15817 MB/s
Dec 12 17:25:16.264455 kernel: raid6: .... xor() 12330 MB/s, rmw enabled
Dec 12 17:25:16.264504 kernel: raid6: using neon recovery algorithm
Dec 12 17:25:16.269621 kernel: xor: measuring software checksum speed
Dec 12 17:25:16.269639 kernel: 8regs : 21630 MB/sec
Dec 12 17:25:16.270762 kernel: 32regs : 21641 MB/sec
Dec 12 17:25:16.270777 kernel: arm64_neon : 28013 MB/sec
Dec 12 17:25:16.270786 kernel: xor: using function: arm64_neon (28013 MB/sec)
Dec 12 17:25:16.324429 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 17:25:16.331302 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:25:16.333876 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:25:16.359192 systemd-udevd[569]: Using default interface naming scheme 'v255'.
Dec 12 17:25:16.363327 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:25:16.365646 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 17:25:16.386970 dracut-pre-trigger[577]: rd.md=0: removing MD RAID activation
Dec 12 17:25:16.410928 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:25:16.413167 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:25:16.491473 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:25:16.495108 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 12 17:25:16.538439 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Dec 12 17:25:16.541494 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Dec 12 17:25:16.550013 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 12 17:25:16.550063 kernel: GPT:17805311 != 104857599
Dec 12 17:25:16.550074 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 12 17:25:16.550460 kernel: GPT:17805311 != 104857599
Dec 12 17:25:16.551766 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 12 17:25:16.551798 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:25:16.558526 kernel: ACPI: bus type USB registered
Dec 12 17:25:16.558570 kernel: usbcore: registered new interface driver usbfs
Dec 12 17:25:16.562810 kernel: usbcore: registered new interface driver hub
Dec 12 17:25:16.562862 kernel: usbcore: registered new device driver usb
Dec 12 17:25:16.584223 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:25:16.584507 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 12 17:25:16.585833 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 12 17:25:16.586177 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:25:16.588670 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:25:16.588849 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 12 17:25:16.588929 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 12 17:25:16.586374 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:25:16.592081 kernel: hub 1-0:1.0: USB hub found
Dec 12 17:25:16.592223 kernel: hub 1-0:1.0: 4 ports detected
Dec 12 17:25:16.592308 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 12 17:25:16.592172 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:25:16.595912 kernel: hub 2-0:1.0: USB hub found
Dec 12 17:25:16.596056 kernel: hub 2-0:1.0: 4 ports detected
Dec 12 17:25:16.595330 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:25:16.629761 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:25:16.631498 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:25:16.640368 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Dec 12 17:25:16.650318 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Dec 12 17:25:16.658331 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 17:25:16.664846 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Dec 12 17:25:16.665877 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Dec 12 17:25:16.668344 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:25:16.670298 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:25:16.672132 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:25:16.674627 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 12 17:25:16.676191 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 12 17:25:16.696943 disk-uuid[668]: Primary Header is updated.
Dec 12 17:25:16.696943 disk-uuid[668]: Secondary Entries is updated.
Dec 12 17:25:16.696943 disk-uuid[668]: Secondary Header is updated.
Dec 12 17:25:16.701789 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:25:16.705450 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:25:16.834463 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 12 17:25:16.964776 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Dec 12 17:25:16.964860 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 12 17:25:16.965095 kernel: usbcore: registered new interface driver usbhid
Dec 12 17:25:16.965679 kernel: usbhid: USB HID core driver
Dec 12 17:25:17.071438 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Dec 12 17:25:17.196418 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Dec 12 17:25:17.249431 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Dec 12 17:25:17.717121 disk-uuid[671]: The operation has completed successfully.
Dec 12 17:25:17.718169 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:25:17.755089 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 12 17:25:17.756121 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 12 17:25:17.786981 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Dec 12 17:25:17.811137 sh[691]: Success
Dec 12 17:25:17.825079 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 12 17:25:17.825120 kernel: device-mapper: uevent: version 1.0.3
Dec 12 17:25:17.825131 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 12 17:25:17.832416 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 12 17:25:17.883331 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Dec 12 17:25:17.885850 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Dec 12 17:25:17.902099 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Dec 12 17:25:17.918414 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (704)
Dec 12 17:25:17.920787 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248
Dec 12 17:25:17.920809 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:25:17.933452 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 12 17:25:17.933520 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 12 17:25:17.935596 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Dec 12 17:25:17.936833 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:25:17.937887 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 12 17:25:17.938712 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 12 17:25:17.941530 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 12 17:25:17.969432 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (736)
Dec 12 17:25:17.971791 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:25:17.971825 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:25:17.977016 kernel: BTRFS info (device vda6): turning on async discard
Dec 12 17:25:17.977061 kernel: BTRFS info (device vda6): enabling free space tree
Dec 12 17:25:17.982419 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:25:17.983623 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 12 17:25:17.985721 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 12 17:25:18.044001 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:25:18.047637 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 17:25:18.090672 systemd-networkd[875]: lo: Link UP
Dec 12 17:25:18.090686 systemd-networkd[875]: lo: Gained carrier
Dec 12 17:25:18.091663 systemd-networkd[875]: Enumeration completed
Dec 12 17:25:18.092097 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:25:18.092100 systemd-networkd[875]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:25:18.092308 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 17:25:18.092996 systemd-networkd[875]: eth0: Link UP
Dec 12 17:25:18.093091 systemd-networkd[875]: eth0: Gained carrier
Dec 12 17:25:18.093102 systemd-networkd[875]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:25:18.093829 systemd[1]: Reached target network.target - Network.
Dec 12 17:25:18.113694 systemd-networkd[875]: eth0: DHCPv4 address 10.0.17.31/25, gateway 10.0.17.1 acquired from 10.0.17.1
Dec 12 17:25:18.164387 ignition[793]: Ignition 2.22.0
Dec 12 17:25:18.164420 ignition[793]: Stage: fetch-offline
Dec 12 17:25:18.164457 ignition[793]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:18.166321 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:25:18.164465 ignition[793]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:18.164544 ignition[793]: parsed url from cmdline: ""
Dec 12 17:25:18.169063 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 12 17:25:18.164548 ignition[793]: no config URL provided
Dec 12 17:25:18.164552 ignition[793]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:25:18.164559 ignition[793]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:25:18.164563 ignition[793]: failed to fetch config: resource requires networking
Dec 12 17:25:18.164885 ignition[793]: Ignition finished successfully
Dec 12 17:25:18.198492 ignition[889]: Ignition 2.22.0
Dec 12 17:25:18.198508 ignition[889]: Stage: fetch
Dec 12 17:25:18.198628 ignition[889]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:18.198637 ignition[889]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:18.198708 ignition[889]: parsed url from cmdline: ""
Dec 12 17:25:18.198711 ignition[889]: no config URL provided
Dec 12 17:25:18.198716 ignition[889]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:25:18.198722 ignition[889]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:25:18.198948 ignition[889]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Dec 12 17:25:18.199319 ignition[889]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Dec 12 17:25:18.199339 ignition[889]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Dec 12 17:25:18.481085 ignition[889]: GET result: OK
Dec 12 17:25:18.481189 ignition[889]: parsing config with SHA512: faec81b05307b2dd61f0755b1a44f18485a9afde7a4bce4bf5839d2c9f94d5eb361f2f05b665f34fc670ab664dcfccfa368a1bc0c0e5e8d4a22bf126e3386b73
Dec 12 17:25:18.485086 unknown[889]: fetched base config from "system"
Dec 12 17:25:18.485098 unknown[889]: fetched base config from "system"
Dec 12 17:25:18.485441 ignition[889]: fetch: fetch complete
Dec 12 17:25:18.485104 unknown[889]: fetched user config from "openstack"
Dec 12 17:25:18.485447 ignition[889]: fetch: fetch passed
Dec 12 17:25:18.487313 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 12 17:25:18.485488 ignition[889]: Ignition finished successfully
Dec 12 17:25:18.489862 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 12 17:25:18.525371 ignition[897]: Ignition 2.22.0
Dec 12 17:25:18.525392 ignition[897]: Stage: kargs
Dec 12 17:25:18.525617 ignition[897]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:18.525627 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:18.528824 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 12 17:25:18.526439 ignition[897]: kargs: kargs passed
Dec 12 17:25:18.526484 ignition[897]: Ignition finished successfully
Dec 12 17:25:18.530872 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 12 17:25:18.560021 ignition[905]: Ignition 2.22.0
Dec 12 17:25:18.560036 ignition[905]: Stage: disks
Dec 12 17:25:18.560164 ignition[905]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:18.560173 ignition[905]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:18.560928 ignition[905]: disks: disks passed
Dec 12 17:25:18.563079 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 12 17:25:18.560978 ignition[905]: Ignition finished successfully
Dec 12 17:25:18.564893 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 12 17:25:18.566065 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 12 17:25:18.567630 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:25:18.568943 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:25:18.570433 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:25:18.573030 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 12 17:25:18.619083 systemd-fsck[915]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Dec 12 17:25:18.623776 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 12 17:25:18.625752 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 12 17:25:18.731446 kernel: EXT4-fs (vda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none.
Dec 12 17:25:18.731958 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 12 17:25:18.733103 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:25:18.735957 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:25:18.738203 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 12 17:25:18.739846 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Dec 12 17:25:18.750204 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Dec 12 17:25:18.751317 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 12 17:25:18.751352 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:25:18.753731 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 12 17:25:18.755690 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 12 17:25:18.766418 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (923)
Dec 12 17:25:18.768579 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:25:18.768594 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:25:18.774559 kernel: BTRFS info (device vda6): turning on async discard
Dec 12 17:25:18.774584 kernel: BTRFS info (device vda6): enabling free space tree
Dec 12 17:25:18.776723 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:25:18.824454 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:25:18.842905 initrd-setup-root[952]: cut: /sysroot/etc/passwd: No such file or directory
Dec 12 17:25:18.847868 initrd-setup-root[959]: cut: /sysroot/etc/group: No such file or directory
Dec 12 17:25:18.853218 initrd-setup-root[966]: cut: /sysroot/etc/shadow: No such file or directory
Dec 12 17:25:18.857927 initrd-setup-root[973]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 12 17:25:18.958309 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 12 17:25:18.960608 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 12 17:25:18.962044 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 12 17:25:18.978340 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 12 17:25:18.980167 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:25:19.000476 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 12 17:25:19.011670 ignition[1040]: INFO : Ignition 2.22.0
Dec 12 17:25:19.011670 ignition[1040]: INFO : Stage: mount
Dec 12 17:25:19.013258 ignition[1040]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:19.013258 ignition[1040]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:19.013258 ignition[1040]: INFO : mount: mount passed
Dec 12 17:25:19.013258 ignition[1040]: INFO : Ignition finished successfully
Dec 12 17:25:19.014341 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 12 17:25:19.440769 systemd-networkd[875]: eth0: Gained IPv6LL
Dec 12 17:25:19.867438 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:25:21.871454 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:25:25.877461 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:25:25.886703 coreos-metadata[925]: Dec 12 17:25:25.886 WARN failed to locate config-drive, using the metadata service API instead
Dec 12 17:25:25.903387 coreos-metadata[925]: Dec 12 17:25:25.903 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 12 17:25:26.029301 coreos-metadata[925]: Dec 12 17:25:26.029 INFO Fetch successful
Dec 12 17:25:26.030268 coreos-metadata[925]: Dec 12 17:25:26.029 INFO wrote hostname ci-4459-2-2-2-0ba9591bbe to /sysroot/etc/hostname
Dec 12 17:25:26.032207 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Dec 12 17:25:26.032315 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Dec 12 17:25:26.034545 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 12 17:25:26.057336 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:25:26.102463 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1059)
Dec 12 17:25:26.105612 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:25:26.105647 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:25:26.110445 kernel: BTRFS info (device vda6): turning on async discard
Dec 12 17:25:26.110487 kernel: BTRFS info (device vda6): enabling free space tree
Dec 12 17:25:26.112729 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:25:26.143441 ignition[1077]: INFO : Ignition 2.22.0
Dec 12 17:25:26.143441 ignition[1077]: INFO : Stage: files
Dec 12 17:25:26.144994 ignition[1077]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:26.144994 ignition[1077]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:26.144994 ignition[1077]: DEBUG : files: compiled without relabeling support, skipping
Dec 12 17:25:26.147862 ignition[1077]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 12 17:25:26.147862 ignition[1077]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 12 17:25:26.150338 ignition[1077]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 12 17:25:26.150338 ignition[1077]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 12 17:25:26.150338 ignition[1077]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 12 17:25:26.148996 unknown[1077]: wrote ssh authorized keys file for user: core
Dec 12 17:25:26.154517 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Dec 12 17:25:26.154517 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Dec 12 17:25:26.209879 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 12 17:25:26.425437 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Dec 12 17:25:26.425437 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Dec 12 17:25:26.429450 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Dec 12 17:25:26.542862 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 12 17:25:27.225854 ignition[1077]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Dec 12 17:25:27.225854 ignition[1077]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 12 17:25:27.229541 ignition[1077]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:25:27.232253 ignition[1077]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:25:27.232253 ignition[1077]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 12 17:25:27.232253 ignition[1077]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Dec 12 17:25:27.236425 ignition[1077]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Dec 12 17:25:27.236425 ignition[1077]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:25:27.236425 ignition[1077]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:25:27.236425 ignition[1077]: INFO : files: files passed
Dec 12 17:25:27.236425 ignition[1077]: INFO : Ignition finished successfully
Dec 12 17:25:27.235527 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 12 17:25:27.239527 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 12 17:25:27.242676 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 12 17:25:27.261174 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 12 17:25:27.261291 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 12 17:25:27.266084 initrd-setup-root-after-ignition[1108]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:25:27.266084 initrd-setup-root-after-ignition[1108]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:25:27.268688 initrd-setup-root-after-ignition[1112]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:25:27.268353 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:25:27.269800 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 12 17:25:27.272305 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 12 17:25:27.300753 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 12 17:25:27.300889 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 12 17:25:27.302773 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 12 17:25:27.304267 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 12 17:25:27.305844 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 12 17:25:27.306685 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 12 17:25:27.341383 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:25:27.343684 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 12 17:25:27.362842 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:25:27.363933 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:25:27.365598 systemd[1]: Stopped target timers.target - Timer Units.
Dec 12 17:25:27.367077 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 12 17:25:27.367206 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:25:27.369276 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 12 17:25:27.370983 systemd[1]: Stopped target basic.target - Basic System.
Dec 12 17:25:27.372362 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 12 17:25:27.373874 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:25:27.375387 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 12 17:25:27.377197 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:25:27.378849 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 12 17:25:27.380332 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:25:27.382075 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 12 17:25:27.383658 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 12 17:25:27.385105 systemd[1]: Stopped target swap.target - Swaps.
Dec 12 17:25:27.386332 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 12 17:25:27.386480 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:25:27.388524 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:25:27.390131 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:25:27.391722 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 12 17:25:27.391846 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:25:27.393430 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 12 17:25:27.393547 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:25:27.395971 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 12 17:25:27.396096 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:25:27.397645 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 12 17:25:27.397740 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 12 17:25:27.399922 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 12 17:25:27.402198 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 12 17:25:27.403627 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 12 17:25:27.403739 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:25:27.405345 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 12 17:25:27.405461 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:25:27.409981 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 12 17:25:27.414619 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 12 17:25:27.425036 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 12 17:25:27.430499 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 12 17:25:27.431369 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 12 17:25:27.433141 ignition[1132]: INFO : Ignition 2.22.0
Dec 12 17:25:27.433141 ignition[1132]: INFO : Stage: umount
Dec 12 17:25:27.433141 ignition[1132]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:25:27.433141 ignition[1132]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:25:27.433141 ignition[1132]: INFO : umount: umount passed
Dec 12 17:25:27.433141 ignition[1132]: INFO : Ignition finished successfully
Dec 12 17:25:27.434938 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 12 17:25:27.435044 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 12 17:25:27.436166 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 12 17:25:27.436208 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 12 17:25:27.437670 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 12 17:25:27.437710 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 12 17:25:27.438953 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 12 17:25:27.438995 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 12 17:25:27.440240 systemd[1]: Stopped target network.target - Network.
Dec 12 17:25:27.441611 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 12 17:25:27.441663 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:25:27.443200 systemd[1]: Stopped target paths.target - Path Units.
Dec 12 17:25:27.444470 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 12 17:25:27.445473 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:25:27.446871 systemd[1]: Stopped target slices.target - Slice Units.
Dec 12 17:25:27.448137 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 12 17:25:27.449524 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 12 17:25:27.449567 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:25:27.450877 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 12 17:25:27.450916 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:25:27.452222 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 12 17:25:27.452282 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 12 17:25:27.453743 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 12 17:25:27.453785 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 12 17:25:27.455208 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 12 17:25:27.455254 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 12 17:25:27.456729 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 12 17:25:27.458228 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 12 17:25:27.467246 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 12 17:25:27.467378 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 12 17:25:27.471245 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Dec 12 17:25:27.471461 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 12 17:25:27.471560 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 12 17:25:27.474158 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Dec 12 17:25:27.475316 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 12 17:25:27.476953 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 12 17:25:27.477012 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:25:27.479461 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 12 17:25:27.480863 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 12 17:25:27.480921 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:25:27.482781 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 12 17:25:27.482821 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:25:27.485083 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 12 17:25:27.485124 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:25:27.486786 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 12 17:25:27.486835 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:25:27.489296 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:25:27.492898 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 12 17:25:27.492957 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:25:27.500548 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 12 17:25:27.500689 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 12 17:25:27.505004 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 12 17:25:27.505144 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:25:27.507109 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 12 17:25:27.507147 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:25:27.508528 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 12 17:25:27.508554 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:25:27.510014 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 12 17:25:27.510066 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:25:27.512299 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 12 17:25:27.512342 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:25:27.514670 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 12 17:25:27.514716 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:25:27.517856 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 12 17:25:27.518746 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 12 17:25:27.518801 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:25:27.521440 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 12 17:25:27.521492 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:25:27.524080 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 12 17:25:27.524124 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:25:27.527129 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 12 17:25:27.527170 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:25:27.529239 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:25:27.529281 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:25:27.533069 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Dec 12 17:25:27.533116 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Dec 12 17:25:27.533145 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 12 17:25:27.533175 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:25:27.535164 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 12 17:25:27.536455 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 12 17:25:27.537979 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 12 17:25:27.540436 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 12 17:25:27.573833 systemd[1]: Switching root.
Dec 12 17:25:27.604268 systemd-journald[313]: Journal stopped
Dec 12 17:25:28.429949 systemd-journald[313]: Received SIGTERM from PID 1 (systemd).
Dec 12 17:25:28.430035 kernel: SELinux: policy capability network_peer_controls=1
Dec 12 17:25:28.430051 kernel: SELinux: policy capability open_perms=1
Dec 12 17:25:28.430064 kernel: SELinux: policy capability extended_socket_class=1
Dec 12 17:25:28.430077 kernel: SELinux: policy capability always_check_network=0
Dec 12 17:25:28.430086 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 12 17:25:28.430099 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 12 17:25:28.430109 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 12 17:25:28.430122 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 12 17:25:28.430132 kernel: SELinux: policy capability userspace_initial_context=0
Dec 12 17:25:28.430141 kernel: audit: type=1403 audit(1765560327.748:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 12 17:25:28.430155 systemd[1]: Successfully loaded SELinux policy in 51.099ms.
Dec 12 17:25:28.430172 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.680ms.
Dec 12 17:25:28.430190 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:25:28.430201 systemd[1]: Detected virtualization kvm.
Dec 12 17:25:28.430211 systemd[1]: Detected architecture arm64.
Dec 12 17:25:28.430220 systemd[1]: Detected first boot.
Dec 12 17:25:28.430231 systemd[1]: Hostname set to .
Dec 12 17:25:28.430241 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 17:25:28.430251 zram_generator::config[1177]: No configuration found.
Dec 12 17:25:28.430262 kernel: NET: Registered PF_VSOCK protocol family
Dec 12 17:25:28.430271 systemd[1]: Populated /etc with preset unit settings.
Dec 12 17:25:28.430282 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Dec 12 17:25:28.430292 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 12 17:25:28.430302 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 12 17:25:28.430314 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 12 17:25:28.430331 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 12 17:25:28.430344 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 12 17:25:28.430356 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 12 17:25:28.430368 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 12 17:25:28.430380 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 12 17:25:28.430391 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 12 17:25:28.430414 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 12 17:25:28.430426 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 12 17:25:28.430437 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:25:28.430447 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:25:28.430457 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 12 17:25:28.430468 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 12 17:25:28.430478 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 12 17:25:28.430488 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:25:28.430499 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 12 17:25:28.430510 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:25:28.430521 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:25:28.430531 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 12 17:25:28.430542 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 12 17:25:28.430557 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:25:28.430568 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 12 17:25:28.430579 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:25:28.430593 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:25:28.430605 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:25:28.430615 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:25:28.430626 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 12 17:25:28.430636 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 12 17:25:28.430647 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 12 17:25:28.430659 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:25:28.430670 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:25:28.430681 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:25:28.430691 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 12 17:25:28.430703 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 12 17:25:28.430714 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 12 17:25:28.430724 systemd[1]: Mounting media.mount - External Media Directory...
Dec 12 17:25:28.430735 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 12 17:25:28.430745 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 12 17:25:28.430757 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 12 17:25:28.430781 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 12 17:25:28.430865 systemd[1]: Reached target machines.target - Containers.
Dec 12 17:25:28.430886 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 12 17:25:28.430898 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:25:28.430908 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:25:28.430921 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 12 17:25:28.430934 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:25:28.430950 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 17:25:28.430960 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:25:28.430971 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 12 17:25:28.430981 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:25:28.430993 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 12 17:25:28.431004 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 12 17:25:28.431015 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 12 17:25:28.431026 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 12 17:25:28.431038 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 12 17:25:28.431049 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:25:28.431060 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:25:28.431071 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:25:28.431080 kernel: fuse: init (API version 7.41)
Dec 12 17:25:28.431090 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:25:28.431102 kernel: loop: module loaded
Dec 12 17:25:28.431112 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 12 17:25:28.431122 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 12 17:25:28.431155 systemd-journald[1248]: Collecting audit messages is disabled.
Dec 12 17:25:28.431179 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:25:28.431190 kernel: ACPI: bus type drm_connector registered
Dec 12 17:25:28.431200 systemd-journald[1248]: Journal started
Dec 12 17:25:28.431222 systemd-journald[1248]: Runtime Journal (/run/log/journal/e7ecadd67cea4e62908c6b554711c28b) is 8M, max 319.5M, 311.5M free.
Dec 12 17:25:28.431258 systemd[1]: verity-setup.service: Deactivated successfully.
Dec 12 17:25:28.228264 systemd[1]: Queued start job for default target multi-user.target.
Dec 12 17:25:28.253871 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Dec 12 17:25:28.432683 systemd[1]: Stopped verity-setup.service.
Dec 12 17:25:28.254302 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 12 17:25:28.438474 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:25:28.438017 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 12 17:25:28.439059 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 12 17:25:28.440257 systemd[1]: Mounted media.mount - External Media Directory.
Dec 12 17:25:28.441347 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 12 17:25:28.442518 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 12 17:25:28.443579 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 12 17:25:28.444690 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 12 17:25:28.447438 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:25:28.448744 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 12 17:25:28.448910 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 12 17:25:28.450756 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:25:28.450904 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:25:28.452042 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 17:25:28.452196 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 17:25:28.453367 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:25:28.453557 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:25:28.454744 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 12 17:25:28.454904 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 12 17:25:28.456015 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:25:28.456186 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:25:28.457421 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:25:28.458586 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:25:28.459835 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 12 17:25:28.461140 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 12 17:25:28.471220 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:25:28.473503 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 12 17:25:28.475235 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 12 17:25:28.476328 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 12 17:25:28.476368 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:25:28.478145 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 12 17:25:28.490579 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 12 17:25:28.491654 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:25:28.492831 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 12 17:25:28.494904 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 12 17:25:28.496043 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 17:25:28.497190 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 12 17:25:28.499153 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 17:25:28.501967 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:25:28.508620 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 12 17:25:28.512579 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:25:28.515215 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 12 17:25:28.516669 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 12 17:25:28.517880 systemd-journald[1248]: Time spent on flushing to /var/log/journal/e7ecadd67cea4e62908c6b554711c28b is 29.113ms for 1689 entries.
Dec 12 17:25:28.517880 systemd-journald[1248]: System Journal (/var/log/journal/e7ecadd67cea4e62908c6b554711c28b) is 8M, max 584.8M, 576.8M free.
Dec 12 17:25:28.553540 systemd-journald[1248]: Received client request to flush runtime journal.
Dec 12 17:25:28.553591 kernel: loop0: detected capacity change from 0 to 207008
Dec 12 17:25:28.519443 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 12 17:25:28.520972 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:25:28.527139 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 12 17:25:28.529863 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 12 17:25:28.544665 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:25:28.548719 systemd-tmpfiles[1297]: ACLs are not supported, ignoring.
Dec 12 17:25:28.548732 systemd-tmpfiles[1297]: ACLs are not supported, ignoring.
Dec 12 17:25:28.552099 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:25:28.555975 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 12 17:25:28.558636 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 12 17:25:28.566593 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 12 17:25:28.569734 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 12 17:25:28.585451 kernel: loop1: detected capacity change from 0 to 1632
Dec 12 17:25:28.606576 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 12 17:25:28.609634 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:25:28.610416 kernel: loop2: detected capacity change from 0 to 119840
Dec 12 17:25:28.633327 systemd-tmpfiles[1318]: ACLs are not supported, ignoring.
Dec 12 17:25:28.633348 systemd-tmpfiles[1318]: ACLs are not supported, ignoring.
Dec 12 17:25:28.638454 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:25:28.649441 kernel: loop3: detected capacity change from 0 to 100632
Dec 12 17:25:28.691471 kernel: loop4: detected capacity change from 0 to 207008
Dec 12 17:25:28.711443 kernel: loop5: detected capacity change from 0 to 1632
Dec 12 17:25:28.718431 kernel: loop6: detected capacity change from 0 to 119840
Dec 12 17:25:28.732422 kernel: loop7: detected capacity change from 0 to 100632
Dec 12 17:25:28.745435 (sd-merge)[1324]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'.
Dec 12 17:25:28.745917 (sd-merge)[1324]: Merged extensions into '/usr'.
Dec 12 17:25:28.749557 systemd[1]: Reload requested from client PID 1296 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 12 17:25:28.749678 systemd[1]: Reloading...
Dec 12 17:25:28.795537 zram_generator::config[1350]: No configuration found.
Dec 12 17:25:28.957947 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 12 17:25:28.958517 systemd[1]: Reloading finished in 208 ms.
Dec 12 17:25:28.977466 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 12 17:25:28.978944 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 12 17:25:28.994053 systemd[1]: Starting ensure-sysext.service...
Dec 12 17:25:28.996019 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:25:28.998678 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:25:29.007185 systemd[1]: Reload requested from client PID 1389 ('systemctl') (unit ensure-sysext.service)...
Dec 12 17:25:29.007204 systemd[1]: Reloading...
Dec 12 17:25:29.014287 systemd-tmpfiles[1390]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 12 17:25:29.014320 systemd-tmpfiles[1390]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 12 17:25:29.014946 systemd-tmpfiles[1390]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 12 17:25:29.015223 systemd-tmpfiles[1390]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 12 17:25:29.015937 systemd-tmpfiles[1390]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 12 17:25:29.016266 systemd-tmpfiles[1390]: ACLs are not supported, ignoring.
Dec 12 17:25:29.016379 systemd-tmpfiles[1390]: ACLs are not supported, ignoring.
Dec 12 17:25:29.023662 systemd-udevd[1391]: Using default interface naming scheme 'v255'.
Dec 12 17:25:29.025260 systemd-tmpfiles[1390]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 17:25:29.025270 systemd-tmpfiles[1390]: Skipping /boot
Dec 12 17:25:29.033059 systemd-tmpfiles[1390]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 17:25:29.033165 systemd-tmpfiles[1390]: Skipping /boot
Dec 12 17:25:29.053428 zram_generator::config[1418]: No configuration found.
Dec 12 17:25:29.216539 systemd[1]: Reloading finished in 208 ms.
Dec 12 17:25:29.223269 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:25:29.227016 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:25:29.240051 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Dec 12 17:25:29.252428 ldconfig[1291]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 12 17:25:29.253354 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 17:25:29.258383 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 12 17:25:29.263180 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:25:29.266878 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 17:25:29.269234 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:25:29.276976 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:25:29.279834 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:25:29.282020 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:25:29.283145 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:25:29.284243 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 17:25:29.285526 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:25:29.286686 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 12 17:25:29.290220 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:25:29.297053 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 17:25:29.300850 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 12 17:25:29.305710 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:25:29.305925 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:25:29.308490 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:25:29.308721 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Dec 12 17:25:29.312113 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:25:29.312304 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:25:29.321207 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 12 17:25:29.326965 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:25:29.333458 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 17:25:29.334608 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:25:29.337543 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:25:29.341125 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:25:29.344078 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:25:29.346873 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 12 17:25:29.349581 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:25:29.349760 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:25:29.349983 systemd[1]: Reached target time-set.target - System Time Set. Dec 12 17:25:29.352770 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:25:29.357619 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:25:29.360166 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:25:29.360460 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:25:29.363220 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Dec 12 17:25:29.363392 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:25:29.368917 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 12 17:25:29.368991 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 12 17:25:29.370330 systemd[1]: Finished ensure-sysext.service. Dec 12 17:25:29.372150 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 12 17:25:29.374149 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:25:29.374388 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:25:29.380157 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 12 17:25:29.381488 kernel: PTP clock support registered Dec 12 17:25:29.384539 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 12 17:25:29.384746 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 12 17:25:29.386660 augenrules[1545]: No rules Dec 12 17:25:29.392667 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:25:29.393096 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:25:29.396830 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 12 17:25:29.403447 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:25:29.403545 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:25:29.405992 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 12 17:25:29.412715 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Dec 12 17:25:29.415275 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 12 17:25:29.423434 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Dec 12 17:25:29.426410 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 12 17:25:29.426473 kernel: [drm] features: -context_init Dec 12 17:25:29.428605 kernel: [drm] number of scanouts: 1 Dec 12 17:25:29.430710 kernel: [drm] number of cap sets: 0 Dec 12 17:25:29.447673 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:25:29.449828 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 12 17:25:29.456454 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Dec 12 17:25:29.460560 kernel: Console: switching to colour frame buffer device 160x50 Dec 12 17:25:29.464411 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 12 17:25:29.474520 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:25:29.474766 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:25:29.477336 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:25:29.479627 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:25:29.493842 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 12 17:25:29.543808 systemd-networkd[1510]: lo: Link UP Dec 12 17:25:29.543818 systemd-networkd[1510]: lo: Gained carrier Dec 12 17:25:29.544865 systemd-networkd[1510]: Enumeration completed Dec 12 17:25:29.545330 systemd-networkd[1510]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Dec 12 17:25:29.545341 systemd-networkd[1510]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:25:29.545910 systemd-networkd[1510]: eth0: Link UP Dec 12 17:25:29.546040 systemd-networkd[1510]: eth0: Gained carrier Dec 12 17:25:29.546059 systemd-networkd[1510]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:25:29.546481 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:25:29.548642 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 17:25:29.550995 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 17:25:29.558517 systemd-networkd[1510]: eth0: DHCPv4 address 10.0.17.31/25, gateway 10.0.17.1 acquired from 10.0.17.1 Dec 12 17:25:29.564190 systemd-resolved[1514]: Positive Trust Anchors: Dec 12 17:25:29.564211 systemd-resolved[1514]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 17:25:29.564242 systemd-resolved[1514]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 17:25:29.564961 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:25:29.568072 systemd-resolved[1514]: Using system hostname 'ci-4459-2-2-2-0ba9591bbe'. 
Dec 12 17:25:29.568674 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 17:25:29.569785 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 17:25:29.571138 systemd[1]: Reached target network.target - Network. Dec 12 17:25:29.571951 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:25:29.572990 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 17:25:29.573940 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 12 17:25:29.574958 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 12 17:25:29.576142 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 12 17:25:29.577281 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 12 17:25:29.578377 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 12 17:25:29.579331 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 12 17:25:29.579367 systemd[1]: Reached target paths.target - Path Units. Dec 12 17:25:29.580172 systemd[1]: Reached target timers.target - Timer Units. Dec 12 17:25:29.582347 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 12 17:25:29.584511 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 12 17:25:29.587045 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 12 17:25:29.588280 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 12 17:25:29.589437 systemd[1]: Reached target ssh-access.target - SSH Access Available. 
Dec 12 17:25:29.593112 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 12 17:25:29.594298 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 12 17:25:29.595908 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 12 17:25:29.596915 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 17:25:29.597721 systemd[1]: Reached target basic.target - Basic System. Dec 12 17:25:29.598484 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 12 17:25:29.598511 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 12 17:25:29.600749 systemd[1]: Starting chronyd.service - NTP client/server... Dec 12 17:25:29.603268 systemd[1]: Starting containerd.service - containerd container runtime... Dec 12 17:25:29.605216 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 12 17:25:29.609528 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 12 17:25:29.611145 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 12 17:25:29.611438 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:29.614154 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 12 17:25:29.615879 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 12 17:25:29.616775 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 12 17:25:29.620537 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 12 17:25:29.622464 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
Dec 12 17:25:29.622777 jq[1603]: false Dec 12 17:25:29.624616 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 12 17:25:29.626321 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 12 17:25:29.631424 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 12 17:25:29.633154 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 12 17:25:29.634283 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 12 17:25:29.635532 systemd[1]: Starting update-engine.service - Update Engine... Dec 12 17:25:29.638345 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 12 17:25:29.641904 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 12 17:25:29.643308 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 12 17:25:29.643536 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 12 17:25:29.644517 extend-filesystems[1604]: Found /dev/vda6 Dec 12 17:25:29.646885 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 12 17:25:29.647786 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 12 17:25:29.653001 jq[1616]: true Dec 12 17:25:29.653182 extend-filesystems[1604]: Found /dev/vda9 Dec 12 17:25:29.655563 systemd[1]: motdgen.service: Deactivated successfully. Dec 12 17:25:29.656494 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Dec 12 17:25:29.659823 extend-filesystems[1604]: Checking size of /dev/vda9 Dec 12 17:25:29.669732 chronyd[1596]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 12 17:25:29.671745 (ntainerd)[1633]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 12 17:25:29.676985 jq[1630]: true Dec 12 17:25:29.672688 chronyd[1596]: Loaded seccomp filter (level 2) Dec 12 17:25:29.672806 systemd[1]: Started chronyd.service - NTP client/server. Dec 12 17:25:29.679073 tar[1623]: linux-arm64/LICENSE Dec 12 17:25:29.679285 tar[1623]: linux-arm64/helm Dec 12 17:25:29.681515 extend-filesystems[1604]: Resized partition /dev/vda9 Dec 12 17:25:29.685427 extend-filesystems[1645]: resize2fs 1.47.3 (8-Jul-2025) Dec 12 17:25:29.699423 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 12 17:25:29.718046 dbus-daemon[1599]: [system] SELinux support is enabled Dec 12 17:25:29.718204 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 12 17:25:29.721693 update_engine[1615]: I20251212 17:25:29.719790 1615 main.cc:92] Flatcar Update Engine starting Dec 12 17:25:29.722951 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 12 17:25:29.722982 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 12 17:25:29.724538 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 12 17:25:29.724558 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Dec 12 17:25:29.729890 update_engine[1615]: I20251212 17:25:29.727068 1615 update_check_scheduler.cc:74] Next update check in 9m19s Dec 12 17:25:29.727809 systemd[1]: Started update-engine.service - Update Engine. Dec 12 17:25:29.732548 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 12 17:25:29.760453 systemd-logind[1612]: New seat seat0. Dec 12 17:25:29.802912 systemd-logind[1612]: Watching system buttons on /dev/input/event0 (Power Button) Dec 12 17:25:29.802940 systemd-logind[1612]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 12 17:25:29.803421 systemd[1]: Started systemd-logind.service - User Login Management. Dec 12 17:25:29.815298 locksmithd[1650]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 12 17:25:29.834403 bash[1662]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:25:29.836724 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 12 17:25:29.840731 systemd[1]: Starting sshkeys.service... Dec 12 17:25:29.864283 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 12 17:25:29.867266 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Dec 12 17:25:29.879327 containerd[1633]: time="2025-12-12T17:25:29Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 12 17:25:29.882404 containerd[1633]: time="2025-12-12T17:25:29.880264520Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 12 17:25:29.885413 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:29.895314 containerd[1633]: time="2025-12-12T17:25:29.895269640Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.32µs" Dec 12 17:25:29.895314 containerd[1633]: time="2025-12-12T17:25:29.895308440Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 12 17:25:29.895467 containerd[1633]: time="2025-12-12T17:25:29.895328200Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 12 17:25:29.895522 containerd[1633]: time="2025-12-12T17:25:29.895498880Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 12 17:25:29.895546 containerd[1633]: time="2025-12-12T17:25:29.895521360Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 12 17:25:29.895564 containerd[1633]: time="2025-12-12T17:25:29.895546680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895618 containerd[1633]: time="2025-12-12T17:25:29.895600920Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895618 containerd[1633]: time="2025-12-12T17:25:29.895615320Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895864 containerd[1633]: time="2025-12-12T17:25:29.895841240Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895889 containerd[1633]: time="2025-12-12T17:25:29.895861640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895907 containerd[1633]: time="2025-12-12T17:25:29.895873000Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895907 containerd[1633]: time="2025-12-12T17:25:29.895896400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 12 17:25:29.895987 containerd[1633]: time="2025-12-12T17:25:29.895967440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 12 17:25:29.896170 containerd[1633]: time="2025-12-12T17:25:29.896148480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:25:29.896201 containerd[1633]: time="2025-12-12T17:25:29.896183720Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:25:29.896201 containerd[1633]: time="2025-12-12T17:25:29.896195280Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 12 17:25:29.896248 containerd[1633]: time="2025-12-12T17:25:29.896234440Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 12 17:25:29.896511 containerd[1633]: time="2025-12-12T17:25:29.896490600Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 12 17:25:29.896579 containerd[1633]: time="2025-12-12T17:25:29.896561080Z" level=info msg="metadata content store policy set" policy=shared Dec 12 17:25:29.932912 containerd[1633]: time="2025-12-12T17:25:29.932854400Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 12 17:25:29.933061 containerd[1633]: time="2025-12-12T17:25:29.933035480Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 12 17:25:29.934450 containerd[1633]: time="2025-12-12T17:25:29.934417440Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 12 17:25:29.934479 containerd[1633]: time="2025-12-12T17:25:29.934464280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 12 17:25:29.934498 containerd[1633]: time="2025-12-12T17:25:29.934484720Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 12 17:25:29.934524 containerd[1633]: time="2025-12-12T17:25:29.934496880Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 12 17:25:29.934524 containerd[1633]: time="2025-12-12T17:25:29.934510080Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 12 17:25:29.934560 containerd[1633]: time="2025-12-12T17:25:29.934523200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 12 17:25:29.934560 containerd[1633]: time="2025-12-12T17:25:29.934537080Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 12 17:25:29.934560 containerd[1633]: time="2025-12-12T17:25:29.934549080Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 12 17:25:29.934612 containerd[1633]: time="2025-12-12T17:25:29.934558880Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 12 17:25:29.934612 containerd[1633]: time="2025-12-12T17:25:29.934585240Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 12 17:25:29.934785 containerd[1633]: time="2025-12-12T17:25:29.934760080Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 12 17:25:29.934810 containerd[1633]: time="2025-12-12T17:25:29.934795360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 12 17:25:29.934829 containerd[1633]: time="2025-12-12T17:25:29.934819640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 12 17:25:29.934850 containerd[1633]: time="2025-12-12T17:25:29.934831320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 12 17:25:29.934850 containerd[1633]: time="2025-12-12T17:25:29.934841920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 12 17:25:29.934886 containerd[1633]: time="2025-12-12T17:25:29.934851760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 12 17:25:29.934886 containerd[1633]: time="2025-12-12T17:25:29.934863560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 12 17:25:29.934886 containerd[1633]: time="2025-12-12T17:25:29.934873720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 12 
17:25:29.934943 containerd[1633]: time="2025-12-12T17:25:29.934887120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 12 17:25:29.934943 containerd[1633]: time="2025-12-12T17:25:29.934901240Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 12 17:25:29.935052 containerd[1633]: time="2025-12-12T17:25:29.934911320Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 12 17:25:29.935326 containerd[1633]: time="2025-12-12T17:25:29.935305360Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 12 17:25:29.935353 containerd[1633]: time="2025-12-12T17:25:29.935328440Z" level=info msg="Start snapshots syncer" Dec 12 17:25:29.935371 containerd[1633]: time="2025-12-12T17:25:29.935360920Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 12 17:25:29.935955 containerd[1633]: time="2025-12-12T17:25:29.935831080Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 12 17:25:29.936039 containerd[1633]: time="2025-12-12T17:25:29.935986040Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 12 17:25:29.936149 containerd[1633]: time="2025-12-12T17:25:29.936125560Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 12 17:25:29.936285 containerd[1633]: time="2025-12-12T17:25:29.936262840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 12 17:25:29.936854 containerd[1633]: time="2025-12-12T17:25:29.936391960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 12 17:25:29.936901 containerd[1633]: time="2025-12-12T17:25:29.936863960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 12 17:25:29.936901 containerd[1633]: time="2025-12-12T17:25:29.936885200Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 12 17:25:29.936938 containerd[1633]: time="2025-12-12T17:25:29.936901360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 12 17:25:29.936938 containerd[1633]: time="2025-12-12T17:25:29.936913400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 12 17:25:29.937051 containerd[1633]: time="2025-12-12T17:25:29.937028960Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 12 17:25:29.937096 containerd[1633]: time="2025-12-12T17:25:29.937079320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 12 17:25:29.937125 containerd[1633]: time="2025-12-12T17:25:29.937097880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 12 17:25:29.937205 containerd[1633]: time="2025-12-12T17:25:29.937112000Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 12 17:25:29.937258 containerd[1633]: time="2025-12-12T17:25:29.937239720Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:25:29.937284 containerd[1633]: time="2025-12-12T17:25:29.937262960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:25:29.937284 containerd[1633]: time="2025-12-12T17:25:29.937273640Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:25:29.937326 containerd[1633]: time="2025-12-12T17:25:29.937283880Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:25:29.937346 containerd[1633]: time="2025-12-12T17:25:29.937291720Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 12 17:25:29.937366 containerd[1633]: time="2025-12-12T17:25:29.937349440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 12 17:25:29.937445 containerd[1633]: time="2025-12-12T17:25:29.937420560Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 12 17:25:29.937615 containerd[1633]: time="2025-12-12T17:25:29.937531720Z" level=info msg="runtime interface created" Dec 12 17:25:29.937640 containerd[1633]: time="2025-12-12T17:25:29.937612400Z" level=info msg="created NRI interface" Dec 12 17:25:29.937640 containerd[1633]: time="2025-12-12T17:25:29.937628400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 12 17:25:29.937681 containerd[1633]: time="2025-12-12T17:25:29.937642640Z" level=info msg="Connect containerd service" Dec 12 17:25:29.937681 containerd[1633]: time="2025-12-12T17:25:29.937667200Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 12 17:25:29.939092 
containerd[1633]: time="2025-12-12T17:25:29.938965480Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:25:30.004427 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 12 17:25:30.023010 extend-filesystems[1645]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 12 17:25:30.023010 extend-filesystems[1645]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 12 17:25:30.023010 extend-filesystems[1645]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 12 17:25:30.028359 extend-filesystems[1604]: Resized filesystem in /dev/vda9 Dec 12 17:25:30.024393 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 17:25:30.025460 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 17:25:30.033709 containerd[1633]: time="2025-12-12T17:25:30.033644600Z" level=info msg="Start subscribing containerd event" Dec 12 17:25:30.033780 containerd[1633]: time="2025-12-12T17:25:30.033724440Z" level=info msg="Start recovering state" Dec 12 17:25:30.033825 containerd[1633]: time="2025-12-12T17:25:30.033797040Z" level=info msg="Start event monitor" Dec 12 17:25:30.033849 containerd[1633]: time="2025-12-12T17:25:30.033824480Z" level=info msg="Start cni network conf syncer for default" Dec 12 17:25:30.033849 containerd[1633]: time="2025-12-12T17:25:30.033833560Z" level=info msg="Start streaming server" Dec 12 17:25:30.033849 containerd[1633]: time="2025-12-12T17:25:30.033844640Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 17:25:30.033905 containerd[1633]: time="2025-12-12T17:25:30.033852720Z" level=info msg="runtime interface starting up..." Dec 12 17:25:30.033905 containerd[1633]: time="2025-12-12T17:25:30.033858600Z" level=info msg="starting plugins..." 
Dec 12 17:25:30.033905 containerd[1633]: time="2025-12-12T17:25:30.033874360Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 17:25:30.034337 containerd[1633]: time="2025-12-12T17:25:30.034312120Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 17:25:30.034375 containerd[1633]: time="2025-12-12T17:25:30.034361320Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 12 17:25:30.034496 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 17:25:30.035457 containerd[1633]: time="2025-12-12T17:25:30.035430040Z" level=info msg="containerd successfully booted in 0.156486s" Dec 12 17:25:30.120049 tar[1623]: linux-arm64/README.md Dec 12 17:25:30.136637 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 12 17:25:30.641425 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:30.897426 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:31.280659 systemd-networkd[1510]: eth0: Gained IPv6LL Dec 12 17:25:31.282996 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 12 17:25:31.284893 systemd[1]: Reached target network-online.target - Network is Online. Dec 12 17:25:31.288384 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:31.290737 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 12 17:25:31.325869 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 12 17:25:31.358367 sshd_keygen[1634]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 17:25:31.378746 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 17:25:31.382208 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 12 17:25:31.403289 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 17:25:31.403545 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Dec 12 17:25:31.406166 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 17:25:31.429493 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 17:25:31.432961 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 17:25:31.435703 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 12 17:25:31.437062 systemd[1]: Reached target getty.target - Login Prompts. Dec 12 17:25:32.242226 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:32.246221 (kubelet)[1735]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:32.651447 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:32.781898 kubelet[1735]: E1212 17:25:32.781828 1735 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:32.784276 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:32.784432 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:32.784752 systemd[1]: kubelet.service: Consumed 773ms CPU time, 256.7M memory peak. 
Dec 12 17:25:32.908430 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:36.658443 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:36.673416 coreos-metadata[1598]: Dec 12 17:25:36.673 WARN failed to locate config-drive, using the metadata service API instead Dec 12 17:25:36.771958 coreos-metadata[1598]: Dec 12 17:25:36.771 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 12 17:25:36.918425 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:25:36.924933 coreos-metadata[1675]: Dec 12 17:25:36.924 WARN failed to locate config-drive, using the metadata service API instead Dec 12 17:25:36.937637 coreos-metadata[1675]: Dec 12 17:25:36.937 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 12 17:25:37.062684 coreos-metadata[1598]: Dec 12 17:25:37.062 INFO Fetch successful Dec 12 17:25:37.063140 coreos-metadata[1598]: Dec 12 17:25:37.063 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 12 17:25:37.150238 coreos-metadata[1675]: Dec 12 17:25:37.150 INFO Fetch successful Dec 12 17:25:37.150238 coreos-metadata[1675]: Dec 12 17:25:37.150 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 12 17:25:37.302344 coreos-metadata[1598]: Dec 12 17:25:37.302 INFO Fetch successful Dec 12 17:25:37.302344 coreos-metadata[1598]: Dec 12 17:25:37.302 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 12 17:25:37.388368 coreos-metadata[1675]: Dec 12 17:25:37.388 INFO Fetch successful Dec 12 17:25:37.390319 unknown[1675]: wrote ssh authorized keys file for user: core Dec 12 17:25:37.415023 update-ssh-keys[1754]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:25:37.415951 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 12 17:25:37.417875 systemd[1]: Finished sshkeys.service. 
Dec 12 17:25:37.431779 coreos-metadata[1598]: Dec 12 17:25:37.431 INFO Fetch successful Dec 12 17:25:37.431779 coreos-metadata[1598]: Dec 12 17:25:37.431 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 12 17:25:39.060578 coreos-metadata[1598]: Dec 12 17:25:39.060 INFO Fetch successful Dec 12 17:25:39.061064 coreos-metadata[1598]: Dec 12 17:25:39.060 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 12 17:25:39.186223 coreos-metadata[1598]: Dec 12 17:25:39.186 INFO Fetch successful Dec 12 17:25:39.186285 coreos-metadata[1598]: Dec 12 17:25:39.186 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 12 17:25:39.314196 coreos-metadata[1598]: Dec 12 17:25:39.314 INFO Fetch successful Dec 12 17:25:39.351456 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 12 17:25:39.351932 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 12 17:25:39.352078 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 17:25:39.355605 systemd[1]: Startup finished in 2.925s (kernel) + 12.146s (initrd) + 11.658s (userspace) = 26.730s. Dec 12 17:25:43.035094 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 17:25:43.037159 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:43.174709 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:25:43.178064 (kubelet)[1771]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:43.221909 kubelet[1771]: E1212 17:25:43.221650 1771 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:43.224926 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:43.225059 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:43.225562 systemd[1]: kubelet.service: Consumed 151ms CPU time, 107.5M memory peak. Dec 12 17:25:44.902612 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:25:44.903793 systemd[1]: Started sshd@0-10.0.17.31:22-147.75.109.163:52396.service - OpenSSH per-connection server daemon (147.75.109.163:52396). Dec 12 17:25:45.892118 sshd[1780]: Accepted publickey for core from 147.75.109.163 port 52396 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:45.894984 sshd-session[1780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:45.901091 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:25:45.902128 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:25:45.907932 systemd-logind[1612]: New session 1 of user core. Dec 12 17:25:45.922838 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 17:25:45.925468 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Dec 12 17:25:45.946091 (systemd)[1785]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:25:45.948298 systemd-logind[1612]: New session c1 of user core. Dec 12 17:25:46.066898 systemd[1785]: Queued start job for default target default.target. Dec 12 17:25:46.090502 systemd[1785]: Created slice app.slice - User Application Slice. Dec 12 17:25:46.090530 systemd[1785]: Reached target paths.target - Paths. Dec 12 17:25:46.090566 systemd[1785]: Reached target timers.target - Timers. Dec 12 17:25:46.091711 systemd[1785]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:25:46.101494 systemd[1785]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:25:46.101559 systemd[1785]: Reached target sockets.target - Sockets. Dec 12 17:25:46.101596 systemd[1785]: Reached target basic.target - Basic System. Dec 12 17:25:46.101623 systemd[1785]: Reached target default.target - Main User Target. Dec 12 17:25:46.101647 systemd[1785]: Startup finished in 147ms. Dec 12 17:25:46.101902 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:25:46.104012 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:25:46.776346 systemd[1]: Started sshd@1-10.0.17.31:22-147.75.109.163:52406.service - OpenSSH per-connection server daemon (147.75.109.163:52406). Dec 12 17:25:47.748368 sshd[1796]: Accepted publickey for core from 147.75.109.163 port 52406 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:47.749636 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:47.754157 systemd-logind[1612]: New session 2 of user core. Dec 12 17:25:47.765586 systemd[1]: Started session-2.scope - Session 2 of User core. 
Dec 12 17:25:48.411307 sshd[1799]: Connection closed by 147.75.109.163 port 52406 Dec 12 17:25:48.411810 sshd-session[1796]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:48.415098 systemd[1]: sshd@1-10.0.17.31:22-147.75.109.163:52406.service: Deactivated successfully. Dec 12 17:25:48.416590 systemd[1]: session-2.scope: Deactivated successfully. Dec 12 17:25:48.417544 systemd-logind[1612]: Session 2 logged out. Waiting for processes to exit. Dec 12 17:25:48.418993 systemd-logind[1612]: Removed session 2. Dec 12 17:25:48.580385 systemd[1]: Started sshd@2-10.0.17.31:22-147.75.109.163:52422.service - OpenSSH per-connection server daemon (147.75.109.163:52422). Dec 12 17:25:49.568337 sshd[1805]: Accepted publickey for core from 147.75.109.163 port 52422 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:49.569742 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:49.573994 systemd-logind[1612]: New session 3 of user core. Dec 12 17:25:49.582733 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 17:25:50.233539 sshd[1808]: Connection closed by 147.75.109.163 port 52422 Dec 12 17:25:50.233749 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:50.237192 systemd[1]: sshd@2-10.0.17.31:22-147.75.109.163:52422.service: Deactivated successfully. Dec 12 17:25:50.238724 systemd[1]: session-3.scope: Deactivated successfully. Dec 12 17:25:50.240403 systemd-logind[1612]: Session 3 logged out. Waiting for processes to exit. Dec 12 17:25:50.241668 systemd-logind[1612]: Removed session 3. Dec 12 17:25:50.398503 systemd[1]: Started sshd@3-10.0.17.31:22-147.75.109.163:52436.service - OpenSSH per-connection server daemon (147.75.109.163:52436). 
Dec 12 17:25:51.363479 sshd[1814]: Accepted publickey for core from 147.75.109.163 port 52436 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:51.364808 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:51.368335 systemd-logind[1612]: New session 4 of user core. Dec 12 17:25:51.377545 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 17:25:52.024911 sshd[1817]: Connection closed by 147.75.109.163 port 52436 Dec 12 17:25:52.025487 sshd-session[1814]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:52.028785 systemd[1]: sshd@3-10.0.17.31:22-147.75.109.163:52436.service: Deactivated successfully. Dec 12 17:25:52.031800 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:25:52.032443 systemd-logind[1612]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:25:52.033687 systemd-logind[1612]: Removed session 4. Dec 12 17:25:52.241647 systemd[1]: Started sshd@4-10.0.17.31:22-147.75.109.163:52452.service - OpenSSH per-connection server daemon (147.75.109.163:52452). Dec 12 17:25:53.382434 sshd[1823]: Accepted publickey for core from 147.75.109.163 port 52452 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:53.383731 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:53.384578 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 17:25:53.386134 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:53.388837 systemd-logind[1612]: New session 5 of user core. Dec 12 17:25:53.400824 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 17:25:53.458324 chronyd[1596]: Selected source PHC0 Dec 12 17:25:53.521483 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:25:53.524814 (kubelet)[1835]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:53.554229 kubelet[1835]: E1212 17:25:53.554177 1835 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:53.556455 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:53.556572 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:53.556824 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.1M memory peak. Dec 12 17:25:53.937537 sudo[1843]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:25:53.937768 sudo[1843]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:53.958104 sudo[1843]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:54.125459 sshd[1829]: Connection closed by 147.75.109.163 port 52452 Dec 12 17:25:54.125438 sshd-session[1823]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:54.129031 systemd-logind[1612]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:25:54.129708 systemd[1]: sshd@4-10.0.17.31:22-147.75.109.163:52452.service: Deactivated successfully. Dec 12 17:25:54.131181 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:25:54.133541 systemd-logind[1612]: Removed session 5. Dec 12 17:25:54.270386 systemd[1]: Started sshd@5-10.0.17.31:22-147.75.109.163:36138.service - OpenSSH per-connection server daemon (147.75.109.163:36138). 
Dec 12 17:25:55.221415 sshd[1849]: Accepted publickey for core from 147.75.109.163 port 36138 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:55.222744 sshd-session[1849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:55.226696 systemd-logind[1612]: New session 6 of user core. Dec 12 17:25:55.240585 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 12 17:25:55.736314 sudo[1854]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 17:25:55.736602 sudo[1854]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:55.741075 sudo[1854]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:55.745903 sudo[1853]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 17:25:55.746150 sudo[1853]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:55.754636 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:25:55.793010 augenrules[1876]: No rules Dec 12 17:25:55.793766 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:25:55.793961 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:25:55.795859 sudo[1853]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:55.953536 sshd[1852]: Connection closed by 147.75.109.163 port 36138 Dec 12 17:25:55.953995 sshd-session[1849]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:55.957350 systemd[1]: sshd@5-10.0.17.31:22-147.75.109.163:36138.service: Deactivated successfully. Dec 12 17:25:55.958832 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:25:55.959510 systemd-logind[1612]: Session 6 logged out. Waiting for processes to exit. Dec 12 17:25:55.960510 systemd-logind[1612]: Removed session 6. 
Dec 12 17:25:56.118591 systemd[1]: Started sshd@6-10.0.17.31:22-147.75.109.163:36148.service - OpenSSH per-connection server daemon (147.75.109.163:36148). Dec 12 17:25:57.095367 sshd[1885]: Accepted publickey for core from 147.75.109.163 port 36148 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:25:57.096724 sshd-session[1885]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:57.100442 systemd-logind[1612]: New session 7 of user core. Dec 12 17:25:57.121583 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:25:57.604962 sudo[1889]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 17:25:57.605216 sudo[1889]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:57.942177 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 17:25:57.968035 (dockerd)[1911]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 17:25:58.193393 dockerd[1911]: time="2025-12-12T17:25:58.193263913Z" level=info msg="Starting up" Dec 12 17:25:58.194227 dockerd[1911]: time="2025-12-12T17:25:58.194207675Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 17:25:58.204140 dockerd[1911]: time="2025-12-12T17:25:58.204090621Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 17:25:58.248236 dockerd[1911]: time="2025-12-12T17:25:58.248185215Z" level=info msg="Loading containers: start." Dec 12 17:25:58.256463 kernel: Initializing XFRM netlink socket Dec 12 17:25:58.465819 systemd-networkd[1510]: docker0: Link UP Dec 12 17:25:58.471671 dockerd[1911]: time="2025-12-12T17:25:58.471630836Z" level=info msg="Loading containers: done." 
Dec 12 17:25:58.483791 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2401625968-merged.mount: Deactivated successfully. Dec 12 17:25:58.486279 dockerd[1911]: time="2025-12-12T17:25:58.486225954Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 17:25:58.486375 dockerd[1911]: time="2025-12-12T17:25:58.486319714Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 17:25:58.486454 dockerd[1911]: time="2025-12-12T17:25:58.486436354Z" level=info msg="Initializing buildkit" Dec 12 17:25:58.514274 dockerd[1911]: time="2025-12-12T17:25:58.514223106Z" level=info msg="Completed buildkit initialization" Dec 12 17:25:58.519094 dockerd[1911]: time="2025-12-12T17:25:58.519055839Z" level=info msg="Daemon has completed initialization" Dec 12 17:25:58.519366 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 17:25:58.520100 dockerd[1911]: time="2025-12-12T17:25:58.519160799Z" level=info msg="API listen on /run/docker.sock" Dec 12 17:25:59.837109 containerd[1633]: time="2025-12-12T17:25:59.837063103Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Dec 12 17:26:00.494917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3453688001.mount: Deactivated successfully. 
Dec 12 17:26:01.296446 containerd[1633]: time="2025-12-12T17:26:01.295893452Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:01.297155 containerd[1633]: time="2025-12-12T17:26:01.296848574Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=26432057" Dec 12 17:26:01.297874 containerd[1633]: time="2025-12-12T17:26:01.297845897Z" level=info msg="ImageCreate event name:\"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:01.301334 containerd[1633]: time="2025-12-12T17:26:01.301306586Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:01.302322 containerd[1633]: time="2025-12-12T17:26:01.302291389Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"26428558\" in 1.465188526s" Dec 12 17:26:01.302371 containerd[1633]: time="2025-12-12T17:26:01.302327109Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\"" Dec 12 17:26:01.303037 containerd[1633]: time="2025-12-12T17:26:01.303013151Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Dec 12 17:26:02.433616 containerd[1633]: time="2025-12-12T17:26:02.433548187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:02.435372 containerd[1633]: time="2025-12-12T17:26:02.435332153Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=22618975" Dec 12 17:26:02.438065 containerd[1633]: time="2025-12-12T17:26:02.437989122Z" level=info msg="ImageCreate event name:\"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:02.441378 containerd[1633]: time="2025-12-12T17:26:02.441328253Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:02.442783 containerd[1633]: time="2025-12-12T17:26:02.442743818Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"24203439\" in 1.139697827s" Dec 12 17:26:02.442783 containerd[1633]: time="2025-12-12T17:26:02.442779418Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\"" Dec 12 17:26:02.443197 containerd[1633]: time="2025-12-12T17:26:02.443166779Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Dec 12 17:26:03.371828 containerd[1633]: time="2025-12-12T17:26:03.371716172Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:03.373500 containerd[1633]: 
time="2025-12-12T17:26:03.373456336Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=17618456" Dec 12 17:26:03.374492 containerd[1633]: time="2025-12-12T17:26:03.374449659Z" level=info msg="ImageCreate event name:\"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:03.376999 containerd[1633]: time="2025-12-12T17:26:03.376952785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:03.378428 containerd[1633]: time="2025-12-12T17:26:03.377890548Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"19202938\" in 934.689529ms" Dec 12 17:26:03.378428 containerd[1633]: time="2025-12-12T17:26:03.377923628Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\"" Dec 12 17:26:03.378773 containerd[1633]: time="2025-12-12T17:26:03.378739230Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Dec 12 17:26:03.638492 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 17:26:03.639894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:26:03.795068 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:26:03.798783 (kubelet)[2202]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:26:03.837236 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:26:03.880641 kubelet[2202]: E1212 17:26:03.835492 2202 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:26:03.837345 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:26:03.837639 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.9M memory peak. Dec 12 17:26:04.372569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3918069825.mount: Deactivated successfully. Dec 12 17:26:04.596249 containerd[1633]: time="2025-12-12T17:26:04.596196233Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:04.597745 containerd[1633]: time="2025-12-12T17:26:04.597485196Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=27561825" Dec 12 17:26:04.598789 containerd[1633]: time="2025-12-12T17:26:04.598759279Z" level=info msg="ImageCreate event name:\"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:04.601848 containerd[1633]: time="2025-12-12T17:26:04.601802367Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:04.602548 containerd[1633]: time="2025-12-12T17:26:04.602498049Z" 
level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"27560818\" in 1.223653459s" Dec 12 17:26:04.602548 containerd[1633]: time="2025-12-12T17:26:04.602529009Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\"" Dec 12 17:26:04.603021 containerd[1633]: time="2025-12-12T17:26:04.602997090Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Dec 12 17:26:05.340441 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount659128449.mount: Deactivated successfully. Dec 12 17:26:05.882026 containerd[1633]: time="2025-12-12T17:26:05.881959093Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:05.883191 containerd[1633]: time="2025-12-12T17:26:05.883144616Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714" Dec 12 17:26:05.884061 containerd[1633]: time="2025-12-12T17:26:05.884027178Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:05.887041 containerd[1633]: time="2025-12-12T17:26:05.886994626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:05.888079 containerd[1633]: time="2025-12-12T17:26:05.888032948Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with 
image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.285003698s" Dec 12 17:26:05.888079 containerd[1633]: time="2025-12-12T17:26:05.888073188Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Dec 12 17:26:05.888591 containerd[1633]: time="2025-12-12T17:26:05.888553470Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 12 17:26:06.417762 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1306882229.mount: Deactivated successfully. Dec 12 17:26:06.427918 containerd[1633]: time="2025-12-12T17:26:06.427788710Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:26:06.428962 containerd[1633]: time="2025-12-12T17:26:06.428917793Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Dec 12 17:26:06.430514 containerd[1633]: time="2025-12-12T17:26:06.430467037Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:26:06.432805 containerd[1633]: time="2025-12-12T17:26:06.432760923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:26:06.433419 containerd[1633]: time="2025-12-12T17:26:06.433378165Z" level=info 
msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 544.795175ms" Dec 12 17:26:06.433458 containerd[1633]: time="2025-12-12T17:26:06.433428405Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Dec 12 17:26:06.433938 containerd[1633]: time="2025-12-12T17:26:06.433827046Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Dec 12 17:26:07.006145 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1388305505.mount: Deactivated successfully. Dec 12 17:26:08.160979 containerd[1633]: time="2025-12-12T17:26:08.160920853Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:08.162416 containerd[1633]: time="2025-12-12T17:26:08.162130736Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943239" Dec 12 17:26:08.163902 containerd[1633]: time="2025-12-12T17:26:08.163852340Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:08.167196 containerd[1633]: time="2025-12-12T17:26:08.167128469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:08.168447 containerd[1633]: time="2025-12-12T17:26:08.168309112Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag 
\"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 1.734452866s" Dec 12 17:26:08.168447 containerd[1633]: time="2025-12-12T17:26:08.168343552Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Dec 12 17:26:13.087538 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:26:13.087695 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.9M memory peak. Dec 12 17:26:13.089577 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:26:13.110968 systemd[1]: Reload requested from client PID 2360 ('systemctl') (unit session-7.scope)... Dec 12 17:26:13.110989 systemd[1]: Reloading... Dec 12 17:26:13.192448 zram_generator::config[2403]: No configuration found. Dec 12 17:26:13.355254 systemd[1]: Reloading finished in 243 ms. Dec 12 17:26:13.413244 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 17:26:13.413319 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 12 17:26:13.413564 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:26:13.413611 systemd[1]: kubelet.service: Consumed 90ms CPU time, 95.1M memory peak. Dec 12 17:26:13.415076 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:26:13.520677 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:26:13.524272 (kubelet)[2451]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:26:13.561561 kubelet[2451]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:26:13.561561 kubelet[2451]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:26:13.561561 kubelet[2451]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:26:13.561881 kubelet[2451]: I1212 17:26:13.561614 2451 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:26:14.355975 kubelet[2451]: I1212 17:26:14.355885 2451 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 17:26:14.355975 kubelet[2451]: I1212 17:26:14.355920 2451 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:26:14.356258 kubelet[2451]: I1212 17:26:14.356181 2451 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 17:26:14.388581 kubelet[2451]: E1212 17:26:14.388537 2451 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.17.31:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.17.31:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:26:14.391758 kubelet[2451]: I1212 17:26:14.391024 2451 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:26:14.397783 kubelet[2451]: I1212 17:26:14.397764 2451 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:26:14.400703 kubelet[2451]: I1212 
17:26:14.400680 2451 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 12 17:26:14.401818 kubelet[2451]: I1212 17:26:14.401767 2451 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:26:14.401979 kubelet[2451]: I1212 17:26:14.401815 2451 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-2-0ba9591bbe","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 
17:26:14.402080 kubelet[2451]: I1212 17:26:14.402069 2451 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:26:14.402080 kubelet[2451]: I1212 17:26:14.402081 2451 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 17:26:14.402292 kubelet[2451]: I1212 17:26:14.402280 2451 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:26:14.412161 kubelet[2451]: I1212 17:26:14.412094 2451 kubelet.go:446] "Attempting to sync node with API server" Dec 12 17:26:14.412161 kubelet[2451]: I1212 17:26:14.412127 2451 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:26:14.412161 kubelet[2451]: I1212 17:26:14.412151 2451 kubelet.go:352] "Adding apiserver pod source" Dec 12 17:26:14.412161 kubelet[2451]: I1212 17:26:14.412161 2451 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:26:14.416949 kubelet[2451]: W1212 17:26:14.416892 2451 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.17.31:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-2-0ba9591bbe&limit=500&resourceVersion=0": dial tcp 10.0.17.31:6443: connect: connection refused Dec 12 17:26:14.417012 kubelet[2451]: E1212 17:26:14.416962 2451 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.17.31:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-2-0ba9591bbe&limit=500&resourceVersion=0\": dial tcp 10.0.17.31:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:26:14.418194 kubelet[2451]: I1212 17:26:14.418055 2451 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:26:14.418743 kubelet[2451]: W1212 17:26:14.418703 2451 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://10.0.17.31:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.17.31:6443: connect: connection refused Dec 12 17:26:14.418785 kubelet[2451]: E1212 17:26:14.418751 2451 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.17.31:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.17.31:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:26:14.418835 kubelet[2451]: I1212 17:26:14.418809 2451 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 17:26:14.418999 kubelet[2451]: W1212 17:26:14.418965 2451 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 12 17:26:14.420111 kubelet[2451]: I1212 17:26:14.420082 2451 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 17:26:14.420172 kubelet[2451]: I1212 17:26:14.420130 2451 server.go:1287] "Started kubelet" Dec 12 17:26:14.421702 kubelet[2451]: I1212 17:26:14.421679 2451 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:26:14.423126 kubelet[2451]: I1212 17:26:14.423050 2451 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:26:14.424155 kubelet[2451]: I1212 17:26:14.424133 2451 server.go:479] "Adding debug handlers to kubelet server" Dec 12 17:26:14.426765 kubelet[2451]: E1212 17:26:14.426748 2451 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:26:14.427495 kubelet[2451]: E1212 17:26:14.427225 2451 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" Dec 12 17:26:14.427495 kubelet[2451]: I1212 17:26:14.427264 2451 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 17:26:14.427495 kubelet[2451]: I1212 17:26:14.427251 2451 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:26:14.427495 kubelet[2451]: I1212 17:26:14.427438 2451 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 17:26:14.427629 kubelet[2451]: I1212 17:26:14.427503 2451 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:26:14.427629 kubelet[2451]: I1212 17:26:14.427557 2451 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:26:14.427923 kubelet[2451]: E1212 17:26:14.427449 2451 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.17.31:6443/api/v1/namespaces/default/events\": dial tcp 10.0.17.31:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-2-0ba9591bbe.188087ce62d0de0a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-2-0ba9591bbe,UID:ci-4459-2-2-2-0ba9591bbe,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-2-0ba9591bbe,},FirstTimestamp:2025-12-12 17:26:14.420102666 +0000 UTC m=+0.893003761,LastTimestamp:2025-12-12 17:26:14.420102666 +0000 UTC m=+0.893003761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-2-0ba9591bbe,}" Dec 12 17:26:14.428122 kubelet[2451]: I1212 17:26:14.428106 2451 reconciler.go:26] "Reconciler: start to sync state" Dec 12 17:26:14.428180 kubelet[2451]: I1212 17:26:14.428121 2451 factory.go:221] Registration of the systemd container factory successfully Dec 12 17:26:14.428311 kubelet[2451]: I1212 17:26:14.428293 2451 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:26:14.429628 kubelet[2451]: W1212 17:26:14.429583 2451 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.17.31:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.17.31:6443: connect: connection refused Dec 12 17:26:14.429703 kubelet[2451]: E1212 17:26:14.429642 2451 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.17.31:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.17.31:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:26:14.429735 kubelet[2451]: E1212 17:26:14.429707 2451 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.17.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-2-0ba9591bbe?timeout=10s\": dial tcp 10.0.17.31:6443: connect: connection refused" interval="200ms" Dec 12 17:26:14.430727 kubelet[2451]: I1212 17:26:14.430702 2451 factory.go:221] Registration of the containerd container factory successfully Dec 12 17:26:14.442197 kubelet[2451]: I1212 17:26:14.442158 2451 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:26:14.442197 kubelet[2451]: I1212 17:26:14.442178 2451 cpu_manager.go:222] 
"Reconciling" reconcilePeriod="10s" Dec 12 17:26:14.442197 kubelet[2451]: I1212 17:26:14.442199 2451 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:26:14.442358 kubelet[2451]: I1212 17:26:14.442259 2451 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 17:26:14.443748 kubelet[2451]: I1212 17:26:14.443709 2451 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 17:26:14.443748 kubelet[2451]: I1212 17:26:14.443738 2451 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 17:26:14.443930 kubelet[2451]: I1212 17:26:14.443758 2451 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 12 17:26:14.443930 kubelet[2451]: I1212 17:26:14.443766 2451 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 17:26:14.443930 kubelet[2451]: E1212 17:26:14.443803 2451 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:26:14.444380 kubelet[2451]: W1212 17:26:14.444358 2451 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.17.31:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.17.31:6443: connect: connection refused Dec 12 17:26:14.444642 kubelet[2451]: E1212 17:26:14.444597 2451 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.17.31:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.17.31:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:26:14.445182 kubelet[2451]: I1212 17:26:14.445147 2451 policy_none.go:49] "None policy: Start" Dec 12 17:26:14.445182 kubelet[2451]: I1212 17:26:14.445177 2451 memory_manager.go:186] 
"Starting memorymanager" policy="None" Dec 12 17:26:14.445256 kubelet[2451]: I1212 17:26:14.445191 2451 state_mem.go:35] "Initializing new in-memory state store" Dec 12 17:26:14.451286 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 17:26:14.466538 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 17:26:14.469695 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 12 17:26:14.492309 kubelet[2451]: I1212 17:26:14.492261 2451 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 17:26:14.492626 kubelet[2451]: I1212 17:26:14.492523 2451 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:26:14.492626 kubelet[2451]: I1212 17:26:14.492544 2451 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:26:14.492844 kubelet[2451]: I1212 17:26:14.492804 2451 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:26:14.494869 kubelet[2451]: E1212 17:26:14.494805 2451 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:26:14.494869 kubelet[2451]: E1212 17:26:14.494850 2451 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-2-0ba9591bbe\" not found" Dec 12 17:26:14.552988 systemd[1]: Created slice kubepods-burstable-pode011b1b97e519516f532a5ccc7150459.slice - libcontainer container kubepods-burstable-pode011b1b97e519516f532a5ccc7150459.slice. 
Dec 12 17:26:14.572761 kubelet[2451]: E1212 17:26:14.572706 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.575522 systemd[1]: Created slice kubepods-burstable-pod9499756627e45f14c10a9abe956015b7.slice - libcontainer container kubepods-burstable-pod9499756627e45f14c10a9abe956015b7.slice. Dec 12 17:26:14.577283 kubelet[2451]: E1212 17:26:14.577257 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.579120 systemd[1]: Created slice kubepods-burstable-pod1c0158734360a056bfe1160eca0b990c.slice - libcontainer container kubepods-burstable-pod1c0158734360a056bfe1160eca0b990c.slice. Dec 12 17:26:14.580802 kubelet[2451]: E1212 17:26:14.580777 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.594667 kubelet[2451]: I1212 17:26:14.594642 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.595133 kubelet[2451]: E1212 17:26:14.595107 2451 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.17.31:6443/api/v1/nodes\": dial tcp 10.0.17.31:6443: connect: connection refused" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.625883 update_engine[1615]: I20251212 17:26:14.625624 1615 update_attempter.cc:509] Updating boot flags... 
Dec 12 17:26:14.629667 kubelet[2451]: I1212 17:26:14.629514 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e011b1b97e519516f532a5ccc7150459-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" (UID: \"e011b1b97e519516f532a5ccc7150459\") " pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629667 kubelet[2451]: I1212 17:26:14.629550 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629667 kubelet[2451]: I1212 17:26:14.629571 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629667 kubelet[2451]: I1212 17:26:14.629587 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629667 kubelet[2451]: I1212 17:26:14.629605 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-usr-share-ca-certificates\") 
pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629859 kubelet[2451]: I1212 17:26:14.629623 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e011b1b97e519516f532a5ccc7150459-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" (UID: \"e011b1b97e519516f532a5ccc7150459\") " pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629859 kubelet[2451]: I1212 17:26:14.629640 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e011b1b97e519516f532a5ccc7150459-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" (UID: \"e011b1b97e519516f532a5ccc7150459\") " pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629859 kubelet[2451]: I1212 17:26:14.629656 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.629859 kubelet[2451]: I1212 17:26:14.629688 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9499756627e45f14c10a9abe956015b7-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-2-0ba9591bbe\" (UID: \"9499756627e45f14c10a9abe956015b7\") " pod="kube-system/kube-scheduler-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.630321 kubelet[2451]: E1212 17:26:14.630288 2451 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://10.0.17.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-2-0ba9591bbe?timeout=10s\": dial tcp 10.0.17.31:6443: connect: connection refused" interval="400ms" Dec 12 17:26:14.797439 kubelet[2451]: I1212 17:26:14.797287 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.797672 kubelet[2451]: E1212 17:26:14.797643 2451 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.17.31:6443/api/v1/nodes\": dial tcp 10.0.17.31:6443: connect: connection refused" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:14.874549 containerd[1633]: time="2025-12-12T17:26:14.874509046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-2-0ba9591bbe,Uid:e011b1b97e519516f532a5ccc7150459,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:14.878240 containerd[1633]: time="2025-12-12T17:26:14.878099936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-2-0ba9591bbe,Uid:9499756627e45f14c10a9abe956015b7,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:14.881866 containerd[1633]: time="2025-12-12T17:26:14.881812625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-2-0ba9591bbe,Uid:1c0158734360a056bfe1160eca0b990c,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:14.911478 containerd[1633]: time="2025-12-12T17:26:14.911426942Z" level=info msg="connecting to shim 3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229" address="unix:///run/containerd/s/bcf42dcba21f88cf4b4a0338396bd841ce83b965ea129123d9fcb41bb3394aef" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:14.913666 containerd[1633]: time="2025-12-12T17:26:14.913554468Z" level=info msg="connecting to shim cc51f85091830731dc1902a41d17608a89b844291d38bc2257753bbcf1688dcd" address="unix:///run/containerd/s/2ecb11bcc9785d1397b79c0d42bfc4d59c8c08dd125251f06e709d026f99c0b5" 
namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:14.931435 containerd[1633]: time="2025-12-12T17:26:14.931352474Z" level=info msg="connecting to shim 5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7" address="unix:///run/containerd/s/48a06d9e8ff417ba1af96f191f63fd138aa890474e05458aff18524216c5a883" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:14.943621 systemd[1]: Started cri-containerd-3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229.scope - libcontainer container 3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229. Dec 12 17:26:14.944703 systemd[1]: Started cri-containerd-cc51f85091830731dc1902a41d17608a89b844291d38bc2257753bbcf1688dcd.scope - libcontainer container cc51f85091830731dc1902a41d17608a89b844291d38bc2257753bbcf1688dcd. Dec 12 17:26:14.950293 systemd[1]: Started cri-containerd-5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7.scope - libcontainer container 5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7. 
Dec 12 17:26:14.990001 containerd[1633]: time="2025-12-12T17:26:14.989961666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-2-0ba9591bbe,Uid:e011b1b97e519516f532a5ccc7150459,Namespace:kube-system,Attempt:0,} returns sandbox id \"cc51f85091830731dc1902a41d17608a89b844291d38bc2257753bbcf1688dcd\"" Dec 12 17:26:14.993199 containerd[1633]: time="2025-12-12T17:26:14.992592233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-2-0ba9591bbe,Uid:9499756627e45f14c10a9abe956015b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229\"" Dec 12 17:26:14.994027 containerd[1633]: time="2025-12-12T17:26:14.993999077Z" level=info msg="CreateContainer within sandbox \"cc51f85091830731dc1902a41d17608a89b844291d38bc2257753bbcf1688dcd\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 17:26:14.995388 containerd[1633]: time="2025-12-12T17:26:14.995351920Z" level=info msg="CreateContainer within sandbox \"3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 17:26:15.007207 containerd[1633]: time="2025-12-12T17:26:15.007159831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-2-0ba9591bbe,Uid:1c0158734360a056bfe1160eca0b990c,Namespace:kube-system,Attempt:0,} returns sandbox id \"5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7\"" Dec 12 17:26:15.011151 containerd[1633]: time="2025-12-12T17:26:15.011106321Z" level=info msg="CreateContainer within sandbox \"5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 17:26:15.012977 containerd[1633]: time="2025-12-12T17:26:15.012908766Z" level=info msg="Container 5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc: CDI devices 
from CRI Config.CDIDevices: []" Dec 12 17:26:15.019377 containerd[1633]: time="2025-12-12T17:26:15.019257542Z" level=info msg="Container 90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:15.023026 containerd[1633]: time="2025-12-12T17:26:15.022985392Z" level=info msg="CreateContainer within sandbox \"cc51f85091830731dc1902a41d17608a89b844291d38bc2257753bbcf1688dcd\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc\"" Dec 12 17:26:15.024166 containerd[1633]: time="2025-12-12T17:26:15.023850594Z" level=info msg="StartContainer for \"5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc\"" Dec 12 17:26:15.025067 containerd[1633]: time="2025-12-12T17:26:15.025010717Z" level=info msg="Container 82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:15.025343 containerd[1633]: time="2025-12-12T17:26:15.025298798Z" level=info msg="connecting to shim 5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc" address="unix:///run/containerd/s/2ecb11bcc9785d1397b79c0d42bfc4d59c8c08dd125251f06e709d026f99c0b5" protocol=ttrpc version=3 Dec 12 17:26:15.031536 containerd[1633]: time="2025-12-12T17:26:15.031478334Z" level=info msg="CreateContainer within sandbox \"3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624\"" Dec 12 17:26:15.031856 kubelet[2451]: E1212 17:26:15.031648 2451 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.17.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-2-0ba9591bbe?timeout=10s\": dial tcp 10.0.17.31:6443: connect: connection refused" interval="800ms" Dec 12 17:26:15.032094 
containerd[1633]: time="2025-12-12T17:26:15.032053976Z" level=info msg="StartContainer for \"90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624\"" Dec 12 17:26:15.033246 containerd[1633]: time="2025-12-12T17:26:15.033213859Z" level=info msg="connecting to shim 90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624" address="unix:///run/containerd/s/bcf42dcba21f88cf4b4a0338396bd841ce83b965ea129123d9fcb41bb3394aef" protocol=ttrpc version=3 Dec 12 17:26:15.035807 containerd[1633]: time="2025-12-12T17:26:15.035773065Z" level=info msg="CreateContainer within sandbox \"5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7\"" Dec 12 17:26:15.036393 containerd[1633]: time="2025-12-12T17:26:15.036368547Z" level=info msg="StartContainer for \"82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7\"" Dec 12 17:26:15.039177 containerd[1633]: time="2025-12-12T17:26:15.038558273Z" level=info msg="connecting to shim 82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7" address="unix:///run/containerd/s/48a06d9e8ff417ba1af96f191f63fd138aa890474e05458aff18524216c5a883" protocol=ttrpc version=3 Dec 12 17:26:15.044583 systemd[1]: Started cri-containerd-5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc.scope - libcontainer container 5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc. Dec 12 17:26:15.057548 systemd[1]: Started cri-containerd-90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624.scope - libcontainer container 90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624. Dec 12 17:26:15.060714 systemd[1]: Started cri-containerd-82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7.scope - libcontainer container 82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7. 
Dec 12 17:26:15.098475 containerd[1633]: time="2025-12-12T17:26:15.098356308Z" level=info msg="StartContainer for \"5c7095e3eb1fc0349d6e5790c423a5c189b07abe9a4a834276063642106e70dc\" returns successfully" Dec 12 17:26:15.114779 containerd[1633]: time="2025-12-12T17:26:15.114678710Z" level=info msg="StartContainer for \"90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624\" returns successfully" Dec 12 17:26:15.117782 containerd[1633]: time="2025-12-12T17:26:15.117745118Z" level=info msg="StartContainer for \"82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7\" returns successfully" Dec 12 17:26:15.199850 kubelet[2451]: I1212 17:26:15.199821 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:15.200167 kubelet[2451]: E1212 17:26:15.200113 2451 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.17.31:6443/api/v1/nodes\": dial tcp 10.0.17.31:6443: connect: connection refused" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:15.452553 kubelet[2451]: E1212 17:26:15.452002 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:15.453682 kubelet[2451]: E1212 17:26:15.453661 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:15.456527 kubelet[2451]: E1212 17:26:15.456505 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.001752 kubelet[2451]: I1212 17:26:16.001722 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.458762 kubelet[2451]: E1212 17:26:16.458727 2451 
kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.459668 kubelet[2451]: E1212 17:26:16.459644 2451 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.811664 kubelet[2451]: E1212 17:26:16.811417 2451 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-2-0ba9591bbe\" not found" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.853515 kubelet[2451]: I1212 17:26:16.853456 2451 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.929953 kubelet[2451]: I1212 17:26:16.929901 2451 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.940049 kubelet[2451]: E1212 17:26:16.940001 2451 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.940049 kubelet[2451]: I1212 17:26:16.940036 2451 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.942406 kubelet[2451]: E1212 17:26:16.942342 2451 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.942406 kubelet[2451]: I1212 17:26:16.942374 2451 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:16.944426 
kubelet[2451]: E1212 17:26:16.944386 2451 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-2-0ba9591bbe\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:17.414808 kubelet[2451]: I1212 17:26:17.414718 2451 apiserver.go:52] "Watching apiserver" Dec 12 17:26:17.427841 kubelet[2451]: I1212 17:26:17.427751 2451 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 17:26:18.756935 kubelet[2451]: I1212 17:26:18.756858 2451 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:18.972622 systemd[1]: Reload requested from client PID 2743 ('systemctl') (unit session-7.scope)... Dec 12 17:26:18.972635 systemd[1]: Reloading... Dec 12 17:26:19.034427 zram_generator::config[2789]: No configuration found. Dec 12 17:26:19.209651 systemd[1]: Reloading finished in 236 ms. Dec 12 17:26:19.238170 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:26:19.254555 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 17:26:19.254787 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:26:19.254851 systemd[1]: kubelet.service: Consumed 1.269s CPU time, 130.7M memory peak. Dec 12 17:26:19.257676 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:26:19.416349 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:26:19.420191 (kubelet)[2831]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:26:19.472953 kubelet[2831]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:26:19.472953 kubelet[2831]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:26:19.472953 kubelet[2831]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:26:19.473392 kubelet[2831]: I1212 17:26:19.473278 2831 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:26:19.481571 kubelet[2831]: I1212 17:26:19.481536 2831 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 17:26:19.483427 kubelet[2831]: I1212 17:26:19.481697 2831 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:26:19.483427 kubelet[2831]: I1212 17:26:19.481974 2831 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 17:26:19.483427 kubelet[2831]: I1212 17:26:19.483171 2831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 12 17:26:19.485582 kubelet[2831]: I1212 17:26:19.485547 2831 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:26:19.489150 kubelet[2831]: I1212 17:26:19.489127 2831 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:26:19.492237 kubelet[2831]: I1212 17:26:19.492214 2831 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 12 17:26:19.492440 kubelet[2831]: I1212 17:26:19.492386 2831 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:26:19.492632 kubelet[2831]: I1212 17:26:19.492437 2831 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-2-0ba9591bbe","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:26:19.492632 kubelet[2831]: I1212 17:26:19.492624 2831 topology_manager.go:138] "Creating topology manager 
with none policy" Dec 12 17:26:19.492632 kubelet[2831]: I1212 17:26:19.492634 2831 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 17:26:19.492759 kubelet[2831]: I1212 17:26:19.492675 2831 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:26:19.492839 kubelet[2831]: I1212 17:26:19.492827 2831 kubelet.go:446] "Attempting to sync node with API server" Dec 12 17:26:19.492870 kubelet[2831]: I1212 17:26:19.492843 2831 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:26:19.492870 kubelet[2831]: I1212 17:26:19.492863 2831 kubelet.go:352] "Adding apiserver pod source" Dec 12 17:26:19.492922 kubelet[2831]: I1212 17:26:19.492876 2831 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:26:19.493896 kubelet[2831]: I1212 17:26:19.493438 2831 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:26:19.494491 kubelet[2831]: I1212 17:26:19.494459 2831 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 17:26:19.495255 kubelet[2831]: I1212 17:26:19.495226 2831 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 17:26:19.495333 kubelet[2831]: I1212 17:26:19.495275 2831 server.go:1287] "Started kubelet" Dec 12 17:26:19.498142 kubelet[2831]: I1212 17:26:19.498101 2831 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:26:19.502377 kubelet[2831]: I1212 17:26:19.502321 2831 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:26:19.502787 kubelet[2831]: I1212 17:26:19.502741 2831 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:26:19.503152 kubelet[2831]: I1212 17:26:19.503130 2831 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 
17:26:19.504291 kubelet[2831]: I1212 17:26:19.504262 2831 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:26:19.506662 kubelet[2831]: I1212 17:26:19.506635 2831 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 17:26:19.506868 kubelet[2831]: E1212 17:26:19.506837 2831 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-2-0ba9591bbe\" not found" Dec 12 17:26:19.508963 kubelet[2831]: I1212 17:26:19.508935 2831 server.go:479] "Adding debug handlers to kubelet server" Dec 12 17:26:19.513516 kubelet[2831]: I1212 17:26:19.513477 2831 factory.go:221] Registration of the systemd container factory successfully Dec 12 17:26:19.513619 kubelet[2831]: I1212 17:26:19.513598 2831 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:26:19.517743 kubelet[2831]: I1212 17:26:19.517717 2831 factory.go:221] Registration of the containerd container factory successfully Dec 12 17:26:19.518676 kubelet[2831]: I1212 17:26:19.518082 2831 reconciler.go:26] "Reconciler: start to sync state" Dec 12 17:26:19.518676 kubelet[2831]: I1212 17:26:19.518110 2831 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 17:26:19.522156 kubelet[2831]: E1212 17:26:19.522130 2831 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:26:19.524856 kubelet[2831]: I1212 17:26:19.524819 2831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 17:26:19.527011 kubelet[2831]: I1212 17:26:19.526970 2831 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:26:19.527011 kubelet[2831]: I1212 17:26:19.527003 2831 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 17:26:19.527111 kubelet[2831]: I1212 17:26:19.527020 2831 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 12 17:26:19.527111 kubelet[2831]: I1212 17:26:19.527028 2831 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 17:26:19.527111 kubelet[2831]: E1212 17:26:19.527068 2831 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.555925 2831 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.555943 2831 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.555964 2831 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.556150 2831 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.556160 2831 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.556178 2831 policy_none.go:49] "None policy: Start" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.556185 2831 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.556194 2831 state_mem.go:35] "Initializing new in-memory state store" Dec 12 17:26:19.556900 kubelet[2831]: I1212 17:26:19.556283 2831 state_mem.go:75] "Updated machine memory state" Dec 12 17:26:19.560165 kubelet[2831]: I1212 17:26:19.560139 2831 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 17:26:19.560328 kubelet[2831]: I1212 
17:26:19.560307 2831 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:26:19.560374 kubelet[2831]: I1212 17:26:19.560321 2831 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:26:19.560646 kubelet[2831]: I1212 17:26:19.560625 2831 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:26:19.562027 kubelet[2831]: E1212 17:26:19.561999 2831 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:26:19.628536 kubelet[2831]: I1212 17:26:19.628354 2831 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.628536 kubelet[2831]: I1212 17:26:19.628356 2831 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.628883 kubelet[2831]: I1212 17:26:19.628849 2831 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.634942 kubelet[2831]: E1212 17:26:19.634815 2831 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.663593 kubelet[2831]: I1212 17:26:19.663561 2831 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.672295 kubelet[2831]: I1212 17:26:19.671827 2831 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.672295 kubelet[2831]: I1212 17:26:19.671897 2831 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720225 kubelet[2831]: I1212 17:26:19.720183 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e011b1b97e519516f532a5ccc7150459-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" (UID: \"e011b1b97e519516f532a5ccc7150459\") " pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720225 kubelet[2831]: I1212 17:26:19.720222 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720225 kubelet[2831]: I1212 17:26:19.720243 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720434 kubelet[2831]: I1212 17:26:19.720261 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720434 kubelet[2831]: I1212 17:26:19.720281 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9499756627e45f14c10a9abe956015b7-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-2-0ba9591bbe\" (UID: \"9499756627e45f14c10a9abe956015b7\") " pod="kube-system/kube-scheduler-ci-4459-2-2-2-0ba9591bbe" 
Dec 12 17:26:19.720434 kubelet[2831]: I1212 17:26:19.720295 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e011b1b97e519516f532a5ccc7150459-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" (UID: \"e011b1b97e519516f532a5ccc7150459\") " pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720434 kubelet[2831]: I1212 17:26:19.720311 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e011b1b97e519516f532a5ccc7150459-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-2-0ba9591bbe\" (UID: \"e011b1b97e519516f532a5ccc7150459\") " pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720434 kubelet[2831]: I1212 17:26:19.720325 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:19.720542 kubelet[2831]: I1212 17:26:19.720340 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1c0158734360a056bfe1160eca0b990c-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" (UID: \"1c0158734360a056bfe1160eca0b990c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:20.494269 kubelet[2831]: I1212 17:26:20.494095 2831 apiserver.go:52] "Watching apiserver" Dec 12 17:26:20.519257 kubelet[2831]: I1212 17:26:20.519175 2831 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 
17:26:20.542809 kubelet[2831]: I1212 17:26:20.542256 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-2-0ba9591bbe" podStartSLOduration=2.542220329 podStartE2EDuration="2.542220329s" podCreationTimestamp="2025-12-12 17:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:20.542000369 +0000 UTC m=+1.117817625" watchObservedRunningTime="2025-12-12 17:26:20.542220329 +0000 UTC m=+1.118037585" Dec 12 17:26:20.548878 kubelet[2831]: I1212 17:26:20.548610 2831 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:20.559164 kubelet[2831]: I1212 17:26:20.558780 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" podStartSLOduration=1.5587628119999999 podStartE2EDuration="1.558762812s" podCreationTimestamp="2025-12-12 17:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:20.556560366 +0000 UTC m=+1.132377582" watchObservedRunningTime="2025-12-12 17:26:20.558762812 +0000 UTC m=+1.134580068" Dec 12 17:26:20.559164 kubelet[2831]: E1212 17:26:20.559009 2831 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-2-0ba9591bbe\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:20.566025 kubelet[2831]: I1212 17:26:20.565514 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-2-0ba9591bbe" podStartSLOduration=1.56549979 podStartE2EDuration="1.56549979s" podCreationTimestamp="2025-12-12 17:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:20.565052508 +0000 UTC m=+1.140869764" watchObservedRunningTime="2025-12-12 17:26:20.56549979 +0000 UTC m=+1.141317046" Dec 12 17:26:24.921635 kubelet[2831]: I1212 17:26:24.921531 2831 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 17:26:24.922047 kubelet[2831]: I1212 17:26:24.922016 2831 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 17:26:24.922076 containerd[1633]: time="2025-12-12T17:26:24.921840706Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 17:26:25.573286 systemd[1]: Created slice kubepods-besteffort-pode9831c4f_8f80_487e_8d79_9a04c685189d.slice - libcontainer container kubepods-besteffort-pode9831c4f_8f80_487e_8d79_9a04c685189d.slice. Dec 12 17:26:25.654988 kubelet[2831]: I1212 17:26:25.654930 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e9831c4f-8f80-487e-8d79-9a04c685189d-kube-proxy\") pod \"kube-proxy-q4j4p\" (UID: \"e9831c4f-8f80-487e-8d79-9a04c685189d\") " pod="kube-system/kube-proxy-q4j4p" Dec 12 17:26:25.654988 kubelet[2831]: I1212 17:26:25.654981 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzzs\" (UniqueName: \"kubernetes.io/projected/e9831c4f-8f80-487e-8d79-9a04c685189d-kube-api-access-xmzzs\") pod \"kube-proxy-q4j4p\" (UID: \"e9831c4f-8f80-487e-8d79-9a04c685189d\") " pod="kube-system/kube-proxy-q4j4p" Dec 12 17:26:25.655145 kubelet[2831]: I1212 17:26:25.655039 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9831c4f-8f80-487e-8d79-9a04c685189d-lib-modules\") pod \"kube-proxy-q4j4p\" (UID: 
\"e9831c4f-8f80-487e-8d79-9a04c685189d\") " pod="kube-system/kube-proxy-q4j4p" Dec 12 17:26:25.655145 kubelet[2831]: I1212 17:26:25.655093 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e9831c4f-8f80-487e-8d79-9a04c685189d-xtables-lock\") pod \"kube-proxy-q4j4p\" (UID: \"e9831c4f-8f80-487e-8d79-9a04c685189d\") " pod="kube-system/kube-proxy-q4j4p" Dec 12 17:26:25.768539 kubelet[2831]: E1212 17:26:25.767616 2831 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Dec 12 17:26:25.768768 kubelet[2831]: E1212 17:26:25.768636 2831 projected.go:194] Error preparing data for projected volume kube-api-access-xmzzs for pod kube-system/kube-proxy-q4j4p: configmap "kube-root-ca.crt" not found Dec 12 17:26:25.768904 kubelet[2831]: E1212 17:26:25.768872 2831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9831c4f-8f80-487e-8d79-9a04c685189d-kube-api-access-xmzzs podName:e9831c4f-8f80-487e-8d79-9a04c685189d nodeName:}" failed. No retries permitted until 2025-12-12 17:26:26.268836106 +0000 UTC m=+6.844653362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xmzzs" (UniqueName: "kubernetes.io/projected/e9831c4f-8f80-487e-8d79-9a04c685189d-kube-api-access-xmzzs") pod "kube-proxy-q4j4p" (UID: "e9831c4f-8f80-487e-8d79-9a04c685189d") : configmap "kube-root-ca.crt" not found Dec 12 17:26:25.995108 systemd[1]: Created slice kubepods-besteffort-podcc00ffae_48a8_4f33_94b6_55ffd138b99b.slice - libcontainer container kubepods-besteffort-podcc00ffae_48a8_4f33_94b6_55ffd138b99b.slice. 
Dec 12 17:26:26.057906 kubelet[2831]: I1212 17:26:26.057873 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/cc00ffae-48a8-4f33-94b6-55ffd138b99b-var-lib-calico\") pod \"tigera-operator-7dcd859c48-kphbl\" (UID: \"cc00ffae-48a8-4f33-94b6-55ffd138b99b\") " pod="tigera-operator/tigera-operator-7dcd859c48-kphbl" Dec 12 17:26:26.058224 kubelet[2831]: I1212 17:26:26.057922 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x98t\" (UniqueName: \"kubernetes.io/projected/cc00ffae-48a8-4f33-94b6-55ffd138b99b-kube-api-access-9x98t\") pod \"tigera-operator-7dcd859c48-kphbl\" (UID: \"cc00ffae-48a8-4f33-94b6-55ffd138b99b\") " pod="tigera-operator/tigera-operator-7dcd859c48-kphbl" Dec 12 17:26:26.298844 containerd[1633]: time="2025-12-12T17:26:26.298695362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-kphbl,Uid:cc00ffae-48a8-4f33-94b6-55ffd138b99b,Namespace:tigera-operator,Attempt:0,}" Dec 12 17:26:26.317841 containerd[1633]: time="2025-12-12T17:26:26.317803492Z" level=info msg="connecting to shim 9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8" address="unix:///run/containerd/s/0ca6ed5de74bdef08f6783aa3e0b17c90cb02b90dc4ad02afd4d4cd1ab9b7ee1" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:26.343804 systemd[1]: Started cri-containerd-9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8.scope - libcontainer container 9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8. 
Dec 12 17:26:26.376047 containerd[1633]: time="2025-12-12T17:26:26.375996923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-kphbl,Uid:cc00ffae-48a8-4f33-94b6-55ffd138b99b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8\"" Dec 12 17:26:26.379003 containerd[1633]: time="2025-12-12T17:26:26.378969891Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 17:26:26.487031 containerd[1633]: time="2025-12-12T17:26:26.486987051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q4j4p,Uid:e9831c4f-8f80-487e-8d79-9a04c685189d,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:26.505894 containerd[1633]: time="2025-12-12T17:26:26.505841100Z" level=info msg="connecting to shim 9a8630078b6cfc940be6b5e0266407c7f6ed3e2579614d2bc8a97d6472d706ff" address="unix:///run/containerd/s/af5be3c4914115d6eb03dc6e9ddadaa1990cc08d556ce9adf407c49f2c76b254" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:26.530555 systemd[1]: Started cri-containerd-9a8630078b6cfc940be6b5e0266407c7f6ed3e2579614d2bc8a97d6472d706ff.scope - libcontainer container 9a8630078b6cfc940be6b5e0266407c7f6ed3e2579614d2bc8a97d6472d706ff. 
Dec 12 17:26:26.553755 containerd[1633]: time="2025-12-12T17:26:26.553646584Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-q4j4p,Uid:e9831c4f-8f80-487e-8d79-9a04c685189d,Namespace:kube-system,Attempt:0,} returns sandbox id \"9a8630078b6cfc940be6b5e0266407c7f6ed3e2579614d2bc8a97d6472d706ff\"" Dec 12 17:26:26.556459 containerd[1633]: time="2025-12-12T17:26:26.556428912Z" level=info msg="CreateContainer within sandbox \"9a8630078b6cfc940be6b5e0266407c7f6ed3e2579614d2bc8a97d6472d706ff\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 17:26:26.566786 containerd[1633]: time="2025-12-12T17:26:26.566753058Z" level=info msg="Container 95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:26.574250 containerd[1633]: time="2025-12-12T17:26:26.574215118Z" level=info msg="CreateContainer within sandbox \"9a8630078b6cfc940be6b5e0266407c7f6ed3e2579614d2bc8a97d6472d706ff\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121\"" Dec 12 17:26:26.574969 containerd[1633]: time="2025-12-12T17:26:26.574945440Z" level=info msg="StartContainer for \"95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121\"" Dec 12 17:26:26.576896 containerd[1633]: time="2025-12-12T17:26:26.576777085Z" level=info msg="connecting to shim 95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121" address="unix:///run/containerd/s/af5be3c4914115d6eb03dc6e9ddadaa1990cc08d556ce9adf407c49f2c76b254" protocol=ttrpc version=3 Dec 12 17:26:26.598824 systemd[1]: Started cri-containerd-95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121.scope - libcontainer container 95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121. 
Dec 12 17:26:26.671186 containerd[1633]: time="2025-12-12T17:26:26.671105170Z" level=info msg="StartContainer for \"95f4be81158b1ba459174d33773086891e367bcf50612030a5b0912f136e3121\" returns successfully" Dec 12 17:26:28.067472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3530932981.mount: Deactivated successfully. Dec 12 17:26:28.368969 containerd[1633]: time="2025-12-12T17:26:28.368850180Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:28.370030 containerd[1633]: time="2025-12-12T17:26:28.369876742Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 12 17:26:28.371172 containerd[1633]: time="2025-12-12T17:26:28.371122226Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:28.374079 containerd[1633]: time="2025-12-12T17:26:28.374044393Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:28.375189 containerd[1633]: time="2025-12-12T17:26:28.375161796Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.994717861s" Dec 12 17:26:28.375373 containerd[1633]: time="2025-12-12T17:26:28.375281756Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 12 17:26:28.377466 containerd[1633]: 
time="2025-12-12T17:26:28.377435362Z" level=info msg="CreateContainer within sandbox \"9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 17:26:28.391194 containerd[1633]: time="2025-12-12T17:26:28.390652756Z" level=info msg="Container 5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:28.391807 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3419332020.mount: Deactivated successfully. Dec 12 17:26:28.398446 containerd[1633]: time="2025-12-12T17:26:28.398356576Z" level=info msg="CreateContainer within sandbox \"9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\"" Dec 12 17:26:28.399173 containerd[1633]: time="2025-12-12T17:26:28.399146098Z" level=info msg="StartContainer for \"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\"" Dec 12 17:26:28.400148 containerd[1633]: time="2025-12-12T17:26:28.400117861Z" level=info msg="connecting to shim 5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781" address="unix:///run/containerd/s/0ca6ed5de74bdef08f6783aa3e0b17c90cb02b90dc4ad02afd4d4cd1ab9b7ee1" protocol=ttrpc version=3 Dec 12 17:26:28.417596 systemd[1]: Started cri-containerd-5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781.scope - libcontainer container 5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781. 
Dec 12 17:26:28.444289 containerd[1633]: time="2025-12-12T17:26:28.444194135Z" level=info msg="StartContainer for \"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\" returns successfully" Dec 12 17:26:28.576974 kubelet[2831]: I1212 17:26:28.576688 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-q4j4p" podStartSLOduration=3.576669159 podStartE2EDuration="3.576669159s" podCreationTimestamp="2025-12-12 17:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:27.573028672 +0000 UTC m=+8.148845928" watchObservedRunningTime="2025-12-12 17:26:28.576669159 +0000 UTC m=+9.152486415" Dec 12 17:26:28.576974 kubelet[2831]: I1212 17:26:28.576802 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-kphbl" podStartSLOduration=1.579284052 podStartE2EDuration="3.5767984s" podCreationTimestamp="2025-12-12 17:26:25 +0000 UTC" firstStartedPulling="2025-12-12 17:26:26.37852909 +0000 UTC m=+6.954346346" lastFinishedPulling="2025-12-12 17:26:28.376043438 +0000 UTC m=+8.951860694" observedRunningTime="2025-12-12 17:26:28.576647879 +0000 UTC m=+9.152465135" watchObservedRunningTime="2025-12-12 17:26:28.5767984 +0000 UTC m=+9.152615656" Dec 12 17:26:33.594095 sudo[1889]: pam_unix(sudo:session): session closed for user root Dec 12 17:26:33.751919 sshd[1888]: Connection closed by 147.75.109.163 port 36148 Dec 12 17:26:33.752291 sshd-session[1885]: pam_unix(sshd:session): session closed for user core Dec 12 17:26:33.757871 systemd-logind[1612]: Session 7 logged out. Waiting for processes to exit. Dec 12 17:26:33.758127 systemd[1]: sshd@6-10.0.17.31:22-147.75.109.163:36148.service: Deactivated successfully. Dec 12 17:26:33.761090 systemd[1]: session-7.scope: Deactivated successfully. 
Dec 12 17:26:33.761352 systemd[1]: session-7.scope: Consumed 6.467s CPU time, 221.5M memory peak. Dec 12 17:26:33.763274 systemd-logind[1612]: Removed session 7. Dec 12 17:26:42.260636 kubelet[2831]: I1212 17:26:42.260578 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c71cd8-1729-4714-905c-a4f9a700dc00-tigera-ca-bundle\") pod \"calico-typha-59b8d58ddd-4hbrn\" (UID: \"01c71cd8-1729-4714-905c-a4f9a700dc00\") " pod="calico-system/calico-typha-59b8d58ddd-4hbrn" Dec 12 17:26:42.260636 kubelet[2831]: I1212 17:26:42.260628 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/01c71cd8-1729-4714-905c-a4f9a700dc00-typha-certs\") pod \"calico-typha-59b8d58ddd-4hbrn\" (UID: \"01c71cd8-1729-4714-905c-a4f9a700dc00\") " pod="calico-system/calico-typha-59b8d58ddd-4hbrn" Dec 12 17:26:42.261086 kubelet[2831]: I1212 17:26:42.260654 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bncw2\" (UniqueName: \"kubernetes.io/projected/01c71cd8-1729-4714-905c-a4f9a700dc00-kube-api-access-bncw2\") pod \"calico-typha-59b8d58ddd-4hbrn\" (UID: \"01c71cd8-1729-4714-905c-a4f9a700dc00\") " pod="calico-system/calico-typha-59b8d58ddd-4hbrn" Dec 12 17:26:42.266528 systemd[1]: Created slice kubepods-besteffort-pod01c71cd8_1729_4714_905c_a4f9a700dc00.slice - libcontainer container kubepods-besteffort-pod01c71cd8_1729_4714_905c_a4f9a700dc00.slice. Dec 12 17:26:42.448558 systemd[1]: Created slice kubepods-besteffort-pod368277b3_5da6_4c40_b7d0_98b15476b2e5.slice - libcontainer container kubepods-besteffort-pod368277b3_5da6_4c40_b7d0_98b15476b2e5.slice. 
Dec 12 17:26:42.461845 kubelet[2831]: I1212 17:26:42.461782 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-var-run-calico\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.461845 kubelet[2831]: I1212 17:26:42.461827 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-xtables-lock\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.461845 kubelet[2831]: I1212 17:26:42.461855 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-cni-bin-dir\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462011 kubelet[2831]: I1212 17:26:42.461871 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pr2n\" (UniqueName: \"kubernetes.io/projected/368277b3-5da6-4c40-b7d0-98b15476b2e5-kube-api-access-7pr2n\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462011 kubelet[2831]: I1212 17:26:42.461894 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-cni-net-dir\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462011 kubelet[2831]: I1212 17:26:42.461909 2831 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-flexvol-driver-host\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462011 kubelet[2831]: I1212 17:26:42.461925 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-lib-modules\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462011 kubelet[2831]: I1212 17:26:42.461941 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-policysync\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462208 kubelet[2831]: I1212 17:26:42.461955 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-var-lib-calico\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462208 kubelet[2831]: I1212 17:26:42.462003 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/368277b3-5da6-4c40-b7d0-98b15476b2e5-tigera-ca-bundle\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462208 kubelet[2831]: I1212 17:26:42.462036 2831 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/368277b3-5da6-4c40-b7d0-98b15476b2e5-cni-log-dir\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.462208 kubelet[2831]: I1212 17:26:42.462052 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/368277b3-5da6-4c40-b7d0-98b15476b2e5-node-certs\") pod \"calico-node-vndvx\" (UID: \"368277b3-5da6-4c40-b7d0-98b15476b2e5\") " pod="calico-system/calico-node-vndvx" Dec 12 17:26:42.564179 kubelet[2831]: E1212 17:26:42.564037 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.564179 kubelet[2831]: W1212 17:26:42.564061 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.564179 kubelet[2831]: E1212 17:26:42.564085 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.564323 kubelet[2831]: E1212 17:26:42.564278 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.564323 kubelet[2831]: W1212 17:26:42.564286 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.564323 kubelet[2831]: E1212 17:26:42.564297 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.565308 kubelet[2831]: E1212 17:26:42.564427 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.565308 kubelet[2831]: W1212 17:26:42.564438 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.565308 kubelet[2831]: E1212 17:26:42.564448 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.565308 kubelet[2831]: E1212 17:26:42.564826 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.565308 kubelet[2831]: W1212 17:26:42.564838 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.565308 kubelet[2831]: E1212 17:26:42.564850 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.565308 kubelet[2831]: E1212 17:26:42.565047 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.565308 kubelet[2831]: W1212 17:26:42.565056 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.565308 kubelet[2831]: E1212 17:26:42.565067 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.565896 kubelet[2831]: E1212 17:26:42.565691 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.565896 kubelet[2831]: W1212 17:26:42.565705 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.565896 kubelet[2831]: E1212 17:26:42.565719 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.567412 kubelet[2831]: E1212 17:26:42.567283 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.567412 kubelet[2831]: W1212 17:26:42.567309 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.567412 kubelet[2831]: E1212 17:26:42.567332 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.570177 containerd[1633]: time="2025-12-12T17:26:42.570140186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59b8d58ddd-4hbrn,Uid:01c71cd8-1729-4714-905c-a4f9a700dc00,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:42.575964 kubelet[2831]: E1212 17:26:42.575939 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.575964 kubelet[2831]: W1212 17:26:42.575961 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.576089 kubelet[2831]: E1212 17:26:42.575977 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.593199 containerd[1633]: time="2025-12-12T17:26:42.593105205Z" level=info msg="connecting to shim 2316d3ac1bd814f1504fb1ba627f30779d4ba45946ed83f1ed07914401c36d19" address="unix:///run/containerd/s/f07d52d62cd17db48c7ebf452a7e76146a7c5a0c706b2942172644bbf4efe0e9" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:42.622017 systemd[1]: Started cri-containerd-2316d3ac1bd814f1504fb1ba627f30779d4ba45946ed83f1ed07914401c36d19.scope - libcontainer container 2316d3ac1bd814f1504fb1ba627f30779d4ba45946ed83f1ed07914401c36d19. Dec 12 17:26:42.641807 kubelet[2831]: E1212 17:26:42.641644 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:26:42.654798 kubelet[2831]: E1212 17:26:42.654766 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.654798 kubelet[2831]: W1212 17:26:42.654790 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.654948 kubelet[2831]: E1212 17:26:42.654810 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.655138 kubelet[2831]: E1212 17:26:42.655119 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.655176 kubelet[2831]: W1212 17:26:42.655133 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.655176 kubelet[2831]: E1212 17:26:42.655175 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.656173 kubelet[2831]: E1212 17:26:42.656057 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.656238 kubelet[2831]: W1212 17:26:42.656173 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.656238 kubelet[2831]: E1212 17:26:42.656190 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.656558 kubelet[2831]: E1212 17:26:42.656540 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.656558 kubelet[2831]: W1212 17:26:42.656556 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.656661 kubelet[2831]: E1212 17:26:42.656568 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.657493 kubelet[2831]: E1212 17:26:42.657476 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.657493 kubelet[2831]: W1212 17:26:42.657492 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.657580 kubelet[2831]: E1212 17:26:42.657507 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:42.657929 kubelet[2831]: E1212 17:26:42.657905 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.657929 kubelet[2831]: W1212 17:26:42.657926 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.658071 kubelet[2831]: E1212 17:26:42.657937 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.658389 kubelet[2831]: E1212 17:26:42.658369 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.658389 kubelet[2831]: W1212 17:26:42.658385 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.658389 kubelet[2831]: E1212 17:26:42.658438 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 12 17:26:42.658751 kubelet[2831]: E1212 17:26:42.658736 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.658751 kubelet[2831]: W1212 17:26:42.658751 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.658816 kubelet[2831]: E1212 17:26:42.658763 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.659340 kubelet[2831]: E1212 17:26:42.659322 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.659371 kubelet[2831]: W1212 17:26:42.659342 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.659371 kubelet[2831]: E1212 17:26:42.659356 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.659593 kubelet[2831]: E1212 17:26:42.659507 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.659593 kubelet[2831]: W1212 17:26:42.659523 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.659593 kubelet[2831]: E1212 17:26:42.659534 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.660177 kubelet[2831]: E1212 17:26:42.660155 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.660177 kubelet[2831]: W1212 17:26:42.660175 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.660260 kubelet[2831]: E1212 17:26:42.660192 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.660466 kubelet[2831]: E1212 17:26:42.660389 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.660466 kubelet[2831]: W1212 17:26:42.660449 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.660466 kubelet[2831]: E1212 17:26:42.660462 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.660693 kubelet[2831]: E1212 17:26:42.660629 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.660693 kubelet[2831]: W1212 17:26:42.660639 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.660693 kubelet[2831]: E1212 17:26:42.660647 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.660788 kubelet[2831]: E1212 17:26:42.660764 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.660788 kubelet[2831]: W1212 17:26:42.660774 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.660827 kubelet[2831]: E1212 17:26:42.660791 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.661088 kubelet[2831]: E1212 17:26:42.661047 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.661088 kubelet[2831]: W1212 17:26:42.661071 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.661088 kubelet[2831]: E1212 17:26:42.661085 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.661527 kubelet[2831]: E1212 17:26:42.661286 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.661527 kubelet[2831]: W1212 17:26:42.661299 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.661527 kubelet[2831]: E1212 17:26:42.661309 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.661787 kubelet[2831]: E1212 17:26:42.661769 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.661822 kubelet[2831]: W1212 17:26:42.661788 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.661822 kubelet[2831]: E1212 17:26:42.661800 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.661984 kubelet[2831]: E1212 17:26:42.661971 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.662010 kubelet[2831]: W1212 17:26:42.661984 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.662010 kubelet[2831]: E1212 17:26:42.661996 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.662143 kubelet[2831]: E1212 17:26:42.662131 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.662174 kubelet[2831]: W1212 17:26:42.662143 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.662174 kubelet[2831]: E1212 17:26:42.662152 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.662284 kubelet[2831]: E1212 17:26:42.662274 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.662284 kubelet[2831]: W1212 17:26:42.662283 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.662333 kubelet[2831]: E1212 17:26:42.662291 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.663767 kubelet[2831]: E1212 17:26:42.663611 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.663767 kubelet[2831]: W1212 17:26:42.663630 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.663767 kubelet[2831]: E1212 17:26:42.663644 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.663767 kubelet[2831]: I1212 17:26:42.663674 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8dq\" (UniqueName: \"kubernetes.io/projected/eda1ca3a-3908-4257-9a19-d316969a4cc3-kube-api-access-6q8dq\") pod \"csi-node-driver-ppkws\" (UID: \"eda1ca3a-3908-4257-9a19-d316969a4cc3\") " pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:42.663984 kubelet[2831]: E1212 17:26:42.663968 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.664044 kubelet[2831]: W1212 17:26:42.664033 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.664105 kubelet[2831]: E1212 17:26:42.664093 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.664177 kubelet[2831]: I1212 17:26:42.664153 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eda1ca3a-3908-4257-9a19-d316969a4cc3-kubelet-dir\") pod \"csi-node-driver-ppkws\" (UID: \"eda1ca3a-3908-4257-9a19-d316969a4cc3\") " pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:42.664424 kubelet[2831]: E1212 17:26:42.664394 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.664424 kubelet[2831]: W1212 17:26:42.664424 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.664500 kubelet[2831]: E1212 17:26:42.664444 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.664594 kubelet[2831]: E1212 17:26:42.664580 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.664594 kubelet[2831]: W1212 17:26:42.664592 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.664656 kubelet[2831]: E1212 17:26:42.664606 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.664755 kubelet[2831]: E1212 17:26:42.664742 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.664789 kubelet[2831]: W1212 17:26:42.664755 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.664789 kubelet[2831]: E1212 17:26:42.664768 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.664829 kubelet[2831]: I1212 17:26:42.664791 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eda1ca3a-3908-4257-9a19-d316969a4cc3-socket-dir\") pod \"csi-node-driver-ppkws\" (UID: \"eda1ca3a-3908-4257-9a19-d316969a4cc3\") " pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:42.665508 kubelet[2831]: E1212 17:26:42.665474 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.665508 kubelet[2831]: W1212 17:26:42.665492 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.665508 kubelet[2831]: E1212 17:26:42.665510 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.665645 kubelet[2831]: I1212 17:26:42.665529 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/eda1ca3a-3908-4257-9a19-d316969a4cc3-varrun\") pod \"csi-node-driver-ppkws\" (UID: \"eda1ca3a-3908-4257-9a19-d316969a4cc3\") " pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:42.665763 kubelet[2831]: E1212 17:26:42.665746 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.665805 kubelet[2831]: W1212 17:26:42.665763 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.665890 kubelet[2831]: E1212 17:26:42.665857 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.665921 kubelet[2831]: I1212 17:26:42.665889 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eda1ca3a-3908-4257-9a19-d316969a4cc3-registration-dir\") pod \"csi-node-driver-ppkws\" (UID: \"eda1ca3a-3908-4257-9a19-d316969a4cc3\") " pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:42.666082 kubelet[2831]: E1212 17:26:42.666067 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.666082 kubelet[2831]: W1212 17:26:42.666078 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666107 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666215 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.666545 kubelet[2831]: W1212 17:26:42.666222 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666244 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666356 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.666545 kubelet[2831]: W1212 17:26:42.666362 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666376 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666499 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.666545 kubelet[2831]: W1212 17:26:42.666506 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.666545 kubelet[2831]: E1212 17:26:42.666521 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.666761 kubelet[2831]: E1212 17:26:42.666682 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.666761 kubelet[2831]: W1212 17:26:42.666689 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.666761 kubelet[2831]: E1212 17:26:42.666696 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.666863 kubelet[2831]: E1212 17:26:42.666843 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.666863 kubelet[2831]: W1212 17:26:42.666854 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.666863 kubelet[2831]: E1212 17:26:42.666862 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.667014 kubelet[2831]: E1212 17:26:42.667001 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.667014 kubelet[2831]: W1212 17:26:42.667012 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.667085 kubelet[2831]: E1212 17:26:42.667021 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.667214 kubelet[2831]: E1212 17:26:42.667201 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.667214 kubelet[2831]: W1212 17:26:42.667214 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.667297 kubelet[2831]: E1212 17:26:42.667223 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.678004 containerd[1633]: time="2025-12-12T17:26:42.677956306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59b8d58ddd-4hbrn,Uid:01c71cd8-1729-4714-905c-a4f9a700dc00,Namespace:calico-system,Attempt:0,} returns sandbox id \"2316d3ac1bd814f1504fb1ba627f30779d4ba45946ed83f1ed07914401c36d19\""
Dec 12 17:26:42.679634 containerd[1633]: time="2025-12-12T17:26:42.679595590Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 12 17:26:42.753732 containerd[1633]: time="2025-12-12T17:26:42.753692423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vndvx,Uid:368277b3-5da6-4c40-b7d0-98b15476b2e5,Namespace:calico-system,Attempt:0,}"
Dec 12 17:26:42.767284 kubelet[2831]: E1212 17:26:42.767250 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.767512 kubelet[2831]: W1212 17:26:42.767367 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.767512 kubelet[2831]: E1212 17:26:42.767389 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.767881 kubelet[2831]: E1212 17:26:42.767849 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.767881 kubelet[2831]: W1212 17:26:42.767866 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.768063 kubelet[2831]: E1212 17:26:42.767992 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.768307 kubelet[2831]: E1212 17:26:42.768293 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.768389 kubelet[2831]: W1212 17:26:42.768372 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.768569 kubelet[2831]: E1212 17:26:42.768547 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.768739 kubelet[2831]: E1212 17:26:42.768634 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.768739 kubelet[2831]: W1212 17:26:42.768733 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.768808 kubelet[2831]: E1212 17:26:42.768746 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.768956 kubelet[2831]: E1212 17:26:42.768941 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.768999 kubelet[2831]: W1212 17:26:42.768957 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.768999 kubelet[2831]: E1212 17:26:42.768973 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.769250 kubelet[2831]: E1212 17:26:42.769212 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.769250 kubelet[2831]: W1212 17:26:42.769224 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.769250 kubelet[2831]: E1212 17:26:42.769237 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.769444 kubelet[2831]: E1212 17:26:42.769419 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.769444 kubelet[2831]: W1212 17:26:42.769432 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.769444 kubelet[2831]: E1212 17:26:42.769468 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.769444 kubelet[2831]: E1212 17:26:42.769571 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.769444 kubelet[2831]: W1212 17:26:42.769578 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.769444 kubelet[2831]: E1212 17:26:42.769607 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.769444 kubelet[2831]: E1212 17:26:42.769713 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.769444 kubelet[2831]: W1212 17:26:42.769720 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.769444 kubelet[2831]: E1212 17:26:42.769730 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.769904 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.770537 kubelet[2831]: W1212 17:26:42.769911 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.769951 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.770035 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.770537 kubelet[2831]: W1212 17:26:42.770041 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.770074 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.770142 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.770537 kubelet[2831]: W1212 17:26:42.770149 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.770158 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.770537 kubelet[2831]: E1212 17:26:42.770295 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.770757 kubelet[2831]: W1212 17:26:42.770301 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.770757 kubelet[2831]: E1212 17:26:42.770309 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.770757 kubelet[2831]: E1212 17:26:42.770487 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.770757 kubelet[2831]: W1212 17:26:42.770494 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.770757 kubelet[2831]: E1212 17:26:42.770503 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.770757 kubelet[2831]: E1212 17:26:42.770648 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.770757 kubelet[2831]: W1212 17:26:42.770655 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.770757 kubelet[2831]: E1212 17:26:42.770664 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.771018 kubelet[2831]: E1212 17:26:42.770777 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.771018 kubelet[2831]: W1212 17:26:42.770784 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.771018 kubelet[2831]: E1212 17:26:42.770792 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.771018 kubelet[2831]: E1212 17:26:42.770995 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.771018 kubelet[2831]: W1212 17:26:42.771005 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.771378 kubelet[2831]: E1212 17:26:42.771024 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.771378 kubelet[2831]: E1212 17:26:42.771180 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.771378 kubelet[2831]: W1212 17:26:42.771188 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.771378 kubelet[2831]: E1212 17:26:42.771227 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.771378 kubelet[2831]: E1212 17:26:42.771309 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.771378 kubelet[2831]: W1212 17:26:42.771316 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.771378 kubelet[2831]: E1212 17:26:42.771340 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.771592 kubelet[2831]: E1212 17:26:42.771518 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.771592 kubelet[2831]: W1212 17:26:42.771528 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.771592 kubelet[2831]: E1212 17:26:42.771545 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.772074 kubelet[2831]: E1212 17:26:42.771802 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.772074 kubelet[2831]: W1212 17:26:42.771843 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.772074 kubelet[2831]: E1212 17:26:42.771889 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.772351 kubelet[2831]: E1212 17:26:42.772150 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.772351 kubelet[2831]: W1212 17:26:42.772163 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.772351 kubelet[2831]: E1212 17:26:42.772182 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.773725 kubelet[2831]: E1212 17:26:42.773703 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.773782 kubelet[2831]: W1212 17:26:42.773740 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.773782 kubelet[2831]: E1212 17:26:42.773762 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.774138 kubelet[2831]: E1212 17:26:42.774114 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.774138 kubelet[2831]: W1212 17:26:42.774136 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.774194 kubelet[2831]: E1212 17:26:42.774165 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Dec 12 17:26:42.774362 kubelet[2831]: E1212 17:26:42.774346 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:42.774402 kubelet[2831]: W1212 17:26:42.774363 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:42.774402 kubelet[2831]: E1212 17:26:42.774375 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 12 17:26:42.778444 containerd[1633]: time="2025-12-12T17:26:42.778297926Z" level=info msg="connecting to shim 8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a" address="unix:///run/containerd/s/0b75e650b7bcc9d548ca9b11b734b25add7a2e5474e11e62a18e691f2e421298" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:42.780576 kubelet[2831]: E1212 17:26:42.780528 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:42.780576 kubelet[2831]: W1212 17:26:42.780547 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:42.780765 kubelet[2831]: E1212 17:26:42.780665 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:42.801699 systemd[1]: Started cri-containerd-8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a.scope - libcontainer container 8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a. Dec 12 17:26:42.828802 containerd[1633]: time="2025-12-12T17:26:42.828673097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vndvx,Uid:368277b3-5da6-4c40-b7d0-98b15476b2e5,Namespace:calico-system,Attempt:0,} returns sandbox id \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\"" Dec 12 17:26:43.963741 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount956629520.mount: Deactivated successfully. 
Dec 12 17:26:44.362492 containerd[1633]: time="2025-12-12T17:26:44.362365881Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:44.363587 containerd[1633]: time="2025-12-12T17:26:44.363416964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 12 17:26:44.364424 containerd[1633]: time="2025-12-12T17:26:44.364362766Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:44.366493 containerd[1633]: time="2025-12-12T17:26:44.366450612Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:44.367277 containerd[1633]: time="2025-12-12T17:26:44.367248414Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 1.687621264s" Dec 12 17:26:44.367360 containerd[1633]: time="2025-12-12T17:26:44.367345894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 12 17:26:44.368511 containerd[1633]: time="2025-12-12T17:26:44.368478777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 12 17:26:44.377486 containerd[1633]: time="2025-12-12T17:26:44.376948359Z" level=info msg="CreateContainer within sandbox \"2316d3ac1bd814f1504fb1ba627f30779d4ba45946ed83f1ed07914401c36d19\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 12 17:26:44.384253 containerd[1633]: time="2025-12-12T17:26:44.384201618Z" level=info msg="Container 4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:44.395216 containerd[1633]: time="2025-12-12T17:26:44.395156926Z" level=info msg="CreateContainer within sandbox \"2316d3ac1bd814f1504fb1ba627f30779d4ba45946ed83f1ed07914401c36d19\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4\"" Dec 12 17:26:44.395803 containerd[1633]: time="2025-12-12T17:26:44.395778008Z" level=info msg="StartContainer for \"4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4\"" Dec 12 17:26:44.397919 containerd[1633]: time="2025-12-12T17:26:44.397884254Z" level=info msg="connecting to shim 4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4" address="unix:///run/containerd/s/f07d52d62cd17db48c7ebf452a7e76146a7c5a0c706b2942172644bbf4efe0e9" protocol=ttrpc version=3 Dec 12 17:26:44.420614 systemd[1]: Started cri-containerd-4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4.scope - libcontainer container 4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4. 
Dec 12 17:26:44.457529 containerd[1633]: time="2025-12-12T17:26:44.457329768Z" level=info msg="StartContainer for \"4c03135aa3d1df93c36a27abe487d1645fb309dd2c3d04fc0203e4751294cfe4\" returns successfully" Dec 12 17:26:44.528107 kubelet[2831]: E1212 17:26:44.527994 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:26:44.619841 kubelet[2831]: I1212 17:26:44.619628 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-59b8d58ddd-4hbrn" podStartSLOduration=0.930300602 podStartE2EDuration="2.619336589s" podCreationTimestamp="2025-12-12 17:26:42 +0000 UTC" firstStartedPulling="2025-12-12 17:26:42.679121829 +0000 UTC m=+23.254939085" lastFinishedPulling="2025-12-12 17:26:44.368157816 +0000 UTC m=+24.943975072" observedRunningTime="2025-12-12 17:26:44.618725627 +0000 UTC m=+25.194542883" watchObservedRunningTime="2025-12-12 17:26:44.619336589 +0000 UTC m=+25.195153805" Dec 12 17:26:44.675595 kubelet[2831]: E1212 17:26:44.675540 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.676236 kubelet[2831]: W1212 17:26:44.676024 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.676236 kubelet[2831]: E1212 17:26:44.676067 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.676487 kubelet[2831]: E1212 17:26:44.676469 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.676594 kubelet[2831]: W1212 17:26:44.676545 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.676692 kubelet[2831]: E1212 17:26:44.676676 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.677608 kubelet[2831]: E1212 17:26:44.677579 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.677776 kubelet[2831]: W1212 17:26:44.677701 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.677776 kubelet[2831]: E1212 17:26:44.677724 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.678230 kubelet[2831]: E1212 17:26:44.678164 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.678230 kubelet[2831]: W1212 17:26:44.678181 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.678230 kubelet[2831]: E1212 17:26:44.678193 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.678791 kubelet[2831]: E1212 17:26:44.678737 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.678791 kubelet[2831]: W1212 17:26:44.678764 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.679160 kubelet[2831]: E1212 17:26:44.679067 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.679686 kubelet[2831]: E1212 17:26:44.679394 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.680026 kubelet[2831]: W1212 17:26:44.679927 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.680136 kubelet[2831]: E1212 17:26:44.680112 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.680699 kubelet[2831]: E1212 17:26:44.680684 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.680917 kubelet[2831]: W1212 17:26:44.680802 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.680917 kubelet[2831]: E1212 17:26:44.680818 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.681298 kubelet[2831]: E1212 17:26:44.681284 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.681419 kubelet[2831]: W1212 17:26:44.681351 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.681506 kubelet[2831]: E1212 17:26:44.681492 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.681903 kubelet[2831]: E1212 17:26:44.681808 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.681903 kubelet[2831]: W1212 17:26:44.681834 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.681903 kubelet[2831]: E1212 17:26:44.681846 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.682164 kubelet[2831]: E1212 17:26:44.682150 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.682324 kubelet[2831]: W1212 17:26:44.682197 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.682324 kubelet[2831]: E1212 17:26:44.682209 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.682706 kubelet[2831]: E1212 17:26:44.682613 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.682706 kubelet[2831]: W1212 17:26:44.682629 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.682706 kubelet[2831]: E1212 17:26:44.682641 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.683046 kubelet[2831]: E1212 17:26:44.683030 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.683286 kubelet[2831]: W1212 17:26:44.683122 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.683286 kubelet[2831]: E1212 17:26:44.683164 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.683441 kubelet[2831]: E1212 17:26:44.683428 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.683523 kubelet[2831]: W1212 17:26:44.683512 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.683579 kubelet[2831]: E1212 17:26:44.683569 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.683825 kubelet[2831]: E1212 17:26:44.683800 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.684007 kubelet[2831]: W1212 17:26:44.683894 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.684007 kubelet[2831]: E1212 17:26:44.683911 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.684234 kubelet[2831]: E1212 17:26:44.684110 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.684234 kubelet[2831]: W1212 17:26:44.684118 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.684234 kubelet[2831]: E1212 17:26:44.684127 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.684682 kubelet[2831]: E1212 17:26:44.684615 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.684832 kubelet[2831]: W1212 17:26:44.684770 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.684968 kubelet[2831]: E1212 17:26:44.684955 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.685647 kubelet[2831]: E1212 17:26:44.685614 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.685745 kubelet[2831]: W1212 17:26:44.685629 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.685807 kubelet[2831]: E1212 17:26:44.685797 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.686096 kubelet[2831]: E1212 17:26:44.686076 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.686096 kubelet[2831]: W1212 17:26:44.686095 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.686177 kubelet[2831]: E1212 17:26:44.686114 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.686324 kubelet[2831]: E1212 17:26:44.686312 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.686360 kubelet[2831]: W1212 17:26:44.686324 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.686360 kubelet[2831]: E1212 17:26:44.686341 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.686573 kubelet[2831]: E1212 17:26:44.686559 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.686725 kubelet[2831]: W1212 17:26:44.686572 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.686725 kubelet[2831]: E1212 17:26:44.686594 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.686871 kubelet[2831]: E1212 17:26:44.686853 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.686932 kubelet[2831]: W1212 17:26:44.686920 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.687000 kubelet[2831]: E1212 17:26:44.686990 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.687292 kubelet[2831]: E1212 17:26:44.687279 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.687368 kubelet[2831]: W1212 17:26:44.687356 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.687521 kubelet[2831]: E1212 17:26:44.687447 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.687851 kubelet[2831]: E1212 17:26:44.687733 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.687851 kubelet[2831]: W1212 17:26:44.687746 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.687851 kubelet[2831]: E1212 17:26:44.687761 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.688012 kubelet[2831]: E1212 17:26:44.687998 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.688073 kubelet[2831]: W1212 17:26:44.688061 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.688301 kubelet[2831]: E1212 17:26:44.688267 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.688748 kubelet[2831]: E1212 17:26:44.688732 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.688811 kubelet[2831]: W1212 17:26:44.688799 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.688950 kubelet[2831]: E1212 17:26:44.688918 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.689102 kubelet[2831]: E1212 17:26:44.689089 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.689235 kubelet[2831]: W1212 17:26:44.689158 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.689235 kubelet[2831]: E1212 17:26:44.689202 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.689475 kubelet[2831]: E1212 17:26:44.689463 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.689824 kubelet[2831]: W1212 17:26:44.689528 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.689824 kubelet[2831]: E1212 17:26:44.689547 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.690572 kubelet[2831]: E1212 17:26:44.690547 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.690572 kubelet[2831]: W1212 17:26:44.690569 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.690718 kubelet[2831]: E1212 17:26:44.690593 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.690767 kubelet[2831]: E1212 17:26:44.690755 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.690767 kubelet[2831]: W1212 17:26:44.690765 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.690894 kubelet[2831]: E1212 17:26:44.690803 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.690918 kubelet[2831]: E1212 17:26:44.690904 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.690918 kubelet[2831]: W1212 17:26:44.690912 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.691034 kubelet[2831]: E1212 17:26:44.690963 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.691057 kubelet[2831]: E1212 17:26:44.691036 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.691057 kubelet[2831]: W1212 17:26:44.691043 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.691057 kubelet[2831]: E1212 17:26:44.691053 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:44.691336 kubelet[2831]: E1212 17:26:44.691239 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.691336 kubelet[2831]: W1212 17:26:44.691250 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.691336 kubelet[2831]: E1212 17:26:44.691259 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:44.692588 kubelet[2831]: E1212 17:26:44.692564 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:44.692648 kubelet[2831]: W1212 17:26:44.692601 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:44.692648 kubelet[2831]: E1212 17:26:44.692618 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:45.609111 kubelet[2831]: I1212 17:26:45.609042 2831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:26:45.687119 containerd[1633]: time="2025-12-12T17:26:45.686656521Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:45.687478 containerd[1633]: time="2025-12-12T17:26:45.687416323Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 12 17:26:45.688737 containerd[1633]: time="2025-12-12T17:26:45.688705247Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:45.691185 containerd[1633]: time="2025-12-12T17:26:45.691156533Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:45.691862 containerd[1633]: time="2025-12-12T17:26:45.691824295Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.323294558s" Dec 12 17:26:45.691862 containerd[1633]: time="2025-12-12T17:26:45.691862815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 12 17:26:45.692139 kubelet[2831]: E1212 17:26:45.692105 2831 driver-call.go:262] Failed to 
unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:45.692139 kubelet[2831]: W1212 17:26:45.692130 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:45.692220 kubelet[2831]: E1212 17:26:45.692155 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:45.697609 containerd[1633]: time="2025-12-12T17:26:45.696180266Z" level=info msg="CreateContainer within sandbox \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 17:26:45.703123 kubelet[2831]: E1212 17:26:45.703106 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:45.703201 kubelet[2831]: W1212 17:26:45.703164 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:45.703257 kubelet[2831]: E1212 17:26:45.703247 2831 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:45.709311 containerd[1633]: time="2025-12-12T17:26:45.709270900Z" level=info msg="Container 48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:45.723626 containerd[1633]: time="2025-12-12T17:26:45.723573657Z" level=info msg="CreateContainer within sandbox \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1\"" Dec 12 17:26:45.724281 containerd[1633]: time="2025-12-12T17:26:45.724255299Z" level=info msg="StartContainer for \"48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1\"" Dec 12 17:26:45.725779 containerd[1633]: time="2025-12-12T17:26:45.725752263Z" level=info msg="connecting to shim 48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1" address="unix:///run/containerd/s/0b75e650b7bcc9d548ca9b11b734b25add7a2e5474e11e62a18e691f2e421298" protocol=ttrpc version=3 Dec 12 17:26:45.754584 systemd[1]: Started cri-containerd-48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1.scope - libcontainer container 48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1. Dec 12 17:26:45.835211 containerd[1633]: time="2025-12-12T17:26:45.835171667Z" level=info msg="StartContainer for \"48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1\" returns successfully" Dec 12 17:26:45.844708 systemd[1]: cri-containerd-48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1.scope: Deactivated successfully. 
Dec 12 17:26:45.847555 containerd[1633]: time="2025-12-12T17:26:45.847501459Z" level=info msg="received container exit event container_id:\"48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1\" id:\"48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1\" pid:3554 exited_at:{seconds:1765560405 nanos:846998258}" Dec 12 17:26:45.869869 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-48c0e7408c7d4325812461d563ad67ec507b32668e2d3649255c84f73f462ca1-rootfs.mount: Deactivated successfully. Dec 12 17:26:46.528225 kubelet[2831]: E1212 17:26:46.528122 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:26:46.613964 containerd[1633]: time="2025-12-12T17:26:46.613919490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 17:26:48.527428 kubelet[2831]: E1212 17:26:48.527260 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:26:49.054467 containerd[1633]: time="2025-12-12T17:26:49.054422550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:49.055357 containerd[1633]: time="2025-12-12T17:26:49.055217832Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 12 17:26:49.056694 containerd[1633]: time="2025-12-12T17:26:49.056661515Z" level=info msg="ImageCreate event 
name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:49.058994 containerd[1633]: time="2025-12-12T17:26:49.058945681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:49.060076 containerd[1633]: time="2025-12-12T17:26:49.059942124Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.445982434s" Dec 12 17:26:49.060076 containerd[1633]: time="2025-12-12T17:26:49.059971164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 12 17:26:49.062073 containerd[1633]: time="2025-12-12T17:26:49.062037049Z" level=info msg="CreateContainer within sandbox \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 17:26:49.070917 containerd[1633]: time="2025-12-12T17:26:49.070866912Z" level=info msg="Container bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:49.081012 containerd[1633]: time="2025-12-12T17:26:49.080945258Z" level=info msg="CreateContainer within sandbox \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c\"" Dec 12 17:26:49.081663 containerd[1633]: time="2025-12-12T17:26:49.081591220Z" 
level=info msg="StartContainer for \"bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c\"" Dec 12 17:26:49.083968 containerd[1633]: time="2025-12-12T17:26:49.083934666Z" level=info msg="connecting to shim bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c" address="unix:///run/containerd/s/0b75e650b7bcc9d548ca9b11b734b25add7a2e5474e11e62a18e691f2e421298" protocol=ttrpc version=3 Dec 12 17:26:49.110618 systemd[1]: Started cri-containerd-bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c.scope - libcontainer container bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c. Dec 12 17:26:49.192569 containerd[1633]: time="2025-12-12T17:26:49.192529988Z" level=info msg="StartContainer for \"bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c\" returns successfully" Dec 12 17:26:49.566698 containerd[1633]: time="2025-12-12T17:26:49.566657240Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:26:49.571831 systemd[1]: cri-containerd-bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c.scope: Deactivated successfully. Dec 12 17:26:49.572494 systemd[1]: cri-containerd-bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c.scope: Consumed 469ms CPU time, 188.2M memory peak, 165.9M written to disk. 
Dec 12 17:26:49.573010 containerd[1633]: time="2025-12-12T17:26:49.572715896Z" level=info msg="received container exit event container_id:\"bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c\" id:\"bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c\" pid:3615 exited_at:{seconds:1765560409 nanos:572296575}" Dec 12 17:26:49.593144 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf0a6b8b8829fcb51bb58e7b597db48b986366b9db21e6b44ba5760a8782af2c-rootfs.mount: Deactivated successfully. Dec 12 17:26:49.664357 kubelet[2831]: I1212 17:26:49.664321 2831 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 12 17:26:49.706500 systemd[1]: Created slice kubepods-burstable-podf3a08696_1c81_4fd3_9d6b_5185ac2ed59c.slice - libcontainer container kubepods-burstable-podf3a08696_1c81_4fd3_9d6b_5185ac2ed59c.slice. Dec 12 17:26:49.717586 systemd[1]: Created slice kubepods-besteffort-podbf0d9458_9e9d_4941_95ae_5b084eb50f31.slice - libcontainer container kubepods-besteffort-podbf0d9458_9e9d_4941_95ae_5b084eb50f31.slice. 
Dec 12 17:26:49.720746 kubelet[2831]: I1212 17:26:49.720711 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkfhj\" (UniqueName: \"kubernetes.io/projected/f3a08696-1c81-4fd3-9d6b-5185ac2ed59c-kube-api-access-bkfhj\") pod \"coredns-668d6bf9bc-ctm45\" (UID: \"f3a08696-1c81-4fd3-9d6b-5185ac2ed59c\") " pod="kube-system/coredns-668d6bf9bc-ctm45"
Dec 12 17:26:49.720846 kubelet[2831]: I1212 17:26:49.720759 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-ca-bundle\") pod \"whisker-979d77f8b-zfs5n\" (UID: \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\") " pod="calico-system/whisker-979d77f8b-zfs5n"
Dec 12 17:26:49.720846 kubelet[2831]: I1212 17:26:49.720796 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th8bb\" (UniqueName: \"kubernetes.io/projected/ea5b6410-884a-4c0c-b81e-dd7563ad6122-kube-api-access-th8bb\") pod \"coredns-668d6bf9bc-spt5h\" (UID: \"ea5b6410-884a-4c0c-b81e-dd7563ad6122\") " pod="kube-system/coredns-668d6bf9bc-spt5h"
Dec 12 17:26:49.720846 kubelet[2831]: I1212 17:26:49.720816 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhkdj\" (UniqueName: \"kubernetes.io/projected/ef5fc253-58d4-49dc-8ba3-6150ce1f97bc-kube-api-access-dhkdj\") pod \"goldmane-666569f655-v498f\" (UID: \"ef5fc253-58d4-49dc-8ba3-6150ce1f97bc\") " pod="calico-system/goldmane-666569f655-v498f"
Dec 12 17:26:49.720846 kubelet[2831]: I1212 17:26:49.720834 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea5b6410-884a-4c0c-b81e-dd7563ad6122-config-volume\") pod \"coredns-668d6bf9bc-spt5h\" (UID: \"ea5b6410-884a-4c0c-b81e-dd7563ad6122\") " pod="kube-system/coredns-668d6bf9bc-spt5h"
Dec 12 17:26:49.720945 kubelet[2831]: I1212 17:26:49.720850 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef5fc253-58d4-49dc-8ba3-6150ce1f97bc-goldmane-ca-bundle\") pod \"goldmane-666569f655-v498f\" (UID: \"ef5fc253-58d4-49dc-8ba3-6150ce1f97bc\") " pod="calico-system/goldmane-666569f655-v498f"
Dec 12 17:26:49.720945 kubelet[2831]: I1212 17:26:49.720865 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ef5fc253-58d4-49dc-8ba3-6150ce1f97bc-goldmane-key-pair\") pod \"goldmane-666569f655-v498f\" (UID: \"ef5fc253-58d4-49dc-8ba3-6150ce1f97bc\") " pod="calico-system/goldmane-666569f655-v498f"
Dec 12 17:26:49.720945 kubelet[2831]: I1212 17:26:49.720880 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3a08696-1c81-4fd3-9d6b-5185ac2ed59c-config-volume\") pod \"coredns-668d6bf9bc-ctm45\" (UID: \"f3a08696-1c81-4fd3-9d6b-5185ac2ed59c\") " pod="kube-system/coredns-668d6bf9bc-ctm45"
Dec 12 17:26:49.720945 kubelet[2831]: I1212 17:26:49.720895 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/31cf16b5-737c-43fc-8a73-dab266575bf7-calico-apiserver-certs\") pod \"calico-apiserver-84ff4454f8-4zd4q\" (UID: \"31cf16b5-737c-43fc-8a73-dab266575bf7\") " pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q"
Dec 12 17:26:49.721035 kubelet[2831]: I1212 17:26:49.720949 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf0d9458-9e9d-4941-95ae-5b084eb50f31-tigera-ca-bundle\") pod \"calico-kube-controllers-697549bf97-ftlsb\" (UID: \"bf0d9458-9e9d-4941-95ae-5b084eb50f31\") " pod="calico-system/calico-kube-controllers-697549bf97-ftlsb"
Dec 12 17:26:49.721035 kubelet[2831]: I1212 17:26:49.720968 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/88235c34-52d7-4e16-9ac0-d4ea6115b35c-calico-apiserver-certs\") pod \"calico-apiserver-84ff4454f8-hggpv\" (UID: \"88235c34-52d7-4e16-9ac0-d4ea6115b35c\") " pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv"
Dec 12 17:26:49.721035 kubelet[2831]: I1212 17:26:49.720982 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-backend-key-pair\") pod \"whisker-979d77f8b-zfs5n\" (UID: \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\") " pod="calico-system/whisker-979d77f8b-zfs5n"
Dec 12 17:26:49.721035 kubelet[2831]: I1212 17:26:49.720997 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s649l\" (UniqueName: \"kubernetes.io/projected/31cf16b5-737c-43fc-8a73-dab266575bf7-kube-api-access-s649l\") pod \"calico-apiserver-84ff4454f8-4zd4q\" (UID: \"31cf16b5-737c-43fc-8a73-dab266575bf7\") " pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q"
Dec 12 17:26:49.721035 kubelet[2831]: I1212 17:26:49.721014 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5fc253-58d4-49dc-8ba3-6150ce1f97bc-config\") pod \"goldmane-666569f655-v498f\" (UID: \"ef5fc253-58d4-49dc-8ba3-6150ce1f97bc\") " pod="calico-system/goldmane-666569f655-v498f"
Dec 12 17:26:49.721142 kubelet[2831]: I1212 17:26:49.721034 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfksl\" (UniqueName: \"kubernetes.io/projected/bf0d9458-9e9d-4941-95ae-5b084eb50f31-kube-api-access-tfksl\") pod \"calico-kube-controllers-697549bf97-ftlsb\" (UID: \"bf0d9458-9e9d-4941-95ae-5b084eb50f31\") " pod="calico-system/calico-kube-controllers-697549bf97-ftlsb"
Dec 12 17:26:49.721142 kubelet[2831]: I1212 17:26:49.721060 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrgd5\" (UniqueName: \"kubernetes.io/projected/88235c34-52d7-4e16-9ac0-d4ea6115b35c-kube-api-access-xrgd5\") pod \"calico-apiserver-84ff4454f8-hggpv\" (UID: \"88235c34-52d7-4e16-9ac0-d4ea6115b35c\") " pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv"
Dec 12 17:26:49.721142 kubelet[2831]: I1212 17:26:49.721079 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghzzl\" (UniqueName: \"kubernetes.io/projected/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-kube-api-access-ghzzl\") pod \"whisker-979d77f8b-zfs5n\" (UID: \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\") " pod="calico-system/whisker-979d77f8b-zfs5n"
Dec 12 17:26:49.724729 systemd[1]: Created slice kubepods-burstable-podea5b6410_884a_4c0c_b81e_dd7563ad6122.slice - libcontainer container kubepods-burstable-podea5b6410_884a_4c0c_b81e_dd7563ad6122.slice.
Dec 12 17:26:49.730234 systemd[1]: Created slice kubepods-besteffort-pod88235c34_52d7_4e16_9ac0_d4ea6115b35c.slice - libcontainer container kubepods-besteffort-pod88235c34_52d7_4e16_9ac0_d4ea6115b35c.slice.
Dec 12 17:26:49.736946 systemd[1]: Created slice kubepods-besteffort-pod31cf16b5_737c_43fc_8a73_dab266575bf7.slice - libcontainer container kubepods-besteffort-pod31cf16b5_737c_43fc_8a73_dab266575bf7.slice.
Dec 12 17:26:49.747376 systemd[1]: Created slice kubepods-besteffort-podef5fc253_58d4_49dc_8ba3_6150ce1f97bc.slice - libcontainer container kubepods-besteffort-podef5fc253_58d4_49dc_8ba3_6150ce1f97bc.slice.
Dec 12 17:26:49.753592 systemd[1]: Created slice kubepods-besteffort-pod430b5a2c_3706_4f2a_ac26_aaf032aa29a7.slice - libcontainer container kubepods-besteffort-pod430b5a2c_3706_4f2a_ac26_aaf032aa29a7.slice.
Dec 12 17:26:50.015349 containerd[1633]: time="2025-12-12T17:26:50.015289926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ctm45,Uid:f3a08696-1c81-4fd3-9d6b-5185ac2ed59c,Namespace:kube-system,Attempt:0,}"
Dec 12 17:26:50.022455 containerd[1633]: time="2025-12-12T17:26:50.022393104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697549bf97-ftlsb,Uid:bf0d9458-9e9d-4941-95ae-5b084eb50f31,Namespace:calico-system,Attempt:0,}"
Dec 12 17:26:50.029677 containerd[1633]: time="2025-12-12T17:26:50.029621363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-spt5h,Uid:ea5b6410-884a-4c0c-b81e-dd7563ad6122,Namespace:kube-system,Attempt:0,}"
Dec 12 17:26:50.036432 containerd[1633]: time="2025-12-12T17:26:50.036347980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-hggpv,Uid:88235c34-52d7-4e16-9ac0-d4ea6115b35c,Namespace:calico-apiserver,Attempt:0,}"
Dec 12 17:26:50.041848 containerd[1633]: time="2025-12-12T17:26:50.041801834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-4zd4q,Uid:31cf16b5-737c-43fc-8a73-dab266575bf7,Namespace:calico-apiserver,Attempt:0,}"
Dec 12 17:26:50.053222 containerd[1633]: time="2025-12-12T17:26:50.053005663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v498f,Uid:ef5fc253-58d4-49dc-8ba3-6150ce1f97bc,Namespace:calico-system,Attempt:0,}"
Dec 12 17:26:50.057844 containerd[1633]: time="2025-12-12T17:26:50.057737596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-979d77f8b-zfs5n,Uid:430b5a2c-3706-4f2a-ac26-aaf032aa29a7,Namespace:calico-system,Attempt:0,}"
Dec 12 17:26:50.130189 containerd[1633]: time="2025-12-12T17:26:50.129814983Z" level=error msg="Failed to destroy network for sandbox \"f84115ec199fcbeca20d370f151805d35d5822d2251dd1b231e18a90fdfb2734\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.132214 containerd[1633]: time="2025-12-12T17:26:50.132129149Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-spt5h,Uid:ea5b6410-884a-4c0c-b81e-dd7563ad6122,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f84115ec199fcbeca20d370f151805d35d5822d2251dd1b231e18a90fdfb2734\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.132898 kubelet[2831]: E1212 17:26:50.132850 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f84115ec199fcbeca20d370f151805d35d5822d2251dd1b231e18a90fdfb2734\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.132963 kubelet[2831]: E1212 17:26:50.132932 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f84115ec199fcbeca20d370f151805d35d5822d2251dd1b231e18a90fdfb2734\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-spt5h"
Dec 12 17:26:50.132963 kubelet[2831]: E1212 17:26:50.132952 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f84115ec199fcbeca20d370f151805d35d5822d2251dd1b231e18a90fdfb2734\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-spt5h"
Dec 12 17:26:50.133247 kubelet[2831]: E1212 17:26:50.132995 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-spt5h_kube-system(ea5b6410-884a-4c0c-b81e-dd7563ad6122)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-spt5h_kube-system(ea5b6410-884a-4c0c-b81e-dd7563ad6122)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f84115ec199fcbeca20d370f151805d35d5822d2251dd1b231e18a90fdfb2734\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-spt5h" podUID="ea5b6410-884a-4c0c-b81e-dd7563ad6122"
Dec 12 17:26:50.132997 systemd[1]: run-netns-cni\x2db3dee1c9\x2d2825\x2d36d0\x2d6348\x2d1e112f538e2e.mount: Deactivated successfully.
Dec 12 17:26:50.137599 containerd[1633]: time="2025-12-12T17:26:50.137548243Z" level=error msg="Failed to destroy network for sandbox \"89a51e40dc6d700c5e89d482e0792c2ee31c1673a784bf87fd7bdb131bc0482c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.139792 systemd[1]: run-netns-cni\x2d33b98308\x2d0d34\x2d4242\x2d874d\x2dd62511cb76bc.mount: Deactivated successfully.
Dec 12 17:26:50.145180 containerd[1633]: time="2025-12-12T17:26:50.145127823Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ctm45,Uid:f3a08696-1c81-4fd3-9d6b-5185ac2ed59c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"89a51e40dc6d700c5e89d482e0792c2ee31c1673a784bf87fd7bdb131bc0482c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.145678 kubelet[2831]: E1212 17:26:50.145624 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89a51e40dc6d700c5e89d482e0792c2ee31c1673a784bf87fd7bdb131bc0482c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.145737 kubelet[2831]: E1212 17:26:50.145706 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89a51e40dc6d700c5e89d482e0792c2ee31c1673a784bf87fd7bdb131bc0482c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ctm45"
Dec 12 17:26:50.145737 kubelet[2831]: E1212 17:26:50.145727 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89a51e40dc6d700c5e89d482e0792c2ee31c1673a784bf87fd7bdb131bc0482c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ctm45"
Dec 12 17:26:50.146069 kubelet[2831]: E1212 17:26:50.145780 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-ctm45_kube-system(f3a08696-1c81-4fd3-9d6b-5185ac2ed59c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-ctm45_kube-system(f3a08696-1c81-4fd3-9d6b-5185ac2ed59c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"89a51e40dc6d700c5e89d482e0792c2ee31c1673a784bf87fd7bdb131bc0482c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ctm45" podUID="f3a08696-1c81-4fd3-9d6b-5185ac2ed59c"
Dec 12 17:26:50.150652 containerd[1633]: time="2025-12-12T17:26:50.150531757Z" level=error msg="Failed to destroy network for sandbox \"871a1640a286d89dfb09321483c26453cb99271643b2ab3e9ad5e203a2fba69b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.151995 containerd[1633]: time="2025-12-12T17:26:50.151943320Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-4zd4q,Uid:31cf16b5-737c-43fc-8a73-dab266575bf7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"871a1640a286d89dfb09321483c26453cb99271643b2ab3e9ad5e203a2fba69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.152725 kubelet[2831]: E1212 17:26:50.152642 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"871a1640a286d89dfb09321483c26453cb99271643b2ab3e9ad5e203a2fba69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.152827 kubelet[2831]: E1212 17:26:50.152734 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"871a1640a286d89dfb09321483c26453cb99271643b2ab3e9ad5e203a2fba69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q"
Dec 12 17:26:50.152869 kubelet[2831]: E1212 17:26:50.152823 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"871a1640a286d89dfb09321483c26453cb99271643b2ab3e9ad5e203a2fba69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q"
Dec 12 17:26:50.152913 kubelet[2831]: E1212 17:26:50.152883 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"871a1640a286d89dfb09321483c26453cb99271643b2ab3e9ad5e203a2fba69b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7"
Dec 12 17:26:50.154561 systemd[1]: run-netns-cni\x2d03861b3c\x2d9fb9\x2deb78\x2de82c\x2d122f7d9b230a.mount: Deactivated successfully.
Dec 12 17:26:50.156563 containerd[1633]: time="2025-12-12T17:26:50.156404532Z" level=error msg="Failed to destroy network for sandbox \"8b79aa7e6065d49f7d8a39612ee4e66ae871f8eea757f5c5b674c6c8260bec05\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.157696 containerd[1633]: time="2025-12-12T17:26:50.157653175Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697549bf97-ftlsb,Uid:bf0d9458-9e9d-4941-95ae-5b084eb50f31,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b79aa7e6065d49f7d8a39612ee4e66ae871f8eea757f5c5b674c6c8260bec05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.158849 systemd[1]: run-netns-cni\x2d36f5d6f2\x2d6a85\x2d15c6\x2d1800\x2d32957031ab76.mount: Deactivated successfully.
Dec 12 17:26:50.159393 kubelet[2831]: E1212 17:26:50.159344 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b79aa7e6065d49f7d8a39612ee4e66ae871f8eea757f5c5b674c6c8260bec05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.159480 kubelet[2831]: E1212 17:26:50.159415 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b79aa7e6065d49f7d8a39612ee4e66ae871f8eea757f5c5b674c6c8260bec05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb"
Dec 12 17:26:50.159480 kubelet[2831]: E1212 17:26:50.159437 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b79aa7e6065d49f7d8a39612ee4e66ae871f8eea757f5c5b674c6c8260bec05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb"
Dec 12 17:26:50.159701 kubelet[2831]: E1212 17:26:50.159479 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8b79aa7e6065d49f7d8a39612ee4e66ae871f8eea757f5c5b674c6c8260bec05\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:26:50.176131 containerd[1633]: time="2025-12-12T17:26:50.176085703Z" level=error msg="Failed to destroy network for sandbox \"9296ec222c683c8f46c209244e602aace7a10bf67c0fa095fabb79cce3cce535\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.177775 containerd[1633]: time="2025-12-12T17:26:50.177737427Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v498f,Uid:ef5fc253-58d4-49dc-8ba3-6150ce1f97bc,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9296ec222c683c8f46c209244e602aace7a10bf67c0fa095fabb79cce3cce535\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.178001 kubelet[2831]: E1212 17:26:50.177967 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9296ec222c683c8f46c209244e602aace7a10bf67c0fa095fabb79cce3cce535\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.178056 kubelet[2831]: E1212 17:26:50.178022 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9296ec222c683c8f46c209244e602aace7a10bf67c0fa095fabb79cce3cce535\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-v498f"
Dec 12 17:26:50.178056 kubelet[2831]: E1212 17:26:50.178040 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9296ec222c683c8f46c209244e602aace7a10bf67c0fa095fabb79cce3cce535\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-v498f"
Dec 12 17:26:50.178110 kubelet[2831]: E1212 17:26:50.178081 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9296ec222c683c8f46c209244e602aace7a10bf67c0fa095fabb79cce3cce535\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc"
Dec 12 17:26:50.179002 containerd[1633]: time="2025-12-12T17:26:50.178516470Z" level=error msg="Failed to destroy network for sandbox \"8a07de75eb1eaa831384d62f7c9d758bd686291b1ad21f0b06e0e6b5d252f1a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.180273 containerd[1633]: time="2025-12-12T17:26:50.180081114Z" level=error msg="Failed to destroy network for sandbox \"bb4da7576cb5e1c8fa1888dec99e050fb86779be8dd02eae22dac0c9ca0470a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.181259 containerd[1633]: time="2025-12-12T17:26:50.181059076Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-979d77f8b-zfs5n,Uid:430b5a2c-3706-4f2a-ac26-aaf032aa29a7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a07de75eb1eaa831384d62f7c9d758bd686291b1ad21f0b06e0e6b5d252f1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.181687 kubelet[2831]: E1212 17:26:50.181655 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a07de75eb1eaa831384d62f7c9d758bd686291b1ad21f0b06e0e6b5d252f1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.181906 kubelet[2831]: E1212 17:26:50.181787 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a07de75eb1eaa831384d62f7c9d758bd686291b1ad21f0b06e0e6b5d252f1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-979d77f8b-zfs5n"
Dec 12 17:26:50.181906 kubelet[2831]: E1212 17:26:50.181810 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a07de75eb1eaa831384d62f7c9d758bd686291b1ad21f0b06e0e6b5d252f1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-979d77f8b-zfs5n"
Dec 12 17:26:50.181906 kubelet[2831]: E1212 17:26:50.181858 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-979d77f8b-zfs5n_calico-system(430b5a2c-3706-4f2a-ac26-aaf032aa29a7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-979d77f8b-zfs5n_calico-system(430b5a2c-3706-4f2a-ac26-aaf032aa29a7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a07de75eb1eaa831384d62f7c9d758bd686291b1ad21f0b06e0e6b5d252f1a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-979d77f8b-zfs5n" podUID="430b5a2c-3706-4f2a-ac26-aaf032aa29a7"
Dec 12 17:26:50.183934 containerd[1633]: time="2025-12-12T17:26:50.183830723Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-hggpv,Uid:88235c34-52d7-4e16-9ac0-d4ea6115b35c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4da7576cb5e1c8fa1888dec99e050fb86779be8dd02eae22dac0c9ca0470a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.184064 kubelet[2831]: E1212 17:26:50.184022 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4da7576cb5e1c8fa1888dec99e050fb86779be8dd02eae22dac0c9ca0470a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.184100 kubelet[2831]: E1212 17:26:50.184070 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4da7576cb5e1c8fa1888dec99e050fb86779be8dd02eae22dac0c9ca0470a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv"
Dec 12 17:26:50.184100 kubelet[2831]: E1212 17:26:50.184088 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb4da7576cb5e1c8fa1888dec99e050fb86779be8dd02eae22dac0c9ca0470a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv"
Dec 12 17:26:50.184166 kubelet[2831]: E1212 17:26:50.184122 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb4da7576cb5e1c8fa1888dec99e050fb86779be8dd02eae22dac0c9ca0470a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:26:50.533305 systemd[1]: Created slice kubepods-besteffort-podeda1ca3a_3908_4257_9a19_d316969a4cc3.slice - libcontainer container kubepods-besteffort-podeda1ca3a_3908_4257_9a19_d316969a4cc3.slice.
Dec 12 17:26:50.535443 containerd[1633]: time="2025-12-12T17:26:50.535384597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ppkws,Uid:eda1ca3a-3908-4257-9a19-d316969a4cc3,Namespace:calico-system,Attempt:0,}"
Dec 12 17:26:50.577533 containerd[1633]: time="2025-12-12T17:26:50.577479946Z" level=error msg="Failed to destroy network for sandbox \"8f4dd335c220d7021019d3190bfd4c9a2381c9a839ad6372deaa627f71e385ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.579108 containerd[1633]: time="2025-12-12T17:26:50.579064550Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ppkws,Uid:eda1ca3a-3908-4257-9a19-d316969a4cc3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f4dd335c220d7021019d3190bfd4c9a2381c9a839ad6372deaa627f71e385ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.579343 kubelet[2831]: E1212 17:26:50.579305 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f4dd335c220d7021019d3190bfd4c9a2381c9a839ad6372deaa627f71e385ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:26:50.579476 kubelet[2831]: E1212 17:26:50.579378 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f4dd335c220d7021019d3190bfd4c9a2381c9a839ad6372deaa627f71e385ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:50.579476 kubelet[2831]: E1212 17:26:50.579411 2831 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f4dd335c220d7021019d3190bfd4c9a2381c9a839ad6372deaa627f71e385ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ppkws"
Dec 12 17:26:50.579476 kubelet[2831]: E1212 17:26:50.579456 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8f4dd335c220d7021019d3190bfd4c9a2381c9a839ad6372deaa627f71e385ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:26:50.634384 containerd[1633]: time="2025-12-12T17:26:50.634342374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\""
Dec 12 17:26:51.072586 systemd[1]: run-netns-cni\x2df462430c\x2d51e9\x2db2da\x2d0d38\x2d995b03520974.mount: Deactivated successfully.
Dec 12 17:26:51.072677 systemd[1]: run-netns-cni\x2dc8e2e854\x2da505\x2dd55a\x2d7ad1\x2da4c20e80f463.mount: Deactivated successfully.
Dec 12 17:26:51.072723 systemd[1]: run-netns-cni\x2dd9c6c6d9\x2dd1d2\x2dc11c\x2db6d3\x2d82efacf163de.mount: Deactivated successfully.
Dec 12 17:26:54.128879 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2164251085.mount: Deactivated successfully.
Dec 12 17:26:54.152932 containerd[1633]: time="2025-12-12T17:26:54.152868233Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:26:54.153734 containerd[1633]: time="2025-12-12T17:26:54.153681316Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562"
Dec 12 17:26:54.155271 containerd[1633]: time="2025-12-12T17:26:54.155233760Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:26:54.157780 containerd[1633]: time="2025-12-12T17:26:54.157739006Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:26:54.158442 containerd[1633]: time="2025-12-12T17:26:54.158236567Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 3.523841153s"
Dec 12 17:26:54.158442 containerd[1633]: time="2025-12-12T17:26:54.158264327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference
\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:26:54.167875 containerd[1633]: time="2025-12-12T17:26:54.167832392Z" level=info msg="CreateContainer within sandbox \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:26:54.183778 containerd[1633]: time="2025-12-12T17:26:54.183725354Z" level=info msg="Container b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:54.193984 containerd[1633]: time="2025-12-12T17:26:54.193885940Z" level=info msg="CreateContainer within sandbox \"8437b40597fbe2ad383ecbe095be90ea5492f1332655b9f1415a4804fee8741a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab\"" Dec 12 17:26:54.194694 containerd[1633]: time="2025-12-12T17:26:54.194637582Z" level=info msg="StartContainer for \"b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab\"" Dec 12 17:26:54.196651 containerd[1633]: time="2025-12-12T17:26:54.196601947Z" level=info msg="connecting to shim b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab" address="unix:///run/containerd/s/0b75e650b7bcc9d548ca9b11b734b25add7a2e5474e11e62a18e691f2e421298" protocol=ttrpc version=3 Dec 12 17:26:54.220582 systemd[1]: Started cri-containerd-b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab.scope - libcontainer container b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab. Dec 12 17:26:54.305103 containerd[1633]: time="2025-12-12T17:26:54.304984669Z" level=info msg="StartContainer for \"b5acbd45686b317aa3e216c8fb17b8f623f6bdbbda2825e9c45fa44a0bc08bab\" returns successfully" Dec 12 17:26:54.440546 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:26:54.440671 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Dec 12 17:26:54.654045 kubelet[2831]: I1212 17:26:54.654000 2831 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-ca-bundle\") pod \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\" (UID: \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\") " Dec 12 17:26:54.654045 kubelet[2831]: I1212 17:26:54.654052 2831 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghzzl\" (UniqueName: \"kubernetes.io/projected/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-kube-api-access-ghzzl\") pod \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\" (UID: \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\") " Dec 12 17:26:54.654389 kubelet[2831]: I1212 17:26:54.654078 2831 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-backend-key-pair\") pod \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\" (UID: \"430b5a2c-3706-4f2a-ac26-aaf032aa29a7\") " Dec 12 17:26:54.655792 kubelet[2831]: I1212 17:26:54.654665 2831 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "430b5a2c-3706-4f2a-ac26-aaf032aa29a7" (UID: "430b5a2c-3706-4f2a-ac26-aaf032aa29a7"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:26:54.657531 kubelet[2831]: I1212 17:26:54.657492 2831 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "430b5a2c-3706-4f2a-ac26-aaf032aa29a7" (UID: "430b5a2c-3706-4f2a-ac26-aaf032aa29a7"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:26:54.657898 kubelet[2831]: I1212 17:26:54.657835 2831 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-kube-api-access-ghzzl" (OuterVolumeSpecName: "kube-api-access-ghzzl") pod "430b5a2c-3706-4f2a-ac26-aaf032aa29a7" (UID: "430b5a2c-3706-4f2a-ac26-aaf032aa29a7"). InnerVolumeSpecName "kube-api-access-ghzzl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:26:54.755180 kubelet[2831]: I1212 17:26:54.754847 2831 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghzzl\" (UniqueName: \"kubernetes.io/projected/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-kube-api-access-ghzzl\") on node \"ci-4459-2-2-2-0ba9591bbe\" DevicePath \"\"" Dec 12 17:26:54.755180 kubelet[2831]: I1212 17:26:54.754886 2831 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-backend-key-pair\") on node \"ci-4459-2-2-2-0ba9591bbe\" DevicePath \"\"" Dec 12 17:26:54.755180 kubelet[2831]: I1212 17:26:54.754899 2831 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/430b5a2c-3706-4f2a-ac26-aaf032aa29a7-whisker-ca-bundle\") on node \"ci-4459-2-2-2-0ba9591bbe\" DevicePath \"\"" Dec 12 17:26:54.948597 systemd[1]: Removed slice kubepods-besteffort-pod430b5a2c_3706_4f2a_ac26_aaf032aa29a7.slice - libcontainer container kubepods-besteffort-pod430b5a2c_3706_4f2a_ac26_aaf032aa29a7.slice. 
Dec 12 17:26:54.960029 kubelet[2831]: I1212 17:26:54.959950 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vndvx" podStartSLOduration=1.6326259859999999 podStartE2EDuration="12.95993005s" podCreationTimestamp="2025-12-12 17:26:42 +0000 UTC" firstStartedPulling="2025-12-12 17:26:42.831852946 +0000 UTC m=+23.407670162" lastFinishedPulling="2025-12-12 17:26:54.15915697 +0000 UTC m=+34.734974226" observedRunningTime="2025-12-12 17:26:54.661991436 +0000 UTC m=+35.237808732" watchObservedRunningTime="2025-12-12 17:26:54.95993005 +0000 UTC m=+35.535747306" Dec 12 17:26:54.998845 systemd[1]: Created slice kubepods-besteffort-podba26b6aa_6293_4a63_974b_db5efe9dc1a9.slice - libcontainer container kubepods-besteffort-podba26b6aa_6293_4a63_974b_db5efe9dc1a9.slice. Dec 12 17:26:55.057226 kubelet[2831]: I1212 17:26:55.056661 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7bj6\" (UniqueName: \"kubernetes.io/projected/ba26b6aa-6293-4a63-974b-db5efe9dc1a9-kube-api-access-p7bj6\") pod \"whisker-54c44c7f8d-jf2mj\" (UID: \"ba26b6aa-6293-4a63-974b-db5efe9dc1a9\") " pod="calico-system/whisker-54c44c7f8d-jf2mj" Dec 12 17:26:55.057226 kubelet[2831]: I1212 17:26:55.056714 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ba26b6aa-6293-4a63-974b-db5efe9dc1a9-whisker-backend-key-pair\") pod \"whisker-54c44c7f8d-jf2mj\" (UID: \"ba26b6aa-6293-4a63-974b-db5efe9dc1a9\") " pod="calico-system/whisker-54c44c7f8d-jf2mj" Dec 12 17:26:55.057226 kubelet[2831]: I1212 17:26:55.056731 2831 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba26b6aa-6293-4a63-974b-db5efe9dc1a9-whisker-ca-bundle\") pod \"whisker-54c44c7f8d-jf2mj\" (UID: 
\"ba26b6aa-6293-4a63-974b-db5efe9dc1a9\") " pod="calico-system/whisker-54c44c7f8d-jf2mj" Dec 12 17:26:55.129845 systemd[1]: var-lib-kubelet-pods-430b5a2c\x2d3706\x2d4f2a\x2dac26\x2daaf032aa29a7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dghzzl.mount: Deactivated successfully. Dec 12 17:26:55.129928 systemd[1]: var-lib-kubelet-pods-430b5a2c\x2d3706\x2d4f2a\x2dac26\x2daaf032aa29a7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 17:26:55.301954 containerd[1633]: time="2025-12-12T17:26:55.301904818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54c44c7f8d-jf2mj,Uid:ba26b6aa-6293-4a63-974b-db5efe9dc1a9,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:55.429197 systemd-networkd[1510]: cali0d9b242fbda: Link UP Dec 12 17:26:55.429881 systemd-networkd[1510]: cali0d9b242fbda: Gained carrier Dec 12 17:26:55.442905 containerd[1633]: 2025-12-12 17:26:55.323 [INFO][4017] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:26:55.442905 containerd[1633]: 2025-12-12 17:26:55.341 [INFO][4017] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0 whisker-54c44c7f8d- calico-system ba26b6aa-6293-4a63-974b-db5efe9dc1a9 853 0 2025-12-12 17:26:54 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:54c44c7f8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe whisker-54c44c7f8d-jf2mj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali0d9b242fbda [] [] }} ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-" Dec 12 17:26:55.442905 
containerd[1633]: 2025-12-12 17:26:55.341 [INFO][4017] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.442905 containerd[1633]: 2025-12-12 17:26:55.387 [INFO][4032] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" HandleID="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.387 [INFO][4032] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" HandleID="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2290), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"whisker-54c44c7f8d-jf2mj", "timestamp":"2025-12-12 17:26:55.38746368 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.387 [INFO][4032] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.387 [INFO][4032] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.387 [INFO][4032] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.397 [INFO][4032] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.402 [INFO][4032] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.406 [INFO][4032] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.408 [INFO][4032] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443142 containerd[1633]: 2025-12-12 17:26:55.410 [INFO][4032] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.410 [INFO][4032] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.411 [INFO][4032] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.415 [INFO][4032] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.420 [INFO][4032] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.1/26] block=192.168.96.0/26 handle="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.420 [INFO][4032] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.1/26] handle="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.420 [INFO][4032] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:55.443329 containerd[1633]: 2025-12-12 17:26:55.420 [INFO][4032] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.1/26] IPv6=[] ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" HandleID="k8s-pod-network.a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.443482 containerd[1633]: 2025-12-12 17:26:55.422 [INFO][4017] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0", GenerateName:"whisker-54c44c7f8d-", Namespace:"calico-system", SelfLink:"", UID:"ba26b6aa-6293-4a63-974b-db5efe9dc1a9", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54c44c7f8d", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"whisker-54c44c7f8d-jf2mj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.96.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali0d9b242fbda", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:55.443482 containerd[1633]: 2025-12-12 17:26:55.422 [INFO][4017] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.1/32] ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.443577 containerd[1633]: 2025-12-12 17:26:55.422 [INFO][4017] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0d9b242fbda ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.443577 containerd[1633]: 2025-12-12 17:26:55.430 [INFO][4017] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.443622 containerd[1633]: 2025-12-12 17:26:55.430 [INFO][4017] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0", GenerateName:"whisker-54c44c7f8d-", Namespace:"calico-system", SelfLink:"", UID:"ba26b6aa-6293-4a63-974b-db5efe9dc1a9", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54c44c7f8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d", Pod:"whisker-54c44c7f8d-jf2mj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.96.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali0d9b242fbda", MAC:"12:14:f4:9f:f2:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:55.443668 containerd[1633]: 2025-12-12 17:26:55.441 [INFO][4017] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" Namespace="calico-system" Pod="whisker-54c44c7f8d-jf2mj" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-whisker--54c44c7f8d--jf2mj-eth0" Dec 12 17:26:55.462663 containerd[1633]: time="2025-12-12T17:26:55.462619716Z" level=info msg="connecting to shim a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d" address="unix:///run/containerd/s/4bd60b0e3f53c4df199f33f56978d133b917f81d04180d171d75dc677c645123" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:55.491634 systemd[1]: Started cri-containerd-a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d.scope - libcontainer container a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d. Dec 12 17:26:55.523453 containerd[1633]: time="2025-12-12T17:26:55.523411594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54c44c7f8d-jf2mj,Uid:ba26b6aa-6293-4a63-974b-db5efe9dc1a9,Namespace:calico-system,Attempt:0,} returns sandbox id \"a8173390a64ad32d9ab704492e34d1e30f5e0c00c255726b99a7cd724a807d3d\"" Dec 12 17:26:55.525247 containerd[1633]: time="2025-12-12T17:26:55.525216278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:26:55.531352 kubelet[2831]: I1212 17:26:55.530934 2831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430b5a2c-3706-4f2a-ac26-aaf032aa29a7" path="/var/lib/kubelet/pods/430b5a2c-3706-4f2a-ac26-aaf032aa29a7/volumes" Dec 12 17:26:55.865913 containerd[1633]: time="2025-12-12T17:26:55.865792723Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:55.867612 containerd[1633]: time="2025-12-12T17:26:55.867556968Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:26:55.867704 containerd[1633]: time="2025-12-12T17:26:55.867577968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:26:55.868645 kubelet[2831]: E1212 17:26:55.868601 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:55.868960 kubelet[2831]: E1212 17:26:55.868658 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:55.868991 kubelet[2831]: E1212 17:26:55.868874 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:ed04d3e7c1884f59a096e0abc2a1d532,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:55.870950 containerd[1633]: time="2025-12-12T17:26:55.870897616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 
17:26:56.233020 containerd[1633]: time="2025-12-12T17:26:56.232941757Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:56.234360 containerd[1633]: time="2025-12-12T17:26:56.234291760Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:26:56.234360 containerd[1633]: time="2025-12-12T17:26:56.234327200Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:56.234594 kubelet[2831]: E1212 17:26:56.234531 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:56.234594 kubelet[2831]: E1212 17:26:56.234590 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:56.234750 kubelet[2831]: E1212 17:26:56.234701 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:56.235944 kubelet[2831]: E1212 17:26:56.235902 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:26:56.651340 kubelet[2831]: E1212 17:26:56.650914 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:26:56.656570 systemd-networkd[1510]: cali0d9b242fbda: Gained IPv6LL Dec 12 17:27:00.916971 kubelet[2831]: I1212 17:27:00.916067 2831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:27:01.367934 systemd-networkd[1510]: vxlan.calico: Link UP Dec 12 17:27:01.367942 systemd-networkd[1510]: vxlan.calico: Gained carrier Dec 12 17:27:01.528782 containerd[1633]: time="2025-12-12T17:27:01.528442633Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-hggpv,Uid:88235c34-52d7-4e16-9ac0-d4ea6115b35c,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:27:01.674602 systemd-networkd[1510]: calib9306b23cdc: Link UP Dec 12 17:27:01.675741 systemd-networkd[1510]: calib9306b23cdc: Gained carrier Dec 12 17:27:01.691910 containerd[1633]: 2025-12-12 17:27:01.576 [INFO][4439] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0 calico-apiserver-84ff4454f8- calico-apiserver 88235c34-52d7-4e16-9ac0-d4ea6115b35c 794 0 2025-12-12 17:26:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84ff4454f8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe calico-apiserver-84ff4454f8-hggpv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib9306b23cdc [] [] }} ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-" Dec 12 17:27:01.691910 
containerd[1633]: 2025-12-12 17:27:01.576 [INFO][4439] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.691910 containerd[1633]: 2025-12-12 17:27:01.616 [INFO][4465] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" HandleID="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.616 [INFO][4465] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" HandleID="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400050eb80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"calico-apiserver-84ff4454f8-hggpv", "timestamp":"2025-12-12 17:27:01.6160771 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.616 [INFO][4465] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.616 [INFO][4465] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.616 [INFO][4465] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.628 [INFO][4465] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.635 [INFO][4465] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.643 [INFO][4465] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.647 [INFO][4465] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692166 containerd[1633]: 2025-12-12 17:27:01.650 [INFO][4465] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.650 [INFO][4465] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.653 [INFO][4465] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82 Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.660 [INFO][4465] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.668 [INFO][4465] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.2/26] block=192.168.96.0/26 handle="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.668 [INFO][4465] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.2/26] handle="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.668 [INFO][4465] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:01.692595 containerd[1633]: 2025-12-12 17:27:01.668 [INFO][4465] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.2/26] IPv6=[] ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" HandleID="k8s-pod-network.acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.692812 containerd[1633]: 2025-12-12 17:27:01.670 [INFO][4439] cni-plugin/k8s.go 418: Populated endpoint ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0", GenerateName:"calico-apiserver-84ff4454f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"88235c34-52d7-4e16-9ac0-d4ea6115b35c", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"84ff4454f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"calico-apiserver-84ff4454f8-hggpv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib9306b23cdc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:01.692900 containerd[1633]: 2025-12-12 17:27:01.671 [INFO][4439] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.2/32] ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.692900 containerd[1633]: 2025-12-12 17:27:01.671 [INFO][4439] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib9306b23cdc ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.692900 containerd[1633]: 2025-12-12 17:27:01.676 [INFO][4439] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" 
WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.693031 containerd[1633]: 2025-12-12 17:27:01.677 [INFO][4439] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0", GenerateName:"calico-apiserver-84ff4454f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"88235c34-52d7-4e16-9ac0-d4ea6115b35c", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84ff4454f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82", Pod:"calico-apiserver-84ff4454f8-hggpv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib9306b23cdc", MAC:"26:d5:67:f2:a1:f5", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:01.693103 containerd[1633]: 2025-12-12 17:27:01.689 [INFO][4439] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-hggpv" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--hggpv-eth0" Dec 12 17:27:01.724603 containerd[1633]: time="2025-12-12T17:27:01.724560062Z" level=info msg="connecting to shim acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82" address="unix:///run/containerd/s/32c0af6963f0b59fcfbd0c557e6c2ed04a75b42ae566a17e3ab58ec2d3cb9938" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:01.748567 systemd[1]: Started cri-containerd-acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82.scope - libcontainer container acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82. 
Dec 12 17:27:01.779099 containerd[1633]: time="2025-12-12T17:27:01.778936083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-hggpv,Uid:88235c34-52d7-4e16-9ac0-d4ea6115b35c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"acc0d07aefc5519b464fc81c3a25324ce67bf99cca9a9808cb36c33f1bf81a82\"" Dec 12 17:27:01.781311 containerd[1633]: time="2025-12-12T17:27:01.781141849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:02.114081 containerd[1633]: time="2025-12-12T17:27:02.113323952Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:02.115038 containerd[1633]: time="2025-12-12T17:27:02.114935316Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:02.115038 containerd[1633]: time="2025-12-12T17:27:02.114982276Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:02.115227 kubelet[2831]: E1212 17:27:02.115171 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:02.115227 kubelet[2831]: E1212 17:27:02.115223 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:02.115546 kubelet[2831]: E1212 17:27:02.115369 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrgd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:02.116798 kubelet[2831]: E1212 17:27:02.116756 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:02.528374 containerd[1633]: time="2025-12-12T17:27:02.528313190Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-697549bf97-ftlsb,Uid:bf0d9458-9e9d-4941-95ae-5b084eb50f31,Namespace:calico-system,Attempt:0,}" Dec 12 17:27:02.626152 systemd-networkd[1510]: cali82344858bff: Link UP Dec 12 17:27:02.627096 systemd-networkd[1510]: cali82344858bff: Gained carrier Dec 12 17:27:02.643075 containerd[1633]: 2025-12-12 17:27:02.565 [INFO][4549] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0 calico-kube-controllers-697549bf97- calico-system bf0d9458-9e9d-4941-95ae-5b084eb50f31 786 0 2025-12-12 17:26:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:697549bf97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe calico-kube-controllers-697549bf97-ftlsb eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali82344858bff [] [] }} ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-" Dec 12 17:27:02.643075 containerd[1633]: 2025-12-12 17:27:02.565 [INFO][4549] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.643075 containerd[1633]: 2025-12-12 17:27:02.586 [INFO][4563] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" HandleID="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.586 [INFO][4563] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" HandleID="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004245f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"calico-kube-controllers-697549bf97-ftlsb", "timestamp":"2025-12-12 17:27:02.586373901 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.586 [INFO][4563] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.586 [INFO][4563] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.586 [INFO][4563] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.596 [INFO][4563] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.600 [INFO][4563] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.604 [INFO][4563] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.606 [INFO][4563] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.643901 containerd[1633]: 2025-12-12 17:27:02.609 [INFO][4563] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.609 [INFO][4563] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.610 [INFO][4563] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012 Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.614 [INFO][4563] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.620 [INFO][4563] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.3/26] block=192.168.96.0/26 handle="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.620 [INFO][4563] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.3/26] handle="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.620 [INFO][4563] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:02.644137 containerd[1633]: 2025-12-12 17:27:02.620 [INFO][4563] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.3/26] IPv6=[] ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" HandleID="k8s-pod-network.91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.644277 containerd[1633]: 2025-12-12 17:27:02.622 [INFO][4549] cni-plugin/k8s.go 418: Populated endpoint ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0", GenerateName:"calico-kube-controllers-697549bf97-", Namespace:"calico-system", SelfLink:"", UID:"bf0d9458-9e9d-4941-95ae-5b084eb50f31", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697549bf97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"calico-kube-controllers-697549bf97-ftlsb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82344858bff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:02.644334 containerd[1633]: 2025-12-12 17:27:02.623 [INFO][4549] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.3/32] ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.644334 containerd[1633]: 2025-12-12 17:27:02.623 [INFO][4549] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali82344858bff ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.644334 containerd[1633]: 2025-12-12 17:27:02.626 [INFO][4549] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.644417 containerd[1633]: 2025-12-12 17:27:02.629 [INFO][4549] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0", GenerateName:"calico-kube-controllers-697549bf97-", Namespace:"calico-system", SelfLink:"", UID:"bf0d9458-9e9d-4941-95ae-5b084eb50f31", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697549bf97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012", Pod:"calico-kube-controllers-697549bf97-ftlsb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.3/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82344858bff", MAC:"5a:05:4d:32:6e:51", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:02.644470 containerd[1633]: 2025-12-12 17:27:02.640 [INFO][4549] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" Namespace="calico-system" Pod="calico-kube-controllers-697549bf97-ftlsb" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--kube--controllers--697549bf97--ftlsb-eth0" Dec 12 17:27:02.661101 kubelet[2831]: E1212 17:27:02.661053 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:02.678224 containerd[1633]: time="2025-12-12T17:27:02.678155099Z" level=info msg="connecting to shim 91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012" address="unix:///run/containerd/s/5243a418066fed986b00208f17a8f4e7fad8cbcfc2cfd84ab8c4de0e6ab7ac4e" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:02.702602 systemd[1]: Started cri-containerd-91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012.scope - libcontainer container 91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012. 
Dec 12 17:27:02.739098 containerd[1633]: time="2025-12-12T17:27:02.739058177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697549bf97-ftlsb,Uid:bf0d9458-9e9d-4941-95ae-5b084eb50f31,Namespace:calico-system,Attempt:0,} returns sandbox id \"91d786faa393a2cbaaefde763824da694fd51a269aeb819753babf858c8ae012\"" Dec 12 17:27:02.741064 containerd[1633]: time="2025-12-12T17:27:02.741028022Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:27:03.062471 containerd[1633]: time="2025-12-12T17:27:03.062328697Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:03.063709 containerd[1633]: time="2025-12-12T17:27:03.063672100Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:27:03.063799 containerd[1633]: time="2025-12-12T17:27:03.063749861Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:03.063987 kubelet[2831]: E1212 17:27:03.063928 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:03.063987 kubelet[2831]: E1212 17:27:03.063982 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:03.064210 kubelet[2831]: E1212 17:27:03.064112 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfksl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:03.065342 kubelet[2831]: E1212 17:27:03.065295 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:03.376577 systemd-networkd[1510]: vxlan.calico: Gained IPv6LL Dec 12 17:27:03.504531 systemd-networkd[1510]: calib9306b23cdc: Gained IPv6LL Dec 12 17:27:03.528880 
containerd[1633]: time="2025-12-12T17:27:03.528840429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ppkws,Uid:eda1ca3a-3908-4257-9a19-d316969a4cc3,Namespace:calico-system,Attempt:0,}" Dec 12 17:27:03.624438 systemd-networkd[1510]: cali4524e64cd84: Link UP Dec 12 17:27:03.625295 systemd-networkd[1510]: cali4524e64cd84: Gained carrier Dec 12 17:27:03.640078 containerd[1633]: 2025-12-12 17:27:03.563 [INFO][4634] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0 csi-node-driver- calico-system eda1ca3a-3908-4257-9a19-d316969a4cc3 700 0 2025-12-12 17:26:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe csi-node-driver-ppkws eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4524e64cd84 [] [] }} ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-" Dec 12 17:27:03.640078 containerd[1633]: 2025-12-12 17:27:03.563 [INFO][4634] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.640078 containerd[1633]: 2025-12-12 17:27:03.585 [INFO][4649] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" 
HandleID="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.585 [INFO][4649] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" HandleID="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137650), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"csi-node-driver-ppkws", "timestamp":"2025-12-12 17:27:03.585106895 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.585 [INFO][4649] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.585 [INFO][4649] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.585 [INFO][4649] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.595 [INFO][4649] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.599 [INFO][4649] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.604 [INFO][4649] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.606 [INFO][4649] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640274 containerd[1633]: 2025-12-12 17:27:03.608 [INFO][4649] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.608 [INFO][4649] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.610 [INFO][4649] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3 Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.614 [INFO][4649] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.620 [INFO][4649] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.4/26] block=192.168.96.0/26 handle="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.620 [INFO][4649] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.4/26] handle="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.620 [INFO][4649] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:03.640524 containerd[1633]: 2025-12-12 17:27:03.620 [INFO][4649] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.4/26] IPv6=[] ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" HandleID="k8s-pod-network.5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.640702 containerd[1633]: 2025-12-12 17:27:03.622 [INFO][4634] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eda1ca3a-3908-4257-9a19-d316969a4cc3", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"csi-node-driver-ppkws", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.96.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4524e64cd84", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:03.640777 containerd[1633]: 2025-12-12 17:27:03.622 [INFO][4634] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.4/32] ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.640777 containerd[1633]: 2025-12-12 17:27:03.622 [INFO][4634] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4524e64cd84 ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.640777 containerd[1633]: 2025-12-12 17:27:03.625 [INFO][4634] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.640859 containerd[1633]: 2025-12-12 
17:27:03.626 [INFO][4634] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eda1ca3a-3908-4257-9a19-d316969a4cc3", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3", Pod:"csi-node-driver-ppkws", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.96.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4524e64cd84", MAC:"2e:d6:27:15:53:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:03.640930 containerd[1633]: 2025-12-12 17:27:03.635 
[INFO][4634] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" Namespace="calico-system" Pod="csi-node-driver-ppkws" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-csi--node--driver--ppkws-eth0" Dec 12 17:27:03.666864 kubelet[2831]: E1212 17:27:03.666779 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:03.667541 containerd[1633]: time="2025-12-12T17:27:03.667303868Z" level=info msg="connecting to shim 5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3" address="unix:///run/containerd/s/e7333896fb382eac1df2c8b2489e739a2198cc6a2e2ed92f3ad4ebe39a7aea95" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:03.668350 kubelet[2831]: E1212 17:27:03.667833 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:03.705763 systemd[1]: Started 
cri-containerd-5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3.scope - libcontainer container 5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3. Dec 12 17:27:03.731792 containerd[1633]: time="2025-12-12T17:27:03.731716156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ppkws,Uid:eda1ca3a-3908-4257-9a19-d316969a4cc3,Namespace:calico-system,Attempt:0,} returns sandbox id \"5b62ecdca5d7152baab5f08723ef2d5932b34aa235f8c67e9f066bb512c85cb3\"" Dec 12 17:27:03.734028 containerd[1633]: time="2025-12-12T17:27:03.733999762Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:27:04.065936 containerd[1633]: time="2025-12-12T17:27:04.065712543Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:04.067541 containerd[1633]: time="2025-12-12T17:27:04.067487308Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:27:04.067603 containerd[1633]: time="2025-12-12T17:27:04.067561748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:27:04.067767 kubelet[2831]: E1212 17:27:04.067720 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:04.067837 kubelet[2831]: E1212 17:27:04.067781 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:04.067960 kubelet[2831]: E1212 17:27:04.067922 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupPr
obe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:04.069759 containerd[1633]: time="2025-12-12T17:27:04.069718274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:27:04.272729 systemd-networkd[1510]: cali82344858bff: Gained IPv6LL Dec 12 17:27:04.419288 containerd[1633]: time="2025-12-12T17:27:04.419233302Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:04.420496 containerd[1633]: time="2025-12-12T17:27:04.420459105Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:27:04.420613 containerd[1633]: time="2025-12-12T17:27:04.420494505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:27:04.420789 kubelet[2831]: E1212 17:27:04.420741 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:04.420861 kubelet[2831]: E1212 17:27:04.420803 2831 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:04.421017 kubelet[2831]: E1212 17:27:04.420916 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPri
vilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:04.422146 kubelet[2831]: E1212 17:27:04.422102 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:04.528653 containerd[1633]: time="2025-12-12T17:27:04.528611426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ctm45,Uid:f3a08696-1c81-4fd3-9d6b-5185ac2ed59c,Namespace:kube-system,Attempt:0,}" Dec 12 17:27:04.528811 containerd[1633]: time="2025-12-12T17:27:04.528777306Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-spt5h,Uid:ea5b6410-884a-4c0c-b81e-dd7563ad6122,Namespace:kube-system,Attempt:0,}" Dec 12 17:27:04.528905 containerd[1633]: time="2025-12-12T17:27:04.528876467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v498f,Uid:ef5fc253-58d4-49dc-8ba3-6150ce1f97bc,Namespace:calico-system,Attempt:0,}" Dec 12 17:27:04.651377 systemd-networkd[1510]: calid20c32a4626: Link UP Dec 12 17:27:04.652049 systemd-networkd[1510]: calid20c32a4626: Gained carrier Dec 12 17:27:04.669755 containerd[1633]: 2025-12-12 17:27:04.575 [INFO][4713] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0 coredns-668d6bf9bc- kube-system f3a08696-1c81-4fd3-9d6b-5185ac2ed59c 782 0 2025-12-12 17:26:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe coredns-668d6bf9bc-ctm45 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid20c32a4626 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-" Dec 12 17:27:04.669755 containerd[1633]: 2025-12-12 17:27:04.575 [INFO][4713] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.669755 containerd[1633]: 2025-12-12 17:27:04.608 [INFO][4759] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" HandleID="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.608 [INFO][4759] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" HandleID="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3140), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"coredns-668d6bf9bc-ctm45", "timestamp":"2025-12-12 17:27:04.608469753 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.608 [INFO][4759] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.608 [INFO][4759] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.608 [INFO][4759] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.619 [INFO][4759] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.623 [INFO][4759] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.629 [INFO][4759] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.631 [INFO][4759] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.670164 containerd[1633]: 2025-12-12 17:27:04.633 [INFO][4759] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.633 [INFO][4759] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.635 [INFO][4759] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.639 [INFO][4759] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.646 [INFO][4759] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.5/26] block=192.168.96.0/26 handle="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.646 [INFO][4759] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.5/26] handle="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.646 [INFO][4759] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:04.671230 containerd[1633]: 2025-12-12 17:27:04.646 [INFO][4759] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.5/26] IPv6=[] ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" HandleID="k8s-pod-network.1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.671539 kubelet[2831]: E1212 17:27:04.671469 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:04.672110 containerd[1633]: 2025-12-12 17:27:04.649 [INFO][4713] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" 
WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f3a08696-1c81-4fd3-9d6b-5185ac2ed59c", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"coredns-668d6bf9bc-ctm45", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid20c32a4626", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:04.672110 containerd[1633]: 2025-12-12 17:27:04.649 [INFO][4713] cni-plugin/k8s.go 419: Calico CNI 
using IPs: [192.168.96.5/32] ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.672110 containerd[1633]: 2025-12-12 17:27:04.649 [INFO][4713] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid20c32a4626 ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.672110 containerd[1633]: 2025-12-12 17:27:04.652 [INFO][4713] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.672110 containerd[1633]: 2025-12-12 17:27:04.655 [INFO][4713] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f3a08696-1c81-4fd3-9d6b-5185ac2ed59c", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d", Pod:"coredns-668d6bf9bc-ctm45", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid20c32a4626", MAC:"4e:93:67:7e:e3:b1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:04.672110 containerd[1633]: 2025-12-12 17:27:04.666 [INFO][4713] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" Namespace="kube-system" Pod="coredns-668d6bf9bc-ctm45" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--ctm45-eth0" Dec 12 17:27:04.673463 kubelet[2831]: E1212 17:27:04.673352 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:04.707371 containerd[1633]: time="2025-12-12T17:27:04.707322530Z" level=info msg="connecting to shim 1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d" address="unix:///run/containerd/s/1c7745fedd6a49ecb6a9dc26683a374ee65b62743cc5ad0d436ca85fd910691c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:04.736613 systemd[1]: Started cri-containerd-1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d.scope - libcontainer container 1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d. 
Dec 12 17:27:04.764587 systemd-networkd[1510]: calia2b87f07901: Link UP Dec 12 17:27:04.768586 systemd-networkd[1510]: calia2b87f07901: Gained carrier Dec 12 17:27:04.781154 containerd[1633]: time="2025-12-12T17:27:04.781039522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ctm45,Uid:f3a08696-1c81-4fd3-9d6b-5185ac2ed59c,Namespace:kube-system,Attempt:0,} returns sandbox id \"1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d\"" Dec 12 17:27:04.786829 containerd[1633]: time="2025-12-12T17:27:04.786784216Z" level=info msg="CreateContainer within sandbox \"1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.584 [INFO][4732] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0 goldmane-666569f655- calico-system ef5fc253-58d4-49dc-8ba3-6150ce1f97bc 792 0 2025-12-12 17:26:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe goldmane-666569f655-v498f eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia2b87f07901 [] [] }} ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.584 [INFO][4732] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" 
WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.610 [INFO][4767] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" HandleID="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.610 [INFO][4767] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" HandleID="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c760), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"goldmane-666569f655-v498f", "timestamp":"2025-12-12 17:27:04.610114518 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.610 [INFO][4767] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.646 [INFO][4767] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.646 [INFO][4767] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.721 [INFO][4767] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.729 [INFO][4767] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.733 [INFO][4767] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.735 [INFO][4767] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.738 [INFO][4767] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.739 [INFO][4767] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.740 [INFO][4767] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.745 [INFO][4767] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.752 [INFO][4767] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.6/26] block=192.168.96.0/26 handle="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.752 [INFO][4767] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.6/26] handle="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.752 [INFO][4767] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:04.786956 containerd[1633]: 2025-12-12 17:27:04.752 [INFO][4767] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.6/26] IPv6=[] ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" HandleID="k8s-pod-network.c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.787689 containerd[1633]: 2025-12-12 17:27:04.757 [INFO][4732] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"ef5fc253-58d4-49dc-8ba3-6150ce1f97bc", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"goldmane-666569f655-v498f", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.96.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia2b87f07901", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:04.787689 containerd[1633]: 2025-12-12 17:27:04.757 [INFO][4732] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.6/32] ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.787689 containerd[1633]: 2025-12-12 17:27:04.757 [INFO][4732] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia2b87f07901 ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.787689 containerd[1633]: 2025-12-12 17:27:04.770 [INFO][4732] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.787689 containerd[1633]: 2025-12-12 17:27:04.771 [INFO][4732] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"ef5fc253-58d4-49dc-8ba3-6150ce1f97bc", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a", Pod:"goldmane-666569f655-v498f", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.96.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia2b87f07901", MAC:"5a:2d:61:4b:56:f3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:04.787689 containerd[1633]: 2025-12-12 17:27:04.783 [INFO][4732] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" Namespace="calico-system" Pod="goldmane-666569f655-v498f" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-goldmane--666569f655--v498f-eth0" Dec 12 17:27:04.801791 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1694855816.mount: Deactivated successfully. Dec 12 17:27:04.804426 containerd[1633]: time="2025-12-12T17:27:04.804381822Z" level=info msg="Container ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:27:04.816904 containerd[1633]: time="2025-12-12T17:27:04.816860295Z" level=info msg="CreateContainer within sandbox \"1da3f41cacb2bd39404bd0580669bafe95538f5d8a5de0290a75126f11c3825d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670\"" Dec 12 17:27:04.818435 containerd[1633]: time="2025-12-12T17:27:04.817848657Z" level=info msg="StartContainer for \"ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670\"" Dec 12 17:27:04.819426 containerd[1633]: time="2025-12-12T17:27:04.818926620Z" level=info msg="connecting to shim ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670" address="unix:///run/containerd/s/1c7745fedd6a49ecb6a9dc26683a374ee65b62743cc5ad0d436ca85fd910691c" protocol=ttrpc version=3 Dec 12 17:27:04.820933 containerd[1633]: time="2025-12-12T17:27:04.820891065Z" level=info msg="connecting to shim c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a" address="unix:///run/containerd/s/d0516b368ca769c95bbfd58b454a515d8b1b554d32e0b21a23a3d49b29e06e5c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:04.844573 systemd[1]: Started cri-containerd-ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670.scope - libcontainer container ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670. 
Dec 12 17:27:04.847734 systemd[1]: Started cri-containerd-c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a.scope - libcontainer container c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a. Dec 12 17:27:04.864359 systemd-networkd[1510]: cali33bbbde5eb0: Link UP Dec 12 17:27:04.865575 systemd-networkd[1510]: cali33bbbde5eb0: Gained carrier Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.583 [INFO][4727] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0 coredns-668d6bf9bc- kube-system ea5b6410-884a-4c0c-b81e-dd7563ad6122 790 0 2025-12-12 17:26:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe coredns-668d6bf9bc-spt5h eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali33bbbde5eb0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.583 [INFO][4727] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.612 [INFO][4768] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" HandleID="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" 
Workload="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.612 [INFO][4768] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" HandleID="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002b95d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"coredns-668d6bf9bc-spt5h", "timestamp":"2025-12-12 17:27:04.612118083 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.612 [INFO][4768] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.752 [INFO][4768] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.753 [INFO][4768] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.822 [INFO][4768] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.829 [INFO][4768] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.834 [INFO][4768] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.836 [INFO][4768] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.839 [INFO][4768] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.839 [INFO][4768] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.842 [INFO][4768] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1 Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.851 [INFO][4768] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.858 [INFO][4768] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.7/26] block=192.168.96.0/26 handle="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.859 [INFO][4768] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.7/26] handle="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.859 [INFO][4768] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:04.884506 containerd[1633]: 2025-12-12 17:27:04.859 [INFO][4768] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.7/26] IPv6=[] ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" HandleID="k8s-pod-network.29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.885002 containerd[1633]: 2025-12-12 17:27:04.861 [INFO][4727] cni-plugin/k8s.go 418: Populated endpoint ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ea5b6410-884a-4c0c-b81e-dd7563ad6122", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"coredns-668d6bf9bc-spt5h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali33bbbde5eb0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:04.885002 containerd[1633]: 2025-12-12 17:27:04.862 [INFO][4727] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.7/32] ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.885002 containerd[1633]: 2025-12-12 17:27:04.862 [INFO][4727] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33bbbde5eb0 ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.885002 containerd[1633]: 2025-12-12 17:27:04.867 [INFO][4727] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.885002 containerd[1633]: 2025-12-12 17:27:04.867 [INFO][4727] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ea5b6410-884a-4c0c-b81e-dd7563ad6122", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1", Pod:"coredns-668d6bf9bc-spt5h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali33bbbde5eb0", 
MAC:"9a:27:8f:37:47:29", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:04.885002 containerd[1633]: 2025-12-12 17:27:04.878 [INFO][4727] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" Namespace="kube-system" Pod="coredns-668d6bf9bc-spt5h" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-coredns--668d6bf9bc--spt5h-eth0" Dec 12 17:27:04.912959 containerd[1633]: time="2025-12-12T17:27:04.912905304Z" level=info msg="StartContainer for \"ece36632fda6ded133c4b2e780ae7634a974b602d1bb4e0ee05b0511157d7670\" returns successfully" Dec 12 17:27:04.923886 containerd[1633]: time="2025-12-12T17:27:04.923195291Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v498f,Uid:ef5fc253-58d4-49dc-8ba3-6150ce1f97bc,Namespace:calico-system,Attempt:0,} returns sandbox id \"c11e3f986084820e290680e135c0daf7b7c76c260109317a706ce330a3f1544a\"" Dec 12 17:27:04.931115 containerd[1633]: time="2025-12-12T17:27:04.931060031Z" level=info msg="connecting to shim 29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1" address="unix:///run/containerd/s/d0cc643069db0a82115b54e1b58bd20f9cfff77d0bcae0c94acbc5d5a7a2513c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:04.931627 containerd[1633]: time="2025-12-12T17:27:04.931596873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:27:04.955647 systemd[1]: Started 
cri-containerd-29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1.scope - libcontainer container 29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1. Dec 12 17:27:04.997098 containerd[1633]: time="2025-12-12T17:27:04.997058283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-spt5h,Uid:ea5b6410-884a-4c0c-b81e-dd7563ad6122,Namespace:kube-system,Attempt:0,} returns sandbox id \"29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1\"" Dec 12 17:27:04.999487 containerd[1633]: time="2025-12-12T17:27:04.999455049Z" level=info msg="CreateContainer within sandbox \"29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:27:05.009999 containerd[1633]: time="2025-12-12T17:27:05.009944916Z" level=info msg="Container 489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:27:05.017391 containerd[1633]: time="2025-12-12T17:27:05.017352735Z" level=info msg="CreateContainer within sandbox \"29dd04a708052bf3ac31727a34b369e22cb7429bb6f73ea628607dc1412127a1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39\"" Dec 12 17:27:05.018056 containerd[1633]: time="2025-12-12T17:27:05.018029257Z" level=info msg="StartContainer for \"489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39\"" Dec 12 17:27:05.019178 containerd[1633]: time="2025-12-12T17:27:05.019153780Z" level=info msg="connecting to shim 489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39" address="unix:///run/containerd/s/d0cc643069db0a82115b54e1b58bd20f9cfff77d0bcae0c94acbc5d5a7a2513c" protocol=ttrpc version=3 Dec 12 17:27:05.038621 systemd[1]: Started cri-containerd-489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39.scope - libcontainer container 
489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39. Dec 12 17:27:05.065165 containerd[1633]: time="2025-12-12T17:27:05.065114259Z" level=info msg="StartContainer for \"489622e163d04c9d9cd063c138a2d091f70a86416ae715da864583386ccddd39\" returns successfully" Dec 12 17:27:05.168562 systemd-networkd[1510]: cali4524e64cd84: Gained IPv6LL Dec 12 17:27:05.280784 containerd[1633]: time="2025-12-12T17:27:05.280601099Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:05.284287 containerd[1633]: time="2025-12-12T17:27:05.284213949Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:27:05.284417 containerd[1633]: time="2025-12-12T17:27:05.284305669Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:05.284676 kubelet[2831]: E1212 17:27:05.284615 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:05.284676 kubelet[2831]: E1212 17:27:05.284665 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:05.284831 kubelet[2831]: E1212 17:27:05.284786 2831 kuberuntime_manager.go:1341] "Unhandled 
Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhkdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:05.286329 kubelet[2831]: E1212 17:27:05.285949 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:27:05.528855 containerd[1633]: time="2025-12-12T17:27:05.528812984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-4zd4q,Uid:31cf16b5-737c-43fc-8a73-dab266575bf7,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:27:05.625493 
systemd-networkd[1510]: cali4e36d543c8e: Link UP Dec 12 17:27:05.626703 systemd-networkd[1510]: cali4e36d543c8e: Gained carrier Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.563 [INFO][5028] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0 calico-apiserver-84ff4454f8- calico-apiserver 31cf16b5-737c-43fc-8a73-dab266575bf7 791 0 2025-12-12 17:26:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84ff4454f8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-2-0ba9591bbe calico-apiserver-84ff4454f8-4zd4q eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4e36d543c8e [] [] }} ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.563 [INFO][5028] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.584 [INFO][5041] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" HandleID="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.642515 
containerd[1633]: 2025-12-12 17:27:05.584 [INFO][5041] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" HandleID="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c790), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-2-0ba9591bbe", "pod":"calico-apiserver-84ff4454f8-4zd4q", "timestamp":"2025-12-12 17:27:05.584089208 +0000 UTC"}, Hostname:"ci-4459-2-2-2-0ba9591bbe", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.584 [INFO][5041] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.584 [INFO][5041] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.584 [INFO][5041] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-2-0ba9591bbe' Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.594 [INFO][5041] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.598 [INFO][5041] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.603 [INFO][5041] ipam/ipam.go 511: Trying affinity for 192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.605 [INFO][5041] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.608 [INFO][5041] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.0/26 host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.608 [INFO][5041] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.0/26 handle="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.609 [INFO][5041] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.614 [INFO][5041] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.0/26 handle="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.620 [INFO][5041] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.96.8/26] block=192.168.96.0/26 handle="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.620 [INFO][5041] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.8/26] handle="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" host="ci-4459-2-2-2-0ba9591bbe" Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.620 [INFO][5041] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:27:05.642515 containerd[1633]: 2025-12-12 17:27:05.620 [INFO][5041] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.8/26] IPv6=[] ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" HandleID="k8s-pod-network.165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Workload="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.643046 containerd[1633]: 2025-12-12 17:27:05.623 [INFO][5028] cni-plugin/k8s.go 418: Populated endpoint ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0", GenerateName:"calico-apiserver-84ff4454f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"31cf16b5-737c-43fc-8a73-dab266575bf7", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"84ff4454f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"", Pod:"calico-apiserver-84ff4454f8-4zd4q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4e36d543c8e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:05.643046 containerd[1633]: 2025-12-12 17:27:05.623 [INFO][5028] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.8/32] ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.643046 containerd[1633]: 2025-12-12 17:27:05.623 [INFO][5028] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e36d543c8e ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.643046 containerd[1633]: 2025-12-12 17:27:05.626 [INFO][5028] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" 
WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.643046 containerd[1633]: 2025-12-12 17:27:05.627 [INFO][5028] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0", GenerateName:"calico-apiserver-84ff4454f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"31cf16b5-737c-43fc-8a73-dab266575bf7", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84ff4454f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-2-0ba9591bbe", ContainerID:"165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded", Pod:"calico-apiserver-84ff4454f8-4zd4q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4e36d543c8e", MAC:"06:a1:07:18:c6:48", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:27:05.643046 containerd[1633]: 2025-12-12 17:27:05.639 [INFO][5028] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" Namespace="calico-apiserver" Pod="calico-apiserver-84ff4454f8-4zd4q" WorkloadEndpoint="ci--4459--2--2--2--0ba9591bbe-k8s-calico--apiserver--84ff4454f8--4zd4q-eth0" Dec 12 17:27:05.667976 containerd[1633]: time="2025-12-12T17:27:05.667936305Z" level=info msg="connecting to shim 165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded" address="unix:///run/containerd/s/0c971da08fadfd7a1229cadb3d7b09ead4bcf24f3d040edb5a47b2aa97fd18b9" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:27:05.672996 kubelet[2831]: E1212 17:27:05.672959 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:27:05.681640 kubelet[2831]: E1212 17:27:05.681556 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for 
\"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:05.703598 systemd[1]: Started cri-containerd-165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded.scope - libcontainer container 165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded. Dec 12 17:27:05.720748 kubelet[2831]: I1212 17:27:05.720667 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-ctm45" podStartSLOduration=40.720647322 podStartE2EDuration="40.720647322s" podCreationTimestamp="2025-12-12 17:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:27:05.720358002 +0000 UTC m=+46.296175258" watchObservedRunningTime="2025-12-12 17:27:05.720647322 +0000 UTC m=+46.296464578" Dec 12 17:27:05.738171 kubelet[2831]: I1212 17:27:05.737823 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-spt5h" podStartSLOduration=40.737803887 podStartE2EDuration="40.737803887s" podCreationTimestamp="2025-12-12 17:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:27:05.737306806 +0000 UTC m=+46.313124062" watchObservedRunningTime="2025-12-12 17:27:05.737803887 +0000 UTC m=+46.313621143" Dec 12 17:27:05.764940 containerd[1633]: time="2025-12-12T17:27:05.764898477Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-84ff4454f8-4zd4q,Uid:31cf16b5-737c-43fc-8a73-dab266575bf7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"165db2a33f60eef02952b22a21fd9b0779ea981907c68149f62299126ba02ded\"" Dec 12 17:27:05.766879 containerd[1633]: time="2025-12-12T17:27:05.766821762Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:05.808573 systemd-networkd[1510]: calid20c32a4626: Gained IPv6LL Dec 12 17:27:06.109789 containerd[1633]: time="2025-12-12T17:27:06.109664373Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:06.111440 containerd[1633]: time="2025-12-12T17:27:06.111371417Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:06.111440 containerd[1633]: time="2025-12-12T17:27:06.111423937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:06.111639 kubelet[2831]: E1212 17:27:06.111587 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:06.111639 kubelet[2831]: E1212 17:27:06.111637 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:06.112467 kubelet[2831]: E1212 17:27:06.111897 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s649l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:06.113740 kubelet[2831]: E1212 17:27:06.113688 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:27:06.684254 kubelet[2831]: E1212 17:27:06.684198 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:27:06.686158 kubelet[2831]: E1212 17:27:06.686117 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:27:06.704606 systemd-networkd[1510]: cali33bbbde5eb0: Gained IPv6LL Dec 12 17:27:06.768587 systemd-networkd[1510]: calia2b87f07901: Gained IPv6LL Dec 12 17:27:07.344652 systemd-networkd[1510]: cali4e36d543c8e: Gained IPv6LL Dec 12 17:27:07.686674 kubelet[2831]: E1212 17:27:07.686631 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:27:08.529034 containerd[1633]: time="2025-12-12T17:27:08.528963737Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:27:08.864295 containerd[1633]: time="2025-12-12T17:27:08.863941487Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:08.865525 containerd[1633]: time="2025-12-12T17:27:08.865454971Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:27:08.865525 containerd[1633]: time="2025-12-12T17:27:08.865500692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:27:08.866022 kubelet[2831]: E1212 17:27:08.865793 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:08.866022 kubelet[2831]: E1212 17:27:08.865837 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:08.866022 kubelet[2831]: E1212 17:27:08.865976 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:ed04d3e7c1884f59a096e0abc2a1d532,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:08.868329 containerd[1633]: time="2025-12-12T17:27:08.868132898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 
17:27:09.209867 containerd[1633]: time="2025-12-12T17:27:09.209748986Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:09.211605 containerd[1633]: time="2025-12-12T17:27:09.211513150Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:27:09.211605 containerd[1633]: time="2025-12-12T17:27:09.211571550Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:09.211832 kubelet[2831]: E1212 17:27:09.211766 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:09.211925 kubelet[2831]: E1212 17:27:09.211835 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:09.212022 kubelet[2831]: E1212 17:27:09.211943 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:09.213400 kubelet[2831]: E1212 17:27:09.213354 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:27:15.529516 containerd[1633]: time="2025-12-12T17:27:15.528952321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:27:15.877749 containerd[1633]: time="2025-12-12T17:27:15.877629506Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:15.880614 containerd[1633]: time="2025-12-12T17:27:15.880518874Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:27:15.880614 containerd[1633]: time="2025-12-12T17:27:15.880578474Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active 
requests=0, bytes read=85" Dec 12 17:27:15.881070 kubelet[2831]: E1212 17:27:15.881026 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:15.883136 kubelet[2831]: E1212 17:27:15.881082 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:15.883136 kubelet[2831]: E1212 17:27:15.881389 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfksl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:15.883136 kubelet[2831]: E1212 17:27:15.882576 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:15.884673 containerd[1633]: time="2025-12-12T17:27:15.882672040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:16.216238 containerd[1633]: 
time="2025-12-12T17:27:16.216121186Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:16.218308 containerd[1633]: time="2025-12-12T17:27:16.218157271Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:16.218308 containerd[1633]: time="2025-12-12T17:27:16.218238351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:16.218607 kubelet[2831]: E1212 17:27:16.218547 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:16.218699 kubelet[2831]: E1212 17:27:16.218628 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:16.218846 kubelet[2831]: E1212 17:27:16.218759 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrgd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:16.220030 kubelet[2831]: E1212 17:27:16.219968 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:16.530281 containerd[1633]: time="2025-12-12T17:27:16.529912481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:27:16.857480 containerd[1633]: time="2025-12-12T17:27:16.857223771Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:16.858601 containerd[1633]: time="2025-12-12T17:27:16.858527135Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:27:16.858601 containerd[1633]: time="2025-12-12T17:27:16.858567775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:27:16.858785 kubelet[2831]: E1212 17:27:16.858730 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:16.858785 kubelet[2831]: E1212 17:27:16.858781 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:16.858940 kubelet[2831]: E1212 17:27:16.858900 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivileg
eEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:16.861059 containerd[1633]: time="2025-12-12T17:27:16.861024701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:27:17.204592 containerd[1633]: time="2025-12-12T17:27:17.204463873Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:17.206071 containerd[1633]: time="2025-12-12T17:27:17.206032237Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:27:17.206128 containerd[1633]: time="2025-12-12T17:27:17.206085757Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:27:17.206292 kubelet[2831]: E1212 17:27:17.206249 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:17.206584 kubelet[2831]: E1212 17:27:17.206304 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:17.206584 kubelet[2831]: E1212 17:27:17.206434 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Terminat
ionMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:17.207788 kubelet[2831]: E1212 17:27:17.207603 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:18.529120 containerd[1633]: time="2025-12-12T17:27:18.528851953Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:27:18.883297 containerd[1633]: time="2025-12-12T17:27:18.883103234Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:18.884687 containerd[1633]: time="2025-12-12T17:27:18.884654998Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:27:18.884780 containerd[1633]: time="2025-12-12T17:27:18.884729078Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:18.884960 kubelet[2831]: E1212 17:27:18.884920 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:18.885456 kubelet[2831]: E1212 17:27:18.885249 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:18.885456 kubelet[2831]: E1212 17:27:18.885390 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhkdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:18.887095 kubelet[2831]: E1212 17:27:18.887049 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:27:20.529204 containerd[1633]: time="2025-12-12T17:27:20.529161390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:20.529734 kubelet[2831]: E1212 17:27:20.529676 2831 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:27:20.866435 containerd[1633]: time="2025-12-12T17:27:20.866119745Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:20.867563 containerd[1633]: time="2025-12-12T17:27:20.867522188Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:20.867631 containerd[1633]: time="2025-12-12T17:27:20.867572269Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:20.867779 kubelet[2831]: E1212 17:27:20.867742 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:20.867847 kubelet[2831]: E1212 17:27:20.867791 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:20.868032 kubelet[2831]: E1212 17:27:20.867924 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s649l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:20.869374 kubelet[2831]: E1212 17:27:20.869330 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:27:27.529138 kubelet[2831]: E1212 17:27:27.529032 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:27.529861 kubelet[2831]: E1212 17:27:27.529560 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:28.529494 kubelet[2831]: E1212 17:27:28.529426 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:33.529288 kubelet[2831]: E1212 17:27:33.529160 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:27:33.529288 kubelet[2831]: E1212 17:27:33.529158 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:27:34.529239 containerd[1633]: time="2025-12-12T17:27:34.528885596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:27:34.860953 containerd[1633]: time="2025-12-12T17:27:34.860622337Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:34.862314 containerd[1633]: time="2025-12-12T17:27:34.862207861Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:27:34.862314 containerd[1633]: time="2025-12-12T17:27:34.862288142Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:27:34.862526 kubelet[2831]: E1212 17:27:34.862445 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:34.862526 kubelet[2831]: E1212 17:27:34.862518 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:34.862924 kubelet[2831]: E1212 17:27:34.862620 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:ed04d3e7c1884f59a096e0abc2a1d532,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:34.864780 containerd[1633]: time="2025-12-12T17:27:34.864751628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 
17:27:35.216437 containerd[1633]: time="2025-12-12T17:27:35.216280261Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:35.218136 containerd[1633]: time="2025-12-12T17:27:35.218063346Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:27:35.218220 containerd[1633]: time="2025-12-12T17:27:35.218080946Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:35.218307 kubelet[2831]: E1212 17:27:35.218265 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:35.218362 kubelet[2831]: E1212 17:27:35.218318 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:35.218504 kubelet[2831]: E1212 17:27:35.218443 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:35.219707 kubelet[2831]: E1212 17:27:35.219654 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:27:40.529410 containerd[1633]: time="2025-12-12T17:27:40.529346662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:40.869993 containerd[1633]: time="2025-12-12T17:27:40.869870307Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:40.872791 containerd[1633]: time="2025-12-12T17:27:40.872735834Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:40.872875 containerd[1633]: time="2025-12-12T17:27:40.872820915Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:40.873023 
kubelet[2831]: E1212 17:27:40.872957 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:40.873665 kubelet[2831]: E1212 17:27:40.873031 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:40.873665 kubelet[2831]: E1212 17:27:40.873196 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrgd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:40.874445 kubelet[2831]: E1212 17:27:40.874369 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:41.529070 containerd[1633]: time="2025-12-12T17:27:41.528862979Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:27:41.846717 containerd[1633]: time="2025-12-12T17:27:41.845924842Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:41.849446 containerd[1633]: time="2025-12-12T17:27:41.848600969Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:27:41.849446 containerd[1633]: time="2025-12-12T17:27:41.848711650Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:27:41.850522 kubelet[2831]: E1212 17:27:41.849688 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:41.850522 kubelet[2831]: E1212 17:27:41.849744 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:41.850522 kubelet[2831]: E1212 17:27:41.849862 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivileg
eEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:41.851926 containerd[1633]: time="2025-12-12T17:27:41.851882498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:27:42.186116 containerd[1633]: time="2025-12-12T17:27:42.186054486Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:42.187573 containerd[1633]: time="2025-12-12T17:27:42.187515090Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:27:42.187690 containerd[1633]: time="2025-12-12T17:27:42.187575490Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:27:42.187788 kubelet[2831]: E1212 17:27:42.187743 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:42.188091 kubelet[2831]: E1212 17:27:42.187798 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:42.188091 kubelet[2831]: E1212 17:27:42.187918 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,Terminat
ionMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:42.189187 kubelet[2831]: E1212 17:27:42.189129 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:42.529600 containerd[1633]: time="2025-12-12T17:27:42.529386098Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:27:42.878392 containerd[1633]: time="2025-12-12T17:27:42.878268364Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:42.879969 containerd[1633]: time="2025-12-12T17:27:42.879808248Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:27:42.879969 containerd[1633]: time="2025-12-12T17:27:42.879930888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:42.880102 kubelet[2831]: E1212 17:27:42.880072 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:42.880142 kubelet[2831]: E1212 17:27:42.880111 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:42.880286 kubelet[2831]: E1212 17:27:42.880229 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfksl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:42.881469 kubelet[2831]: E1212 17:27:42.881434 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:45.531898 containerd[1633]: time="2025-12-12T17:27:45.531741857Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:46.057008 containerd[1633]: 
time="2025-12-12T17:27:46.056963141Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:46.058630 containerd[1633]: time="2025-12-12T17:27:46.058584145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:46.058716 containerd[1633]: time="2025-12-12T17:27:46.058668666Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:46.058865 kubelet[2831]: E1212 17:27:46.058821 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:46.059164 kubelet[2831]: E1212 17:27:46.058878 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:46.059164 kubelet[2831]: E1212 17:27:46.058998 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s649l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:46.060532 kubelet[2831]: E1212 17:27:46.060460 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:27:46.530428 kubelet[2831]: E1212 17:27:46.529511 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:27:47.529614 containerd[1633]: time="2025-12-12T17:27:47.528994205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:27:48.005988 
containerd[1633]: time="2025-12-12T17:27:48.005820204Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:48.009255 containerd[1633]: time="2025-12-12T17:27:48.009198652Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:27:48.009454 containerd[1633]: time="2025-12-12T17:27:48.009231172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:48.009659 kubelet[2831]: E1212 17:27:48.009578 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:48.009659 kubelet[2831]: E1212 17:27:48.009640 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:48.009969 kubelet[2831]: E1212 17:27:48.009775 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhkdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:48.010941 kubelet[2831]: E1212 17:27:48.010897 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:27:52.530320 kubelet[2831]: E1212 17:27:52.530266 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:27:56.529848 kubelet[2831]: E1212 17:27:56.529691 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:27:57.528961 kubelet[2831]: E1212 17:27:57.528840 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:27:57.528961 kubelet[2831]: E1212 17:27:57.528838 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:27:58.529047 kubelet[2831]: E1212 17:27:58.528965 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:28:01.530556 kubelet[2831]: E1212 17:28:01.530174 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:28:05.528851 kubelet[2831]: E1212 17:28:05.528696 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:28:08.531835 kubelet[2831]: E1212 17:28:08.531754 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:28:08.531835 kubelet[2831]: E1212 17:28:08.531798 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:28:11.532910 kubelet[2831]: E1212 17:28:11.532737 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:28:12.529431 kubelet[2831]: E1212 
17:28:12.529364 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:28:15.528837 kubelet[2831]: E1212 17:28:15.528753 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:28:20.528657 kubelet[2831]: E1212 17:28:20.528593 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:28:22.529341 containerd[1633]: time="2025-12-12T17:28:22.529182242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 
12 17:28:23.069907 containerd[1633]: time="2025-12-12T17:28:23.069851887Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:23.071195 containerd[1633]: time="2025-12-12T17:28:23.071112530Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:28:23.071334 containerd[1633]: time="2025-12-12T17:28:23.071160930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:28:23.071372 kubelet[2831]: E1212 17:28:23.071307 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:28:23.071372 kubelet[2831]: E1212 17:28:23.071359 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:28:23.071793 kubelet[2831]: E1212 17:28:23.071555 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:23.071899 containerd[1633]: time="2025-12-12T17:28:23.071710211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:28:23.570624 containerd[1633]: time="2025-12-12T17:28:23.570446227Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:23.572899 containerd[1633]: time="2025-12-12T17:28:23.572799953Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:28:23.572899 containerd[1633]: time="2025-12-12T17:28:23.572859993Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:28:23.573586 kubelet[2831]: E1212 17:28:23.573548 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:28:23.573644 kubelet[2831]: E1212 17:28:23.573598 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:28:23.574101 kubelet[2831]: E1212 17:28:23.573866 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:ed04d3e7c1884f59a096e0abc2a1d532,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:23.574216 containerd[1633]: time="2025-12-12T17:28:23.573884076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 
17:28:24.045425 containerd[1633]: time="2025-12-12T17:28:24.045354741Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:24.049542 containerd[1633]: time="2025-12-12T17:28:24.049471031Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:28:24.049655 containerd[1633]: time="2025-12-12T17:28:24.049576432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:28:24.049835 kubelet[2831]: E1212 17:28:24.049737 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:28:24.049835 kubelet[2831]: E1212 17:28:24.049791 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:28:24.050058 kubelet[2831]: E1212 17:28:24.050000 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:24.051605 containerd[1633]: time="2025-12-12T17:28:24.051448556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:28:24.051770 kubelet[2831]: E1212 17:28:24.051546 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:28:24.536975 containerd[1633]: time="2025-12-12T17:28:24.536913737Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:24.539373 containerd[1633]: time="2025-12-12T17:28:24.539313784Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:28:24.539450 containerd[1633]: time="2025-12-12T17:28:24.539372744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:28:24.539595 kubelet[2831]: 
E1212 17:28:24.539537 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:28:24.539595 kubelet[2831]: E1212 17:28:24.539588 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:28:24.539912 kubelet[2831]: E1212 17:28:24.539705 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,S
ubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:24.541220 kubelet[2831]: E1212 17:28:24.541168 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" 
podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:28:26.529530 containerd[1633]: time="2025-12-12T17:28:26.529464713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:28:27.048487 containerd[1633]: time="2025-12-12T17:28:27.048441141Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:27.049784 containerd[1633]: time="2025-12-12T17:28:27.049743185Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:28:27.049903 containerd[1633]: time="2025-12-12T17:28:27.049786825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:28:27.050011 kubelet[2831]: E1212 17:28:27.049971 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:28:27.050306 kubelet[2831]: E1212 17:28:27.050024 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:28:27.050306 kubelet[2831]: E1212 17:28:27.050146 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfksl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:27.051362 kubelet[2831]: E1212 17:28:27.051303 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:28:27.530019 containerd[1633]: time="2025-12-12T17:28:27.529910512Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:28:28.037108 containerd[1633]: 
time="2025-12-12T17:28:28.037024149Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:28.038704 containerd[1633]: time="2025-12-12T17:28:28.038666514Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:28:28.038766 containerd[1633]: time="2025-12-12T17:28:28.038733314Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:28.038957 kubelet[2831]: E1212 17:28:28.038909 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:28.039007 kubelet[2831]: E1212 17:28:28.038963 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:28.039387 kubelet[2831]: E1212 17:28:28.039085 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s649l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:28.042623 kubelet[2831]: E1212 17:28:28.042542 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:28:29.530279 containerd[1633]: time="2025-12-12T17:28:29.529785947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:28:30.003473 containerd[1633]: time="2025-12-12T17:28:30.003424657Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:30.005933 containerd[1633]: time="2025-12-12T17:28:30.005882304Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:28:30.005933 containerd[1633]: time="2025-12-12T17:28:30.005954824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:30.006125 kubelet[2831]: E1212 17:28:30.006076 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:28:30.006635 kubelet[2831]: E1212 17:28:30.006133 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:28:30.006635 kubelet[2831]: E1212 17:28:30.006251 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhkdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,Su
bPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:30.007527 kubelet[2831]: E1212 17:28:30.007466 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:28:33.531732 containerd[1633]: time="2025-12-12T17:28:33.531678062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:28:33.872569 containerd[1633]: time="2025-12-12T17:28:33.872358308Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:33.873734 containerd[1633]: time="2025-12-12T17:28:33.873691631Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:28:33.874049 containerd[1633]: time="2025-12-12T17:28:33.873767711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:33.874197 kubelet[2831]: E1212 17:28:33.874161 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:33.874703 kubelet[2831]: E1212 17:28:33.874358 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:33.874858 kubelet[2831]: E1212 17:28:33.874812 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrgd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:28:33.876305 kubelet[2831]: E1212 17:28:33.876265 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:28:35.529182 kubelet[2831]: E1212 17:28:35.529121 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:28:35.529734 kubelet[2831]: E1212 17:28:35.529607 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:28:39.529727 kubelet[2831]: E1212 17:28:39.529673 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:28:40.528633 kubelet[2831]: E1212 17:28:40.528487 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7"
Dec 12 17:28:40.969796 systemd[1]: Started sshd@7-10.0.17.31:22-147.75.109.163:49916.service - OpenSSH per-connection server daemon (147.75.109.163:49916).
Dec 12 17:28:41.939516 sshd[5271]: Accepted publickey for core from 147.75.109.163 port 49916 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:28:41.941447 sshd-session[5271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:41.948385 systemd-logind[1612]: New session 8 of user core.
Dec 12 17:28:41.961607 systemd[1]: Started session-8.scope - Session 8 of User core.
Dec 12 17:28:42.529913 kubelet[2831]: E1212 17:28:42.529481 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc"
Dec 12 17:28:42.673045 sshd[5274]: Connection closed by 147.75.109.163 port 49916
Dec 12 17:28:42.673815 sshd-session[5271]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:42.677513 systemd-logind[1612]: Session 8 logged out. Waiting for processes to exit.
Dec 12 17:28:42.677750 systemd[1]: sshd@7-10.0.17.31:22-147.75.109.163:49916.service: Deactivated successfully.
Dec 12 17:28:42.680848 systemd[1]: session-8.scope: Deactivated successfully.
Dec 12 17:28:42.682445 systemd-logind[1612]: Removed session 8.
Dec 12 17:28:45.528778 kubelet[2831]: E1212 17:28:45.528736 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:28:47.529701 kubelet[2831]: E1212 17:28:47.529545 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:28:47.843676 systemd[1]: Started sshd@8-10.0.17.31:22-147.75.109.163:56374.service - OpenSSH per-connection server daemon (147.75.109.163:56374).
Dec 12 17:28:48.529620 kubelet[2831]: E1212 17:28:48.529535 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:28:48.824154 sshd[5294]: Accepted publickey for core from 147.75.109.163 port 56374 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:28:48.826137 sshd-session[5294]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:48.830389 systemd-logind[1612]: New session 9 of user core.
Dec 12 17:28:48.842686 systemd[1]: Started session-9.scope - Session 9 of User core.
Dec 12 17:28:49.562303 sshd[5297]: Connection closed by 147.75.109.163 port 56374
Dec 12 17:28:49.562820 sshd-session[5294]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:49.566422 systemd-logind[1612]: Session 9 logged out. Waiting for processes to exit.
Dec 12 17:28:49.566696 systemd[1]: sshd@8-10.0.17.31:22-147.75.109.163:56374.service: Deactivated successfully.
Dec 12 17:28:49.568946 systemd[1]: session-9.scope: Deactivated successfully.
Dec 12 17:28:49.570674 systemd-logind[1612]: Removed session 9.
Dec 12 17:28:49.731215 systemd[1]: Started sshd@9-10.0.17.31:22-147.75.109.163:56388.service - OpenSSH per-connection server daemon (147.75.109.163:56388).
Dec 12 17:28:50.712419 sshd[5312]: Accepted publickey for core from 147.75.109.163 port 56388 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:28:50.714304 sshd-session[5312]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:50.719330 systemd-logind[1612]: New session 10 of user core.
Dec 12 17:28:50.726606 systemd[1]: Started session-10.scope - Session 10 of User core.
Dec 12 17:28:51.473615 sshd[5315]: Connection closed by 147.75.109.163 port 56388
Dec 12 17:28:51.475172 sshd-session[5312]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:51.478662 systemd[1]: sshd@9-10.0.17.31:22-147.75.109.163:56388.service: Deactivated successfully.
Dec 12 17:28:51.480895 systemd[1]: session-10.scope: Deactivated successfully.
Dec 12 17:28:51.481962 systemd-logind[1612]: Session 10 logged out. Waiting for processes to exit.
Dec 12 17:28:51.483823 systemd-logind[1612]: Removed session 10.
Dec 12 17:28:51.651303 systemd[1]: Started sshd@10-10.0.17.31:22-147.75.109.163:56394.service - OpenSSH per-connection server daemon (147.75.109.163:56394).
Dec 12 17:28:52.529961 kubelet[2831]: E1212 17:28:52.529878 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7"
Dec 12 17:28:52.692277 sshd[5327]: Accepted publickey for core from 147.75.109.163 port 56394 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:28:52.693703 sshd-session[5327]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:52.697505 systemd-logind[1612]: New session 11 of user core.
Dec 12 17:28:52.708581 systemd[1]: Started session-11.scope - Session 11 of User core.
Dec 12 17:28:53.503316 sshd[5330]: Connection closed by 147.75.109.163 port 56394
Dec 12 17:28:53.503716 sshd-session[5327]: pam_unix(sshd:session): session closed for user core
Dec 12 17:28:53.508432 systemd[1]: sshd@10-10.0.17.31:22-147.75.109.163:56394.service: Deactivated successfully.
Dec 12 17:28:53.510161 systemd[1]: session-11.scope: Deactivated successfully.
Dec 12 17:28:53.510892 systemd-logind[1612]: Session 11 logged out. Waiting for processes to exit.
Dec 12 17:28:53.512081 systemd-logind[1612]: Removed session 11.
Dec 12 17:28:54.530159 kubelet[2831]: E1212 17:28:54.530093 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:28:55.529272 kubelet[2831]: E1212 17:28:55.529212 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc"
Dec 12 17:28:57.530686 kubelet[2831]: E1212 17:28:57.530636 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:28:58.684084 systemd[1]: Started sshd@11-10.0.17.31:22-147.75.109.163:47208.service - OpenSSH per-connection server daemon (147.75.109.163:47208).
Dec 12 17:28:59.686086 sshd[5377]: Accepted publickey for core from 147.75.109.163 port 47208 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:28:59.687344 sshd-session[5377]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:28:59.692063 systemd-logind[1612]: New session 12 of user core.
Dec 12 17:28:59.697564 systemd[1]: Started session-12.scope - Session 12 of User core.
Dec 12 17:29:00.439465 sshd[5380]: Connection closed by 147.75.109.163 port 47208
Dec 12 17:29:00.439621 sshd-session[5377]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:00.442849 systemd[1]: sshd@11-10.0.17.31:22-147.75.109.163:47208.service: Deactivated successfully.
Dec 12 17:29:00.444685 systemd[1]: session-12.scope: Deactivated successfully.
Dec 12 17:29:00.445988 systemd-logind[1612]: Session 12 logged out. Waiting for processes to exit.
Dec 12 17:29:00.447392 systemd-logind[1612]: Removed session 12.
Dec 12 17:29:01.531394 kubelet[2831]: E1212 17:29:01.531235 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:29:02.528672 kubelet[2831]: E1212 17:29:02.528614 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:29:05.529580 kubelet[2831]: E1212 17:29:05.529533 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7"
Dec 12 17:29:05.603080 systemd[1]: Started sshd@12-10.0.17.31:22-147.75.109.163:48788.service - OpenSSH per-connection server daemon (147.75.109.163:48788).
Dec 12 17:29:06.576318 sshd[5394]: Accepted publickey for core from 147.75.109.163 port 48788 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:06.578203 sshd-session[5394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:06.582102 systemd-logind[1612]: New session 13 of user core.
Dec 12 17:29:06.592592 systemd[1]: Started session-13.scope - Session 13 of User core.
Dec 12 17:29:07.332632 sshd[5397]: Connection closed by 147.75.109.163 port 48788
Dec 12 17:29:07.332475 sshd-session[5394]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:07.336572 systemd[1]: sshd@12-10.0.17.31:22-147.75.109.163:48788.service: Deactivated successfully.
Dec 12 17:29:07.338739 systemd[1]: session-13.scope: Deactivated successfully.
Dec 12 17:29:07.341458 systemd-logind[1612]: Session 13 logged out. Waiting for processes to exit.
Dec 12 17:29:07.344604 systemd-logind[1612]: Removed session 13.
Dec 12 17:29:07.533047 kubelet[2831]: E1212 17:29:07.532625 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc"
Dec 12 17:29:07.533047 kubelet[2831]: E1212 17:29:07.532995 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:29:10.529361 kubelet[2831]: E1212 17:29:10.529307 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:29:12.509251 systemd[1]: Started sshd@13-10.0.17.31:22-147.75.109.163:41246.service - OpenSSH per-connection server daemon (147.75.109.163:41246).
Dec 12 17:29:12.529605 kubelet[2831]: E1212 17:29:12.529556 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:29:13.497412 sshd[5410]: Accepted publickey for core from 147.75.109.163 port 41246 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:13.498669 sshd-session[5410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:13.502333 systemd-logind[1612]: New session 14 of user core.
Dec 12 17:29:13.508565 systemd[1]: Started session-14.scope - Session 14 of User core.
Dec 12 17:29:13.529987 kubelet[2831]: E1212 17:29:13.529912 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:29:14.241005 sshd[5413]: Connection closed by 147.75.109.163 port 41246
Dec 12 17:29:14.241354 sshd-session[5410]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:14.245026 systemd[1]: sshd@13-10.0.17.31:22-147.75.109.163:41246.service: Deactivated successfully.
Dec 12 17:29:14.248902 systemd[1]: session-14.scope: Deactivated successfully.
Dec 12 17:29:14.249693 systemd-logind[1612]: Session 14 logged out. Waiting for processes to exit.
Dec 12 17:29:14.250791 systemd-logind[1612]: Removed session 14.
Dec 12 17:29:14.408682 systemd[1]: Started sshd@14-10.0.17.31:22-147.75.109.163:41252.service - OpenSSH per-connection server daemon (147.75.109.163:41252).
Dec 12 17:29:15.387975 sshd[5427]: Accepted publickey for core from 147.75.109.163 port 41252 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:15.389294 sshd-session[5427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:15.394522 systemd-logind[1612]: New session 15 of user core.
Dec 12 17:29:15.401589 systemd[1]: Started session-15.scope - Session 15 of User core.
Dec 12 17:29:16.248453 sshd[5431]: Connection closed by 147.75.109.163 port 41252
Dec 12 17:29:16.249000 sshd-session[5427]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:16.252797 systemd-logind[1612]: Session 15 logged out. Waiting for processes to exit.
Dec 12 17:29:16.253077 systemd[1]: sshd@14-10.0.17.31:22-147.75.109.163:41252.service: Deactivated successfully.
Dec 12 17:29:16.254928 systemd[1]: session-15.scope: Deactivated successfully.
Dec 12 17:29:16.256486 systemd-logind[1612]: Removed session 15.
Dec 12 17:29:16.416896 systemd[1]: Started sshd@15-10.0.17.31:22-147.75.109.163:41262.service - OpenSSH per-connection server daemon (147.75.109.163:41262).
Dec 12 17:29:17.404478 sshd[5442]: Accepted publickey for core from 147.75.109.163 port 41262 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:17.405823 sshd-session[5442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:17.409762 systemd-logind[1612]: New session 16 of user core.
Dec 12 17:29:17.418805 systemd[1]: Started session-16.scope - Session 16 of User core.
Dec 12 17:29:18.529501 kubelet[2831]: E1212 17:29:18.529457 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7"
Dec 12 17:29:18.873021 sshd[5445]: Connection closed by 147.75.109.163 port 41262
Dec 12 17:29:18.873563 sshd-session[5442]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:18.878384 systemd[1]: sshd@15-10.0.17.31:22-147.75.109.163:41262.service: Deactivated successfully.
Dec 12 17:29:18.880504 systemd[1]: session-16.scope: Deactivated successfully.
Dec 12 17:29:18.881503 systemd-logind[1612]: Session 16 logged out. Waiting for processes to exit.
Dec 12 17:29:18.883652 systemd-logind[1612]: Removed session 16.
Dec 12 17:29:19.045376 systemd[1]: Started sshd@16-10.0.17.31:22-147.75.109.163:41264.service - OpenSSH per-connection server daemon (147.75.109.163:41264).
Dec 12 17:29:20.041912 sshd[5464]: Accepted publickey for core from 147.75.109.163 port 41264 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:20.043248 sshd-session[5464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:20.048011 systemd-logind[1612]: New session 17 of user core.
Dec 12 17:29:20.055614 systemd[1]: Started session-17.scope - Session 17 of User core.
Dec 12 17:29:20.529263 kubelet[2831]: E1212 17:29:20.529199 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc"
Dec 12 17:29:20.885546 sshd[5469]: Connection closed by 147.75.109.163 port 41264
Dec 12 17:29:20.886055 sshd-session[5464]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:20.893043 systemd[1]: sshd@16-10.0.17.31:22-147.75.109.163:41264.service: Deactivated successfully.
Dec 12 17:29:20.895838 systemd[1]: session-17.scope: Deactivated successfully.
Dec 12 17:29:20.898255 systemd-logind[1612]: Session 17 logged out. Waiting for processes to exit.
Dec 12 17:29:20.899970 systemd-logind[1612]: Removed session 17.
Dec 12 17:29:21.061254 systemd[1]: Started sshd@17-10.0.17.31:22-147.75.109.163:41270.service - OpenSSH per-connection server daemon (147.75.109.163:41270).
Dec 12 17:29:21.529508 kubelet[2831]: E1212 17:29:21.529427 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:29:22.106118 sshd[5481]: Accepted publickey for core from 147.75.109.163 port 41270 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:22.106982 sshd-session[5481]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:22.110928 systemd-logind[1612]: New session 18 of user core.
Dec 12 17:29:22.124627 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 12 17:29:22.879725 sshd[5484]: Connection closed by 147.75.109.163 port 41270
Dec 12 17:29:22.879636 sshd-session[5481]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:22.883899 systemd[1]: sshd@17-10.0.17.31:22-147.75.109.163:41270.service: Deactivated successfully.
Dec 12 17:29:22.885969 systemd[1]: session-18.scope: Deactivated successfully.
Dec 12 17:29:22.888306 systemd-logind[1612]: Session 18 logged out. Waiting for processes to exit.
Dec 12 17:29:22.889882 systemd-logind[1612]: Removed session 18.
Dec 12 17:29:24.529023 kubelet[2831]: E1212 17:29:24.528659 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:29:26.529562 kubelet[2831]: E1212 17:29:26.529351 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:29:27.530881 kubelet[2831]: E1212 17:29:27.530829 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:29:28.056235 systemd[1]: Started sshd@18-10.0.17.31:22-147.75.109.163:36402.service - OpenSSH per-connection server daemon (147.75.109.163:36402).
Dec 12 17:29:29.037304 sshd[5529]: Accepted publickey for core from 147.75.109.163 port 36402 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:29:29.038732 sshd-session[5529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:29:29.042759 systemd-logind[1612]: New session 19 of user core.
Dec 12 17:29:29.060634 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 12 17:29:29.776468 sshd[5532]: Connection closed by 147.75.109.163 port 36402
Dec 12 17:29:29.777109 sshd-session[5529]: pam_unix(sshd:session): session closed for user core
Dec 12 17:29:29.780614 systemd[1]: sshd@18-10.0.17.31:22-147.75.109.163:36402.service: Deactivated successfully.
Dec 12 17:29:29.782670 systemd[1]: session-19.scope: Deactivated successfully.
Dec 12 17:29:29.783518 systemd-logind[1612]: Session 19 logged out. Waiting for processes to exit.
Dec 12 17:29:29.784739 systemd-logind[1612]: Removed session 19.
Dec 12 17:29:33.529609 kubelet[2831]: E1212 17:29:33.529569 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:29:34.954527 systemd[1]: Started sshd@19-10.0.17.31:22-147.75.109.163:54876.service - OpenSSH per-connection server daemon (147.75.109.163:54876). Dec 12 17:29:35.531029 kubelet[2831]: E1212 17:29:35.530988 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:29:35.956811 sshd[5547]: Accepted publickey for core from 147.75.109.163 port 54876 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:29:35.958186 sshd-session[5547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:35.962439 systemd-logind[1612]: New session 20 of user core. Dec 12 17:29:35.968834 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 12 17:29:36.528799 kubelet[2831]: E1212 17:29:36.528739 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:29:36.716527 sshd[5550]: Connection closed by 147.75.109.163 port 54876 Dec 12 17:29:36.717085 sshd-session[5547]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:36.721896 systemd-logind[1612]: Session 20 logged out. Waiting for processes to exit. Dec 12 17:29:36.722545 systemd[1]: sshd@19-10.0.17.31:22-147.75.109.163:54876.service: Deactivated successfully. Dec 12 17:29:36.725757 systemd[1]: session-20.scope: Deactivated successfully. Dec 12 17:29:36.728298 systemd-logind[1612]: Removed session 20. 
Dec 12 17:29:38.528283 kubelet[2831]: E1212 17:29:38.528215 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:29:40.529041 kubelet[2831]: E1212 17:29:40.528973 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:29:41.881424 systemd[1]: Started sshd@20-10.0.17.31:22-147.75.109.163:54886.service - OpenSSH per-connection server daemon (147.75.109.163:54886). 
Dec 12 17:29:42.528811 kubelet[2831]: E1212 17:29:42.528768 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:29:42.859466 sshd[5569]: Accepted publickey for core from 147.75.109.163 port 54886 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:29:42.861057 sshd-session[5569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:42.867380 systemd-logind[1612]: New session 21 of user core. Dec 12 17:29:42.878708 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 12 17:29:43.586960 sshd[5572]: Connection closed by 147.75.109.163 port 54886 Dec 12 17:29:43.587511 sshd-session[5569]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:43.590889 systemd[1]: sshd@20-10.0.17.31:22-147.75.109.163:54886.service: Deactivated successfully. Dec 12 17:29:43.593966 systemd[1]: session-21.scope: Deactivated successfully. Dec 12 17:29:43.594657 systemd-logind[1612]: Session 21 logged out. Waiting for processes to exit. 
Dec 12 17:29:43.595950 systemd-logind[1612]: Removed session 21. Dec 12 17:29:46.528719 kubelet[2831]: E1212 17:29:46.528638 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:29:48.529111 kubelet[2831]: E1212 17:29:48.528756 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:29:48.752000 systemd[1]: Started sshd@21-10.0.17.31:22-147.75.109.163:59146.service - OpenSSH per-connection server daemon (147.75.109.163:59146). Dec 12 17:29:49.529537 containerd[1633]: time="2025-12-12T17:29:49.529473157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:29:49.718425 sshd[5587]: Accepted publickey for core from 147.75.109.163 port 59146 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:29:49.719261 sshd-session[5587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:49.723457 systemd-logind[1612]: New session 22 of user core. 
Dec 12 17:29:49.731583 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 12 17:29:49.883607 containerd[1633]: time="2025-12-12T17:29:49.883473757Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:49.884737 containerd[1633]: time="2025-12-12T17:29:49.884696320Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:29:49.884737 containerd[1633]: time="2025-12-12T17:29:49.884769480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:29:49.885081 kubelet[2831]: E1212 17:29:49.884924 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:29:49.885818 kubelet[2831]: E1212 17:29:49.885089 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:29:49.888016 kubelet[2831]: E1212 17:29:49.886520 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfksl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-697549bf97-ftlsb_calico-system(bf0d9458-9e9d-4941-95ae-5b084eb50f31): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:49.889756 kubelet[2831]: E1212 17:29:49.889571 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:29:50.438531 sshd[5590]: Connection closed by 147.75.109.163 port 59146 Dec 12 17:29:50.439601 sshd-session[5587]: pam_unix(sshd:session): session closed for user core Dec 12 
17:29:50.447680 systemd-logind[1612]: Session 22 logged out. Waiting for processes to exit. Dec 12 17:29:50.447951 systemd[1]: sshd@21-10.0.17.31:22-147.75.109.163:59146.service: Deactivated successfully. Dec 12 17:29:50.451618 systemd[1]: session-22.scope: Deactivated successfully. Dec 12 17:29:50.455733 systemd-logind[1612]: Removed session 22. Dec 12 17:29:51.528446 kubelet[2831]: E1212 17:29:51.528394 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:29:53.529042 containerd[1633]: time="2025-12-12T17:29:53.528923426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:29:53.860917 containerd[1633]: time="2025-12-12T17:29:53.860720608Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:53.862659 containerd[1633]: time="2025-12-12T17:29:53.862547813Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:29:53.862659 containerd[1633]: time="2025-12-12T17:29:53.862594173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:29:53.862833 kubelet[2831]: E1212 17:29:53.862775 2831 log.go:32] "PullImage from image service failed" err="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:29:53.863208 kubelet[2831]: E1212 17:29:53.862833 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:29:53.863208 kubelet[2831]: E1212 17:29:53.863049 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:ed04d3e7c1884f59a096e0abc2a1d532,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:Runtim
eDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:53.863330 containerd[1633]: time="2025-12-12T17:29:53.863151735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:29:54.197515 containerd[1633]: time="2025-12-12T17:29:54.197463163Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:54.199113 containerd[1633]: time="2025-12-12T17:29:54.199015927Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:29:54.199113 containerd[1633]: time="2025-12-12T17:29:54.199076087Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:29:54.199266 kubelet[2831]: E1212 17:29:54.199226 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:54.199304 kubelet[2831]: E1212 17:29:54.199268 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:54.199533 kubelet[2831]: E1212 17:29:54.199482 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false
,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:54.200115 containerd[1633]: time="2025-12-12T17:29:54.199902849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:29:54.536225 containerd[1633]: time="2025-12-12T17:29:54.536051403Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:54.537360 containerd[1633]: time="2025-12-12T17:29:54.537322846Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:29:54.538051 containerd[1633]: time="2025-12-12T17:29:54.537395086Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:29:54.538095 kubelet[2831]: E1212 17:29:54.537538 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:29:54.538095 kubelet[2831]: E1212 17:29:54.537587 2831 kuberuntime_image.go:55] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:29:54.538095 kubelet[2831]: E1212 17:29:54.537769 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7bj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfi
le{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54c44c7f8d-jf2mj_calico-system(ba26b6aa-6293-4a63-974b-db5efe9dc1a9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:54.538629 containerd[1633]: time="2025-12-12T17:29:54.538413769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:29:54.539789 kubelet[2831]: E1212 17:29:54.539733 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:29:54.880713 containerd[1633]: time="2025-12-12T17:29:54.880596898Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:54.883096 containerd[1633]: time="2025-12-12T17:29:54.883027824Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:29:54.883218 containerd[1633]: time="2025-12-12T17:29:54.883068184Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:29:54.883341 kubelet[2831]: E1212 17:29:54.883301 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:54.883640 kubelet[2831]: E1212 17:29:54.883351 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:54.883756 kubelet[2831]: E1212 17:29:54.883701 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6q8dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ppkws_calico-system(eda1ca3a-3908-4257-9a19-d316969a4cc3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:54.884962 kubelet[2831]: E1212 17:29:54.884893 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:29:59.529693 containerd[1633]: time="2025-12-12T17:29:59.529566374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:29:59.893840 containerd[1633]: time="2025-12-12T17:29:59.893784440Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:59.895100 containerd[1633]: time="2025-12-12T17:29:59.895039883Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:29:59.895192 containerd[1633]: time="2025-12-12T17:29:59.895108643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:59.895359 kubelet[2831]: E1212 17:29:59.895326 2831 log.go:32] 
"PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:29:59.895859 kubelet[2831]: E1212 17:29:59.895676 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:29:59.895968 kubelet[2831]: E1212 17:29:59.895913 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key
-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhkdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v498f_calico-system(ef5fc253-58d4-49dc-8ba3-6150ce1f97bc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:59.896134 containerd[1633]: time="2025-12-12T17:29:59.895993886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 
17:29:59.897390 kubelet[2831]: E1212 17:29:59.897314 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:30:00.252959 containerd[1633]: time="2025-12-12T17:30:00.252803613Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:00.254584 containerd[1633]: time="2025-12-12T17:30:00.254452057Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:00.254584 containerd[1633]: time="2025-12-12T17:30:00.254534217Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:00.254745 kubelet[2831]: E1212 17:30:00.254697 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:00.254807 kubelet[2831]: E1212 17:30:00.254750 2831 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:00.255044 kubelet[2831]: E1212 17:30:00.254883 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s649l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-4zd4q_calico-apiserver(31cf16b5-737c-43fc-8a73-dab266575bf7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:00.256342 kubelet[2831]: E1212 17:30:00.256297 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:30:04.529164 kubelet[2831]: E1212 17:30:04.529097 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31" Dec 12 17:30:05.529367 kubelet[2831]: E1212 17:30:05.528618 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" Dec 12 17:30:05.529367 kubelet[2831]: E1212 17:30:05.529270 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9" Dec 12 17:30:05.530235 containerd[1633]: time="2025-12-12T17:30:05.529358559Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:05.876486 containerd[1633]: time="2025-12-12T17:30:05.876268140Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:05.881760 containerd[1633]: time="2025-12-12T17:30:05.881645954Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:05.881760 containerd[1633]: time="2025-12-12T17:30:05.881715794Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:05.881905 kubelet[2831]: E1212 17:30:05.881856 2831 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:05.881905 kubelet[2831]: E1212 17:30:05.881895 2831 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:05.882141 kubelet[2831]: E1212 17:30:05.882050 2831 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrgd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84ff4454f8-hggpv_calico-apiserver(88235c34-52d7-4e16-9ac0-d4ea6115b35c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:05.883184 kubelet[2831]: E1212 17:30:05.883132 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:30:13.529375 kubelet[2831]: E1212 17:30:13.529323 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc" Dec 12 17:30:14.528886 kubelet[2831]: E1212 17:30:14.528788 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7" Dec 12 17:30:15.634892 kubelet[2831]: E1212 17:30:15.634804 2831 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.17.31:45890->10.0.17.78:2379: read: connection timed out" Dec 12 17:30:15.641169 systemd[1]: cri-containerd-90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624.scope: Deactivated successfully. Dec 12 17:30:15.642634 containerd[1633]: time="2025-12-12T17:30:15.641227706Z" level=info msg="received container exit event container_id:\"90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624\" id:\"90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624\" pid:2685 exit_status:1 exited_at:{seconds:1765560615 nanos:640905585}" Dec 12 17:30:15.641505 systemd[1]: cri-containerd-90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624.scope: Consumed 3.339s CPU time, 23M memory peak. 
Dec 12 17:30:15.663535 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624-rootfs.mount: Deactivated successfully. Dec 12 17:30:15.954738 kubelet[2831]: I1212 17:30:15.954692 2831 status_manager.go:890] "Failed to get status for pod" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3" pod="calico-system/csi-node-driver-ppkws" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.17.31:45814->10.0.17.78:2379: read: connection timed out" Dec 12 17:30:15.954938 kubelet[2831]: E1212 17:30:15.954703 2831 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.17.31:45708->10.0.17.78:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-84ff4454f8-hggpv.188087d969b91be8 calico-apiserver 1324 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-84ff4454f8-hggpv,UID:88235c34-52d7-4e16-9ac0-d4ea6115b35c,APIVersion:v1,ResourceVersion:777,FieldPath:spec.containers{calico-apiserver},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4459-2-2-2-0ba9591bbe,},FirstTimestamp:2025-12-12 17:27:01 +0000 UTC,LastTimestamp:2025-12-12 17:30:05.528871518 +0000 UTC m=+226.104688774,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-2-0ba9591bbe,}" Dec 12 17:30:16.086112 kubelet[2831]: I1212 17:30:16.086068 2831 scope.go:117] "RemoveContainer" containerID="90e4ce730a6218bcddaa1aae4e3c2b573ea4db095b64fd40ae076468a1303624" Dec 12 17:30:16.088444 containerd[1633]: time="2025-12-12T17:30:16.088391747Z" level=info msg="CreateContainer within sandbox \"3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229\" for container 
&ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 12 17:30:16.098903 containerd[1633]: time="2025-12-12T17:30:16.098859815Z" level=info msg="Container 04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:30:16.108702 containerd[1633]: time="2025-12-12T17:30:16.108638960Z" level=info msg="CreateContainer within sandbox \"3b64c640b2b6ae5d305ee9feb73cef568a778a10cadb7a4a1c51023e3f312229\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede\"" Dec 12 17:30:16.109202 containerd[1633]: time="2025-12-12T17:30:16.109157881Z" level=info msg="StartContainer for \"04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede\"" Dec 12 17:30:16.110448 containerd[1633]: time="2025-12-12T17:30:16.110421685Z" level=info msg="connecting to shim 04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede" address="unix:///run/containerd/s/bcf42dcba21f88cf4b4a0338396bd841ce83b965ea129123d9fcb41bb3394aef" protocol=ttrpc version=3 Dec 12 17:30:16.136593 systemd[1]: Started cri-containerd-04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede.scope - libcontainer container 04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede. Dec 12 17:30:16.154012 systemd[1]: cri-containerd-5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781.scope: Deactivated successfully. Dec 12 17:30:16.154380 systemd[1]: cri-containerd-5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781.scope: Consumed 41.399s CPU time, 104.2M memory peak. 
Dec 12 17:30:16.156530 containerd[1633]: time="2025-12-12T17:30:16.156477444Z" level=info msg="received container exit event container_id:\"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\" id:\"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\" pid:3153 exit_status:1 exited_at:{seconds:1765560616 nanos:155605282}" Dec 12 17:30:16.180212 containerd[1633]: time="2025-12-12T17:30:16.180165186Z" level=info msg="StartContainer for \"04353fbec1d74315397bc396a91175fff06247e4e4d3d4c41b34833913438ede\" returns successfully" Dec 12 17:30:16.182316 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781-rootfs.mount: Deactivated successfully. Dec 12 17:30:16.826943 systemd[1]: cri-containerd-82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7.scope: Deactivated successfully. Dec 12 17:30:16.827282 systemd[1]: cri-containerd-82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7.scope: Consumed 4.507s CPU time, 61.7M memory peak. Dec 12 17:30:16.829823 containerd[1633]: time="2025-12-12T17:30:16.829602713Z" level=info msg="received container exit event container_id:\"82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7\" id:\"82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7\" pid:2692 exit_status:1 exited_at:{seconds:1765560616 nanos:829313592}" Dec 12 17:30:16.850796 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7-rootfs.mount: Deactivated successfully. 
Dec 12 17:30:17.215031 kubelet[2831]: I1212 17:30:17.092232 2831 scope.go:117] "RemoveContainer" containerID="5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781" Dec 12 17:30:17.217555 containerd[1633]: time="2025-12-12T17:30:17.217141719Z" level=info msg="CreateContainer within sandbox \"9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 12 17:30:17.531389 kubelet[2831]: E1212 17:30:17.531225 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c" Dec 12 17:30:17.822210 containerd[1633]: time="2025-12-12T17:30:17.821342369Z" level=info msg="Container d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:30:17.831112 containerd[1633]: time="2025-12-12T17:30:17.831067554Z" level=info msg="CreateContainer within sandbox \"9c4d2db4df663818e8a16530656974cb81dc11c888a4329a00ae40723acb97b8\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f\"" Dec 12 17:30:17.831653 containerd[1633]: time="2025-12-12T17:30:17.831620636Z" level=info msg="StartContainer for \"d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f\"" Dec 12 17:30:17.832520 containerd[1633]: time="2025-12-12T17:30:17.832491478Z" level=info msg="connecting to shim d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f" 
address="unix:///run/containerd/s/0ca6ed5de74bdef08f6783aa3e0b17c90cb02b90dc4ad02afd4d4cd1ab9b7ee1" protocol=ttrpc version=3
Dec 12 17:30:17.854671 systemd[1]: Started cri-containerd-d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f.scope - libcontainer container d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f.
Dec 12 17:30:18.458511 containerd[1633]: time="2025-12-12T17:30:18.458461304Z" level=info msg="StartContainer for \"d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f\" returns successfully"
Dec 12 17:30:18.459691 kubelet[2831]: I1212 17:30:18.459658 2831 scope.go:117] "RemoveContainer" containerID="82d47a83a5d79921b0968bd6679538d8d2b23c0e8a2658150a57dc1eb34839c7"
Dec 12 17:30:18.462475 containerd[1633]: time="2025-12-12T17:30:18.462440634Z" level=info msg="CreateContainer within sandbox \"5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 12 17:30:18.528250 kubelet[2831]: E1212 17:30:18.528208 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:30:18.528881 kubelet[2831]: E1212 17:30:18.528841 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:30:18.605323 containerd[1633]: time="2025-12-12T17:30:18.605274965Z" level=info msg="Container ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:30:18.615914 containerd[1633]: time="2025-12-12T17:30:18.615832273Z" level=info msg="CreateContainer within sandbox \"5538f4a4d426bfbf909671f028183db75515df9a7fa8f316a9efc190d40096e7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6\""
Dec 12 17:30:18.616378 containerd[1633]: time="2025-12-12T17:30:18.616310754Z" level=info msg="StartContainer for \"ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6\""
Dec 12 17:30:18.617382 containerd[1633]: time="2025-12-12T17:30:18.617359917Z" level=info msg="connecting to shim ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6" address="unix:///run/containerd/s/48a06d9e8ff417ba1af96f191f63fd138aa890474e05458aff18524216c5a883" protocol=ttrpc version=3
Dec 12 17:30:18.642626 systemd[1]: Started cri-containerd-ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6.scope - libcontainer container ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6.
Dec 12 17:30:18.684500 containerd[1633]: time="2025-12-12T17:30:18.684455491Z" level=info msg="StartContainer for \"ed5432c216167648685dbb45f7c35e8298186aef99f4178c259702444e107de6\" returns successfully"
Dec 12 17:30:19.530020 kubelet[2831]: E1212 17:30:19.529969 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:30:24.529097 kubelet[2831]: E1212 17:30:24.529036 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v498f" podUID="ef5fc253-58d4-49dc-8ba3-6150ce1f97bc"
Dec 12 17:30:25.635259 kubelet[2831]: E1212 17:30:25.635136 2831 controller.go:195] "Failed to update lease" err="Put \"https://10.0.17.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-2-0ba9591bbe?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 12 17:30:27.529725 kubelet[2831]: E1212 17:30:27.529644 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-4zd4q" podUID="31cf16b5-737c-43fc-8a73-dab266575bf7"
Dec 12 17:30:29.090106 systemd[1]: cri-containerd-d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f.scope: Deactivated successfully.
Dec 12 17:30:29.093450 containerd[1633]: time="2025-12-12T17:30:29.090639482Z" level=info msg="received container exit event container_id:\"d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f\" id:\"d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f\" pid:5724 exit_status:1 exited_at:{seconds:1765560629 nanos:90257841}"
Dec 12 17:30:29.110786 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f-rootfs.mount: Deactivated successfully.
Dec 12 17:30:29.489443 kubelet[2831]: I1212 17:30:29.489017 2831 scope.go:117] "RemoveContainer" containerID="5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781"
Dec 12 17:30:29.489443 kubelet[2831]: I1212 17:30:29.489268 2831 scope.go:117] "RemoveContainer" containerID="d10e3444dbb81874736321adc07b0b361c9b5de487221653b00c4bb32b42530f"
Dec 12 17:30:29.489443 kubelet[2831]: E1212 17:30:29.489430 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-kphbl_tigera-operator(cc00ffae-48a8-4f33-94b6-55ffd138b99b)\"" pod="tigera-operator/tigera-operator-7dcd859c48-kphbl" podUID="cc00ffae-48a8-4f33-94b6-55ffd138b99b"
Dec 12 17:30:29.490887 containerd[1633]: time="2025-12-12T17:30:29.490708122Z" level=info msg="RemoveContainer for \"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\""
Dec 12 17:30:29.495383 containerd[1633]: time="2025-12-12T17:30:29.495341094Z" level=info msg="RemoveContainer for \"5b7495333847eb39a345a8d0cfd6048dc77c45d68c294f27a3e0de22dbf98781\" returns successfully"
Dec 12 17:30:31.529313 kubelet[2831]: E1212 17:30:31.529238 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-54c44c7f8d-jf2mj" podUID="ba26b6aa-6293-4a63-974b-db5efe9dc1a9"
Dec 12 17:30:32.529057 kubelet[2831]: E1212 17:30:32.529006 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84ff4454f8-hggpv" podUID="88235c34-52d7-4e16-9ac0-d4ea6115b35c"
Dec 12 17:30:32.529057 kubelet[2831]: E1212 17:30:32.529019 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-697549bf97-ftlsb" podUID="bf0d9458-9e9d-4941-95ae-5b084eb50f31"
Dec 12 17:30:33.529933 kubelet[2831]: E1212 17:30:33.529878 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ppkws" podUID="eda1ca3a-3908-4257-9a19-d316969a4cc3"
Dec 12 17:30:34.906446 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec