Dec 16 12:21:58.755223 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 16 12:21:58.755250 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 16 12:21:58.755261 kernel: KASLR enabled
Dec 16 12:21:58.755266 kernel: efi: EFI v2.7 by EDK II
Dec 16 12:21:58.755272 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Dec 16 12:21:58.755277 kernel: random: crng init done
Dec 16 12:21:58.755284 kernel: secureboot: Secure boot disabled
Dec 16 12:21:58.755290 kernel: ACPI: Early table checksum verification disabled
Dec 16 12:21:58.755295 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Dec 16 12:21:58.755301 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Dec 16 12:21:58.755308 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755314 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755320 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755326 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755333 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755339 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755346 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755352 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755362 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755377 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:21:58.755384 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 16 12:21:58.755390 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 16 12:21:58.755396 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 16 12:21:58.755402 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Dec 16 12:21:58.755408 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Dec 16 12:21:58.755414 kernel: Zone ranges:
Dec 16 12:21:58.755421 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 16 12:21:58.755427 kernel: DMA32 empty
Dec 16 12:21:58.755433 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 16 12:21:58.755439 kernel: Device empty
Dec 16 12:21:58.755445 kernel: Movable zone start for each node
Dec 16 12:21:58.755450 kernel: Early memory node ranges
Dec 16 12:21:58.755456 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Dec 16 12:21:58.755462 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Dec 16 12:21:58.755468 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Dec 16 12:21:58.755474 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Dec 16 12:21:58.755480 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Dec 16 12:21:58.755486 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Dec 16 12:21:58.755493 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 16 12:21:58.755499 kernel: psci: probing for conduit method from ACPI.
Dec 16 12:21:58.755508 kernel: psci: PSCIv1.3 detected in firmware.
Dec 16 12:21:58.755514 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 16 12:21:58.755521 kernel: psci: Trusted OS migration not required
Dec 16 12:21:58.755528 kernel: psci: SMC Calling Convention v1.1
Dec 16 12:21:58.755535 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 16 12:21:58.755541 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 16 12:21:58.755547 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 16 12:21:58.755554 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Dec 16 12:21:58.755560 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Dec 16 12:21:58.755566 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 16 12:21:58.755573 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 16 12:21:58.755579 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Dec 16 12:21:58.755586 kernel: Detected PIPT I-cache on CPU0
Dec 16 12:21:58.755592 kernel: CPU features: detected: GIC system register CPU interface
Dec 16 12:21:58.755598 kernel: CPU features: detected: Spectre-v4
Dec 16 12:21:58.755609 kernel: CPU features: detected: Spectre-BHB
Dec 16 12:21:58.755615 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 16 12:21:58.755622 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 16 12:21:58.755628 kernel: CPU features: detected: ARM erratum 1418040
Dec 16 12:21:58.755634 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 16 12:21:58.755641 kernel: alternatives: applying boot alternatives
Dec 16 12:21:58.755648 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 16 12:21:58.755655 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 16 12:21:58.755661 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 16 12:21:58.755668 kernel: Fallback order for Node 0: 0
Dec 16 12:21:58.755675 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Dec 16 12:21:58.755681 kernel: Policy zone: Normal
Dec 16 12:21:58.755688 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 12:21:58.755694 kernel: software IO TLB: area num 4.
Dec 16 12:21:58.755700 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 16 12:21:58.755707 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 16 12:21:58.755713 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 12:21:58.755720 kernel: rcu: RCU event tracing is enabled.
Dec 16 12:21:58.755726 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Dec 16 12:21:58.755733 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 12:21:58.755739 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 12:21:58.755746 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 12:21:58.755754 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 16 12:21:58.755760 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:21:58.755767 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:21:58.755773 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 16 12:21:58.755779 kernel: GICv3: 256 SPIs implemented
Dec 16 12:21:58.755785 kernel: GICv3: 0 Extended SPIs implemented
Dec 16 12:21:58.755792 kernel: Root IRQ handler: gic_handle_irq
Dec 16 12:21:58.755798 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 16 12:21:58.755804 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 16 12:21:58.755810 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 16 12:21:58.755817 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 16 12:21:58.755823 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Dec 16 12:21:58.755831 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Dec 16 12:21:58.755837 kernel: GICv3: using LPI property table @0x0000000100130000
Dec 16 12:21:58.755844 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Dec 16 12:21:58.755850 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 12:21:58.755856 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:21:58.755863 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 16 12:21:58.755869 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 16 12:21:58.755875 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 16 12:21:58.755882 kernel: arm-pv: using stolen time PV
Dec 16 12:21:58.755888 kernel: Console: colour dummy device 80x25
Dec 16 12:21:58.755896 kernel: ACPI: Core revision 20240827
Dec 16 12:21:58.755903 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 16 12:21:58.755910 kernel: pid_max: default: 32768 minimum: 301
Dec 16 12:21:58.755916 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 12:21:58.755923 kernel: landlock: Up and running.
Dec 16 12:21:58.755929 kernel: SELinux: Initializing.
Dec 16 12:21:58.755936 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:21:58.755942 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:21:58.755949 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 12:21:58.755955 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 12:21:58.755964 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 12:21:58.755970 kernel: Remapping and enabling EFI services.
Dec 16 12:21:58.755977 kernel: smp: Bringing up secondary CPUs ...
Dec 16 12:21:58.755983 kernel: Detected PIPT I-cache on CPU1
Dec 16 12:21:58.755990 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 16 12:21:58.755996 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Dec 16 12:21:58.756003 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:21:58.756009 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 16 12:21:58.756016 kernel: Detected PIPT I-cache on CPU2
Dec 16 12:21:58.756028 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Dec 16 12:21:58.756035 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Dec 16 12:21:58.756042 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:21:58.756050 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Dec 16 12:21:58.756057 kernel: Detected PIPT I-cache on CPU3
Dec 16 12:21:58.756064 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Dec 16 12:21:58.756071 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Dec 16 12:21:58.756077 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:21:58.756085 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Dec 16 12:21:58.756092 kernel: smp: Brought up 1 node, 4 CPUs
Dec 16 12:21:58.756099 kernel: SMP: Total of 4 processors activated.
Dec 16 12:21:58.756106 kernel: CPU: All CPU(s) started at EL1
Dec 16 12:21:58.756112 kernel: CPU features: detected: 32-bit EL0 Support
Dec 16 12:21:58.756119 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 16 12:21:58.756126 kernel: CPU features: detected: Common not Private translations
Dec 16 12:21:58.756133 kernel: CPU features: detected: CRC32 instructions
Dec 16 12:21:58.756140 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 16 12:21:58.756148 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 16 12:21:58.756155 kernel: CPU features: detected: LSE atomic instructions
Dec 16 12:21:58.756162 kernel: CPU features: detected: Privileged Access Never
Dec 16 12:21:58.756168 kernel: CPU features: detected: RAS Extension Support
Dec 16 12:21:58.756175 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 16 12:21:58.756182 kernel: alternatives: applying system-wide alternatives
Dec 16 12:21:58.756189 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Dec 16 12:21:58.756219 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Dec 16 12:21:58.756227 kernel: devtmpfs: initialized
Dec 16 12:21:58.756236 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 12:21:58.756243 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Dec 16 12:21:58.756251 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 16 12:21:58.756257 kernel: 0 pages in range for non-PLT usage
Dec 16 12:21:58.756264 kernel: 508400 pages in range for PLT usage
Dec 16 12:21:58.756271 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 12:21:58.756278 kernel: SMBIOS 3.0.0 present.
Dec 16 12:21:58.756288 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Dec 16 12:21:58.756295 kernel: DMI: Memory slots populated: 1/1
Dec 16 12:21:58.756304 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 12:21:58.756311 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 16 12:21:58.756318 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 16 12:21:58.756325 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 16 12:21:58.756332 kernel: audit: initializing netlink subsys (disabled)
Dec 16 12:21:58.756339 kernel: audit: type=2000 audit(0.043:1): state=initialized audit_enabled=0 res=1
Dec 16 12:21:58.756345 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 12:21:58.756352 kernel: cpuidle: using governor menu
Dec 16 12:21:58.756359 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 16 12:21:58.756367 kernel: ASID allocator initialised with 32768 entries
Dec 16 12:21:58.756374 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 12:21:58.756381 kernel: Serial: AMBA PL011 UART driver
Dec 16 12:21:58.756388 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 12:21:58.756395 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 12:21:58.756401 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 16 12:21:58.756408 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 16 12:21:58.756415 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 12:21:58.756422 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 12:21:58.756430 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 16 12:21:58.756437 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 16 12:21:58.756443 kernel: ACPI: Added _OSI(Module Device)
Dec 16 12:21:58.756450 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 12:21:58.756457 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 12:21:58.756464 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 12:21:58.756471 kernel: ACPI: Interpreter enabled
Dec 16 12:21:58.756478 kernel: ACPI: Using GIC for interrupt routing
Dec 16 12:21:58.756484 kernel: ACPI: MCFG table detected, 1 entries
Dec 16 12:21:58.756493 kernel: ACPI: CPU0 has been hot-added
Dec 16 12:21:58.756500 kernel: ACPI: CPU1 has been hot-added
Dec 16 12:21:58.756506 kernel: ACPI: CPU2 has been hot-added
Dec 16 12:21:58.756513 kernel: ACPI: CPU3 has been hot-added
Dec 16 12:21:58.756520 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 16 12:21:58.756527 kernel: printk: legacy console [ttyAMA0] enabled
Dec 16 12:21:58.756534 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 12:21:58.756672 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 12:21:58.756737 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 16 12:21:58.756795 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 16 12:21:58.756850 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 16 12:21:58.756905 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 16 12:21:58.756914 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 16 12:21:58.756921 kernel: PCI host bridge to bus 0000:00
Dec 16 12:21:58.756985 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 16 12:21:58.757048 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 16 12:21:58.757103 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 16 12:21:58.757154 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 12:21:58.757257 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 16 12:21:58.757332 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.757392 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Dec 16 12:21:58.757451 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 16 12:21:58.757513 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Dec 16 12:21:58.757570 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 16 12:21:58.757635 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.757693 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Dec 16 12:21:58.757750 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 16 12:21:58.757806 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Dec 16 12:21:58.757877 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.757938 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Dec 16 12:21:58.757995 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 16 12:21:58.758058 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Dec 16 12:21:58.758116 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 16 12:21:58.758182 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.758256 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Dec 16 12:21:58.758315 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 16 12:21:58.758375 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 16 12:21:58.758441 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.758498 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Dec 16 12:21:58.758555 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 16 12:21:58.758611 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Dec 16 12:21:58.758667 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 16 12:21:58.758739 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.758801 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Dec 16 12:21:58.758857 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 16 12:21:58.758914 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Dec 16 12:21:58.758971 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 16 12:21:58.759038 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.759099 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Dec 16 12:21:58.759156 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 16 12:21:58.759236 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.759296 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Dec 16 12:21:58.759353 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 16 12:21:58.759435 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.759495 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Dec 16 12:21:58.759551 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 16 12:21:58.759616 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.759677 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Dec 16 12:21:58.759734 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 16 12:21:58.759801 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.759869 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Dec 16 12:21:58.759934 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 16 12:21:58.760002 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.760067 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Dec 16 12:21:58.760125 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 16 12:21:58.760190 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.760267 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Dec 16 12:21:58.760327 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 16 12:21:58.760392 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.760460 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Dec 16 12:21:58.760521 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 16 12:21:58.760591 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.760657 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Dec 16 12:21:58.760718 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 16 12:21:58.760787 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.760852 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Dec 16 12:21:58.760913 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 16 12:21:58.760981 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.761040 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Dec 16 12:21:58.761104 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 16 12:21:58.761168 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.761239 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Dec 16 12:21:58.761297 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 16 12:21:58.761357 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Dec 16 12:21:58.761413 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Dec 16 12:21:58.761477 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.761534 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Dec 16 12:21:58.761591 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 16 12:21:58.761646 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Dec 16 12:21:58.761702 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Dec 16 12:21:58.761775 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.761832 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Dec 16 12:21:58.761888 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 16 12:21:58.761944 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Dec 16 12:21:58.762000 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Dec 16 12:21:58.762063 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.762121 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Dec 16 12:21:58.762185 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 16 12:21:58.762256 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Dec 16 12:21:58.762323 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Dec 16 12:21:58.762389 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.762452 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Dec 16 12:21:58.762512 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 16 12:21:58.762569 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Dec 16 12:21:58.762628 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Dec 16 12:21:58.762698 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.762757 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Dec 16 12:21:58.762813 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 16 12:21:58.762869 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Dec 16 12:21:58.762925 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Dec 16 12:21:58.762988 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.763045 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Dec 16 12:21:58.763104 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 16 12:21:58.763160 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Dec 16 12:21:58.763270 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Dec 16 12:21:58.763337 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.763408 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Dec 16 12:21:58.763467 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 16 12:21:58.763523 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Dec 16 12:21:58.763583 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Dec 16 12:21:58.763653 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.763724 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Dec 16 12:21:58.763804 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 16 12:21:58.763864 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Dec 16 12:21:58.763920 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 16 12:21:58.763984 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.764049 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Dec 16 12:21:58.764106 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 16 12:21:58.764162 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Dec 16 12:21:58.764234 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 16 12:21:58.764308 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.764372 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Dec 16 12:21:58.764435 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 16 12:21:58.764498 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Dec 16 12:21:58.764554 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 16 12:21:58.764630 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.764693 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Dec 16 12:21:58.764759 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 16 12:21:58.764815 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Dec 16 12:21:58.764876 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Dec 16 12:21:58.764941 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.764998 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Dec 16 12:21:58.765062 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 16 12:21:58.765120 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Dec 16 12:21:58.765177 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Dec 16 12:21:58.765264 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.765323 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Dec 16 12:21:58.765387 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 16 12:21:58.765452 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Dec 16 12:21:58.765511 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Dec 16 12:21:58.765580 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.765639 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Dec 16 12:21:58.765698 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 16 12:21:58.765757 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Dec 16 12:21:58.765822 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Dec 16 12:21:58.765887 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:21:58.765951 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Dec 16 12:21:58.766011 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 16 12:21:58.766071 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Dec 16 12:21:58.766136 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Dec 16 12:21:58.766219 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 12:21:58.766281 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Dec 16 12:21:58.766346 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 16 12:21:58.766407 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 16 12:21:58.766475 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 12:21:58.766536 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Dec 16 12:21:58.766605 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 16 12:21:58.766664 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 16 12:21:58.766723 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 16 12:21:58.766794 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:21:58.766854 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 16 12:21:58.766924 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:21:58.766999 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 16 12:21:58.767062 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 16 12:21:58.767129 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 16 12:21:58.767189 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 16 12:21:58.767258 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 16 12:21:58.767319 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 16 12:21:58.767392 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:21:58.767452 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:21:58.767513 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 16 12:21:58.767571 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 16 12:21:58.767630 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 16 12:21:58.767696 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 16 12:21:58.767756 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:21:58.767815 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:21:58.767893 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 16 12:21:58.767952 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 16 12:21:58.768011 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 16 12:21:58.768077 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 16 12:21:58.768139 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:21:58.768208 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:21:58.768274 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 16 12:21:58.768333 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:21:58.768390 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:21:58.768449 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 12:21:58.768506 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:21:58.768566 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:21:58.768626 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 12:21:58.768682 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:21:58.768743 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:21:58.768806 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 12:21:58.768864 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:21:58.768924 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:21:58.768986 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 16 12:21:58.769056 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:21:58.769115 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:21:58.769179 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 16 12:21:58.769252 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:21:58.769318 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:21:58.769385 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 16 12:21:58.769443 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:21:58.769504 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:21:58.769567 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 16 12:21:58.769626 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 16 12:21:58.769684 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 16 12:21:58.769747 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 16 12:21:58.769808 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000
Dec 16 12:21:58.769866 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000
Dec 16 12:21:58.769928 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 16 12:21:58.769987 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000
Dec 16 12:21:58.770052 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000
Dec 16 12:21:58.770117 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Dec 16 12:21:58.770175 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000
Dec 16 12:21:58.770264 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000
Dec 16 12:21:58.770329 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Dec 16 12:21:58.770386 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000
Dec 16 12:21:58.770442 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000
Dec 16 12:21:58.770502 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Dec 16 12:21:58.770562 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000
Dec 16 12:21:58.770627 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000
Dec 16 12:21:58.770689 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Dec 16 12:21:58.770748 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000
Dec 16 12:21:58.770805 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000
Dec 16 12:21:58.770865 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Dec 16 12:21:58.770923 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000
Dec 16 12:21:58.770981 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000
Dec 16 12:21:58.771043 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Dec 16 12:21:58.771101 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000
Dec 16 12:21:58.771158 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000
Dec 16 12:21:58.771229 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Dec 16 12:21:58.771288 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000
Dec 16 12:21:58.771345 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000
Dec 16 12:21:58.771421 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Dec 16 12:21:58.771486 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000
Dec 16 12:21:58.771543 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000
Dec 16 12:21:58.771604 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Dec 16 12:21:58.771663 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000
Dec 16 12:21:58.771722 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000
Dec 16 12:21:58.771784 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Dec 16 12:21:58.771846 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000
Dec 16 12:21:58.771904 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000
Dec 16 12:21:58.771980 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Dec 16 12:21:58.772038 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000
Dec 16 12:21:58.772106 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000
Dec 16 12:21:58.772167 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Dec 16 12:21:58.772248 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000
Dec 16 12:21:58.772312 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000
Dec 16 12:21:58.772372 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Dec 16 12:21:58.772430 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000
Dec 16 12:21:58.772486 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000
Dec 16 12:21:58.772546 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Dec 16 12:21:58.772603 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000
Dec 16 12:21:58.772659 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000
Dec 16 12:21:58.772728 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Dec 16 12:21:58.772788 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000
Dec 16 12:21:58.772849 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000
Dec 16 12:21:58.772918 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Dec 16 12:21:58.772980 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000
Dec 16 12:21:58.773037 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000
Dec 16 12:21:58.773099 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Dec 16 12:21:58.773160 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000
Dec 16 12:21:58.773239 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000
Dec 16 12:21:58.773307 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Dec 16 12:21:58.773380 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000
Dec 16 12:21:58.773438 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000
Dec 16 12:21:58.773498 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Dec 16 12:21:58.773557 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Dec 16 12:21:58.773619 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Dec 16 12:21:58.773677 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Dec 16 12:21:58.773736 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Dec 16 12:21:58.773793 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Dec 16 12:21:58.773853 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Dec 16 12:21:58.773909 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Dec 16 12:21:58.773967 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Dec 16 12:21:58.774027 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Dec 16 12:21:58.774086 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Dec 16 12:21:58.774144 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Dec 16 12:21:58.774214 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Dec 16 12:21:58.774280 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Dec 16 12:21:58.774342 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Dec 16 12:21:58.774402 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Dec 16 12:21:58.774472 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Dec 16 12:21:58.774556 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Dec 16 12:21:58.774617 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned
Dec 16 12:21:58.774677 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned
Dec 16 12:21:58.774735 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned
Dec 16 12:21:58.774792 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned
Dec 16 12:21:58.774850 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned
Dec 16 12:21:58.774907 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned
Dec 16 12:21:58.774966 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned
Dec 16 12:21:58.775033 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned
Dec 16 12:21:58.775099 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned
Dec 16 12:21:58.775159 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned
Dec 16 12:21:58.775234 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned
Dec 16 12:21:58.775303 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned
Dec 16 12:21:58.775386 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned
Dec 16 12:21:58.775451 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned
Dec 16 12:21:58.775512 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned
Dec 16 12:21:58.775586 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned
Dec 16 12:21:58.775646 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned
Dec 16 12:21:58.775703 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned
Dec 16 12:21:58.775762 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned
Dec 16 12:21:58.775825 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned
Dec 16 12:21:58.775886 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned
Dec 16 12:21:58.775961 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned
Dec 16 12:21:58.776023 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned
Dec 16 12:21:58.776080 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned
Dec 16 12:21:58.776140 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned
Dec 16 12:21:58.776209 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned
Dec 16 12:21:58.776271 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned
Dec 16 12:21:58.776329 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned
Dec 16 12:21:58.776387 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned
Dec 16 12:21:58.776445 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned
Dec 16 12:21:58.776506 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned
Dec 16 12:21:58.776563 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned
Dec 16 12:21:58.776622 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned
Dec 16 12:21:58.776682 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned
Dec 16 12:21:58.776740 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned
Dec 16 12:21:58.776797 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned
Dec 16 12:21:58.776855 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned
Dec 16 12:21:58.776912 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned
Dec 16 12:21:58.776972 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned
Dec 16 12:21:58.777029 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned
Dec 16 12:21:58.777086 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned
Dec 16 12:21:58.777142 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned
Dec 16 12:21:58.777218 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned
Dec 16 12:21:58.777282 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned
Dec 16 12:21:58.777341 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned
Dec 16 12:21:58.777398 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned
Dec 16 12:21:58.777460 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned
Dec 16 12:21:58.777517 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned
Dec 16 12:21:58.777575 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned
Dec 16 12:21:58.777631 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 12:21:58.777689 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned
Dec 16 12:21:58.777746 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 12:21:58.777804 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned
Dec 16 12:21:58.777862 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 12:21:58.777920 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned
Dec 16 12:21:58.777977 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 16 12:21:58.778035 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned
Dec 16 12:21:58.778091 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 16 12:21:58.778149 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned
Dec 16 12:21:58.778219 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 16 12:21:58.778280 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned
Dec 16 12:21:58.778341 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 16 12:21:58.778399 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned
Dec 16 12:21:58.778459 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 16 12:21:58.778517 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned
Dec 16 12:21:58.778574 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 16 12:21:58.778632 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned
Dec 16 12:21:58.778689 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned
Dec 16 12:21:58.778746 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned
Dec 16 12:21:58.778805 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned
Dec 16 12:21:58.778864 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned
Dec 16 12:21:58.778920 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned
Dec 16 12:21:58.778979 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned
Dec 16 12:21:58.779036 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned
Dec 16 12:21:58.779093 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned
Dec 16 12:21:58.779157 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned
Dec 16 12:21:58.779225 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned
Dec 16 12:21:58.779286 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned
Dec 16 12:21:58.779345 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned
Dec 16 12:21:58.779421 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.779480 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.779538 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned
Dec 16 12:21:58.779597 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.779654 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.779713 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned
Dec 16 12:21:58.779769 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.779825 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.779883 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned
Dec 16 12:21:58.779940 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.779998 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.780059 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned
Dec 16 12:21:58.780116 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.780180 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.780259 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned
Dec 16 12:21:58.780318 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.780374 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.780432 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned
Dec 16 12:21:58.780488 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.780548 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.780605 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned
Dec 16 12:21:58.780662 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.780718 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.780776 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned
Dec 16 12:21:58.780835 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.780893 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.780954 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned
Dec 16 12:21:58.781014 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.781072 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.781130 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned
Dec 16 12:21:58.781188 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.781260 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.781318 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned
Dec 16 12:21:58.781379 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.781437 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.781501 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned
Dec 16 12:21:58.781561 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.781618 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.781676 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned
Dec 16 12:21:58.781734 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.781792 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.781852 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned
Dec 16 12:21:58.781910 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.781969 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.782030 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned
Dec 16 12:21:58.782090 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.782156 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.782231 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned
Dec 16 12:21:58.782308 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.782366 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.782427 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned
Dec 16 12:21:58.782489 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.782548 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.782607 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 12:21:58.782666 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 12:21:58.782725 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 12:21:58.782803 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned
Dec 16 12:21:58.782865 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 16 12:21:58.782924 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned
Dec 16 12:21:58.782998 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned
Dec 16 12:21:58.783057 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned
Dec 16 12:21:58.783119 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 16 12:21:58.783177 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned
Dec 16 12:21:58.783255 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned
Dec 16 12:21:58.783315 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned
Dec 16 12:21:58.783389 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned
Dec 16 12:21:58.783451 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned
Dec 16 12:21:58.783509 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned
Dec 16 12:21:58.783579 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.783643 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.783707 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.783765 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.783824 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.783881 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.783940 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.783998 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784056 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784131 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784213 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784281 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784344 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784402 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784468 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784525 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784590 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784654 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784718 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784776 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784841 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.784900 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.784959 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785020 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785079 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785141 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785213 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785283 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785349 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785409 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785468 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785529 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785587 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785644 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785707 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 12:21:58.785770 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign
Dec 16 12:21:58.785839 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Dec 16 12:21:58.785910 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Dec 16 12:21:58.785971 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Dec 16 12:21:58.786030 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 16 12:21:58.786088 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]
Dec 16 12:21:58.786145 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 16 12:21:58.786222 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Dec 16 12:21:58.786290 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 16 12:21:58.786356 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]
Dec 16 12:21:58.786414 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 16 12:21:58.786480 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Dec 16 12:21:58.786540 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Dec 16 12:21:58.786599 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 16 12:21:58.786657 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]
Dec 16 12:21:58.786715 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 16 12:21:58.786779 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Dec 16 12:21:58.786839 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 16 12:21:58.786900 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]
Dec 16 12:21:58.786959 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 16 12:21:58.787025 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Dec 16 12:21:58.787086 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Dec 16 12:21:58.787144 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 16 12:21:58.787212 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]
Dec 16 12:21:58.787274 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 16 12:21:58.787338
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 12:21:58.787413 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 12:21:58.787473 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Dec 16 12:21:58.787532 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:21:58.787590 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:21:58.787649 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Dec 16 12:21:58.787708 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:21:58.787769 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:21:58.787827 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Dec 16 12:21:58.787885 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:21:58.787942 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:21:58.788002 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Dec 16 12:21:58.788060 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:21:58.788121 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:21:58.788179 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Dec 16 12:21:58.788258 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Dec 16 12:21:58.788319 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:21:58.788381 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Dec 16 12:21:58.788442 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Dec 16 12:21:58.788510 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:21:58.788574 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Dec 16 12:21:58.788647 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Dec 16 12:21:58.788705 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:21:58.788766 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Dec 16 12:21:58.788825 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Dec 16 12:21:58.788882 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:21:58.788942 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Dec 16 12:21:58.789000 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Dec 16 12:21:58.789056 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:21:58.789115 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Dec 16 12:21:58.789171 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Dec 16 12:21:58.789249 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:21:58.789312 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Dec 16 12:21:58.789370 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Dec 16 12:21:58.789433 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:21:58.789492 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Dec 16 12:21:58.789550 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Dec 16 12:21:58.789608 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:21:58.789668 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Dec 16 12:21:58.789731 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Dec 16 12:21:58.789791 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:21:58.789852 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 16 12:21:58.789911 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Dec 16 12:21:58.789968 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Dec 16 12:21:58.790027 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:21:58.790086 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 16 12:21:58.790145 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Dec 16 12:21:58.790237 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Dec 16 12:21:58.790298 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:21:58.790359 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 16 12:21:58.790420 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Dec 16 12:21:58.790480 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Dec 16 12:21:58.790540 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:21:58.790601 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 16 12:21:58.790660 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Dec 16 12:21:58.790723 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Dec 16 12:21:58.790781 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:21:58.790840 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 16 12:21:58.790897 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Dec 16 12:21:58.790955 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Dec 16 12:21:58.791047 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:21:58.791112 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 16 12:21:58.791172 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Dec 16 12:21:58.791246 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Dec 16 12:21:58.791309 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:21:58.791381 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 16 12:21:58.791444 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Dec 16 12:21:58.791502 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Dec 16 12:21:58.791565 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:21:58.791626 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 16 12:21:58.791685 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Dec 16 12:21:58.791742 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Dec 16 12:21:58.791804 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:21:58.791864 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 16 12:21:58.791921 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Dec 16 12:21:58.791982 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Dec 16 12:21:58.792040 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:21:58.792100 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 16 12:21:58.792162 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Dec 16 12:21:58.792235 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Dec 16 
12:21:58.792298 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:21:58.792358 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 16 12:21:58.792417 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Dec 16 12:21:58.792473 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Dec 16 12:21:58.792530 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Dec 16 12:21:58.792589 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 16 12:21:58.792656 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Dec 16 12:21:58.792714 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Dec 16 12:21:58.792773 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:21:58.792838 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 16 12:21:58.792896 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Dec 16 12:21:58.792952 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Dec 16 12:21:58.793009 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:21:58.793075 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 16 12:21:58.793133 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Dec 16 12:21:58.793190 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Dec 16 12:21:58.793265 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:21:58.793331 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 16 12:21:58.793389 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Dec 16 12:21:58.793446 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Dec 16 12:21:58.793503 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:21:58.793566 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 12:21:58.793618 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 16 12:21:58.793671 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 12:21:58.793744 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 12:21:58.793802 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:21:58.793864 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 12:21:58.793918 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:21:58.793979 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 12:21:58.794032 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:21:58.794105 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 12:21:58.794162 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:21:58.794251 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 12:21:58.794313 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:21:58.794375 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 12:21:58.794429 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:21:58.794506 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 12:21:58.794560 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:21:58.794621 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 
12:21:58.794675 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:21:58.794740 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 12:21:58.794794 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:21:58.794858 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Dec 16 12:21:58.794916 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:21:58.794984 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Dec 16 12:21:58.795044 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:21:58.795105 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Dec 16 12:21:58.795159 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:21:58.795263 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Dec 16 12:21:58.795325 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:21:58.795408 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Dec 16 12:21:58.795477 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:21:58.795539 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Dec 16 12:21:58.795592 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:21:58.795658 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Dec 16 12:21:58.795715 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:21:58.795776 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Dec 16 12:21:58.795836 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:21:58.795896 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Dec 16 12:21:58.795955 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:21:58.796022 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Dec 16 12:21:58.796081 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Dec 16 12:21:58.796135 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:21:58.796208 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Dec 16 12:21:58.796266 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Dec 16 12:21:58.796319 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:21:58.796380 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Dec 16 12:21:58.796443 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Dec 16 12:21:58.796510 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:21:58.796577 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Dec 16 12:21:58.796632 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Dec 16 12:21:58.796685 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:21:58.796745 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Dec 16 12:21:58.796799 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Dec 16 12:21:58.796855 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:21:58.796919 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Dec 16 12:21:58.796973 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Dec 16 12:21:58.797026 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:21:58.797086 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Dec 16 12:21:58.797139 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Dec 16 12:21:58.797205 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:21:58.797269 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Dec 16 12:21:58.797323 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Dec 16 12:21:58.797376 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:21:58.797437 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 16 12:21:58.797497 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Dec 16 12:21:58.797552 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:21:58.797615 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Dec 16 12:21:58.797671 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Dec 16 12:21:58.797725 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:21:58.797787 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Dec 16 12:21:58.797841 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Dec 16 12:21:58.797895 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Dec 16 12:21:58.797954 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Dec 16 12:21:58.798011 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Dec 16 12:21:58.798069 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:21:58.798130 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Dec 16 12:21:58.798184 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Dec 16 12:21:58.798254 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:21:58.798318 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Dec 16 12:21:58.798378 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Dec 16 12:21:58.798435 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:21:58.798500 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Dec 16 12:21:58.798554 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Dec 16 12:21:58.798607 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:21:58.798617 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 12:21:58.798624 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 12:21:58.798632 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 12:21:58.798642 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 12:21:58.798650 kernel: iommu: Default domain type: Translated Dec 16 12:21:58.798657 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 12:21:58.798664 kernel: efivars: Registered efivars operations Dec 16 12:21:58.798671 kernel: vgaarb: loaded Dec 16 12:21:58.798679 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 12:21:58.798686 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:21:58.798694 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:21:58.798701 kernel: pnp: PnP ACPI init Dec 16 12:21:58.798768 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 12:21:58.798780 kernel: pnp: PnP ACPI: found 1 devices Dec 16 12:21:58.798788 kernel: NET: Registered 
PF_INET protocol family Dec 16 12:21:58.798796 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:21:58.798803 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 16 12:21:58.798811 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:21:58.798818 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 12:21:58.798827 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 16 12:21:58.798835 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 16 12:21:58.798844 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:21:58.798852 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:21:58.798859 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:21:58.798927 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 12:21:58.798938 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:21:58.798946 kernel: kvm [1]: HYP mode not available Dec 16 12:21:58.798953 kernel: Initialise system trusted keyrings Dec 16 12:21:58.798966 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 12:21:58.798976 kernel: Key type asymmetric registered Dec 16 12:21:58.798983 kernel: Asymmetric key parser 'x509' registered Dec 16 12:21:58.798994 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 12:21:58.799002 kernel: io scheduler mq-deadline registered Dec 16 12:21:58.799009 kernel: io scheduler kyber registered Dec 16 12:21:58.799019 kernel: io scheduler bfq registered Dec 16 12:21:58.799027 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 12:21:58.799110 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Dec 16 12:21:58.799176 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Dec 16 12:21:58.799253 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.799314 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Dec 16 12:21:58.799385 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Dec 16 12:21:58.799449 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.799510 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Dec 16 12:21:58.799568 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Dec 16 12:21:58.799626 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.799686 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Dec 16 12:21:58.799747 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Dec 16 12:21:58.799805 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.799864 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Dec 16 12:21:58.799922 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Dec 16 12:21:58.799980 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.800047 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Dec 16 12:21:58.800114 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Dec 16 12:21:58.800175 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.800260 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Dec 16 12:21:58.800321 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Dec 16 12:21:58.800378 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.800437 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Dec 16 12:21:58.800495 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Dec 16 12:21:58.800552 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.800563 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 12:21:58.800620 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Dec 16 12:21:58.800680 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Dec 16 12:21:58.800752 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.800811 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Dec 16 12:21:58.800879 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Dec 16 12:21:58.800939 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.800998 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Dec 16 12:21:58.801060 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Dec 16 12:21:58.801121 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.801184 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Dec 16 12:21:58.801259 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Dec 16 12:21:58.801318 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.801377 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Dec 16 12:21:58.801435 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Dec 16 12:21:58.801492 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.801551 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Dec 16 12:21:58.801612 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Dec 16 12:21:58.801671 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.801736 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Dec 16 12:21:58.801795 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Dec 16 12:21:58.801852 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.801915 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Dec 16 12:21:58.801974 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Dec 16 12:21:58.802032 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Dec 16 12:21:58.802048 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 12:21:58.802107 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Dec 16 12:21:58.802164 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Dec 16 12:21:58.802243 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.802303 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Dec 16 12:21:58.802360 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Dec 16 12:21:58.802428 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.802489 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Dec 16 12:21:58.802550 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Dec 16 12:21:58.802610 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.802670 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Dec 16 12:21:58.802728 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Dec 16 12:21:58.802785 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.802844 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Dec 16 12:21:58.802901 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Dec 16 12:21:58.802960 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.803020 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Dec 16 12:21:58.803077 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Dec 16 12:21:58.803135 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.803201 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Dec 16 12:21:58.803263 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Dec 16 12:21:58.803320 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.803390 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Dec 16 12:21:58.803455 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Dec 16 12:21:58.803514 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.803528 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 12:21:58.803587 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Dec 16 12:21:58.803645 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Dec 16 12:21:58.803703 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.803762 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Dec 16 12:21:58.803819 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Dec 16 12:21:58.803879 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.803938 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Dec 16 
12:21:58.804002 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Dec 16 12:21:58.804064 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.804125 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Dec 16 12:21:58.804182 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Dec 16 12:21:58.804265 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.804328 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Dec 16 12:21:58.804390 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Dec 16 12:21:58.804448 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.804506 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Dec 16 12:21:58.804564 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Dec 16 12:21:58.804622 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.804682 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Dec 16 12:21:58.804740 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Dec 16 12:21:58.804796 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.804858 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Dec 16 12:21:58.804921 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Dec 16 12:21:58.804983 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.805044 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Dec 16 12:21:58.805101 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Dec 16 12:21:58.805163 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:21:58.805173 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 12:21:58.805183 kernel: ACPI: button: Power Button [PWRB] Dec 16 12:21:58.805262 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Dec 16 12:21:58.805340 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 12:21:58.805351 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:21:58.805358 kernel: thunder_xcv, ver 1.0 Dec 16 12:21:58.805366 kernel: thunder_bgx, ver 1.0 Dec 16 12:21:58.805373 kernel: nicpf, ver 1.0 Dec 16 12:21:58.805380 kernel: nicvf, ver 1.0 Dec 16 12:21:58.805455 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 12:21:58.805514 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T12:21:58 UTC (1765887718) Dec 16 12:21:58.805524 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 12:21:58.805532 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 12:21:58.805539 kernel: watchdog: NMI not fully supported Dec 16 12:21:58.805547 kernel: watchdog: Hard watchdog permanently disabled Dec 16 12:21:58.805554 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:21:58.805561 kernel: Segment Routing with IPv6 Dec 16 12:21:58.805568 kernel: In-situ OAM (IOAM) with 
IPv6 Dec 16 12:21:58.805578 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:21:58.805585 kernel: Key type dns_resolver registered Dec 16 12:21:58.805592 kernel: registered taskstats version 1 Dec 16 12:21:58.805599 kernel: Loading compiled-in X.509 certificates Dec 16 12:21:58.805607 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a' Dec 16 12:21:58.805614 kernel: Demotion targets for Node 0: null Dec 16 12:21:58.805621 kernel: Key type .fscrypt registered Dec 16 12:21:58.805628 kernel: Key type fscrypt-provisioning registered Dec 16 12:21:58.805636 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 12:21:58.805649 kernel: ima: Allocated hash algorithm: sha1 Dec 16 12:21:58.805657 kernel: ima: No architecture policies found Dec 16 12:21:58.805664 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 16 12:21:58.805671 kernel: clk: Disabling unused clocks Dec 16 12:21:58.805679 kernel: PM: genpd: Disabling unused power domains Dec 16 12:21:58.805686 kernel: Warning: unable to open an initial console. Dec 16 12:21:58.805693 kernel: Freeing unused kernel memory: 39552K Dec 16 12:21:58.805700 kernel: Run /init as init process Dec 16 12:21:58.805708 kernel: with arguments: Dec 16 12:21:58.805717 kernel: /init Dec 16 12:21:58.805724 kernel: with environment: Dec 16 12:21:58.805731 kernel: HOME=/ Dec 16 12:21:58.805738 kernel: TERM=linux Dec 16 12:21:58.805746 systemd[1]: Successfully made /usr/ read-only. Dec 16 12:21:58.805756 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:21:58.805765 systemd[1]: Detected virtualization kvm. Dec 16 12:21:58.805772 systemd[1]: Detected architecture arm64. Dec 16 12:21:58.805781 systemd[1]: Running in initrd. Dec 16 12:21:58.805789 systemd[1]: No hostname configured, using default hostname. Dec 16 12:21:58.805800 systemd[1]: Hostname set to <localhost>. Dec 16 12:21:58.805808 systemd[1]: Initializing machine ID from VM UUID. Dec 16 12:21:58.805815 systemd[1]: Queued start job for default target initrd.target. Dec 16 12:21:58.805823 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:21:58.805839 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:21:58.805849 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 12:21:58.805857 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:21:58.805865 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 12:21:58.805875 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 12:21:58.805885 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 16 12:21:58.805893 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 16 12:21:58.805901 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Dec 16 12:21:58.805909 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:21:58.805917 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:21:58.805925 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:21:58.805934 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:21:58.805942 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:21:58.805951 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:21:58.805959 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:21:58.805970 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 12:21:58.805978 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 12:21:58.805987 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:21:58.805995 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:21:58.806003 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:21:58.806015 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:21:58.806024 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 12:21:58.806036 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:21:58.806045 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 12:21:58.806053 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 12:21:58.806061 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 12:21:58.806069 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:21:58.806079 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:21:58.806087 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:21:58.806095 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 12:21:58.806103 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:21:58.806113 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 12:21:58.806122 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:21:58.806154 systemd-journald[312]: Collecting audit messages is disabled. Dec 16 12:21:58.806175 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 12:21:58.806184 kernel: Bridge firewalling registered Dec 16 12:21:58.806215 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:21:58.806226 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:21:58.806237 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 12:21:58.806246 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:21:58.806254 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:21:58.806262 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:21:58.806270 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Dec 16 12:21:58.806282 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:21:58.806290 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:21:58.806298 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 12:21:58.806308 systemd-journald[312]: Journal started Dec 16 12:21:58.806327 systemd-journald[312]: Runtime Journal (/run/log/journal/a1664cffe8504a639b52ae91a8acd3a0) is 8M, max 319.5M, 311.5M free. Dec 16 12:21:58.748965 systemd-modules-load[313]: Inserted module 'overlay' Dec 16 12:21:58.764224 systemd-modules-load[313]: Inserted module 'br_netfilter' Dec 16 12:21:58.815344 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:21:58.815887 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:21:58.823239 systemd-tmpfiles[351]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 12:21:58.825755 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:21:58.827733 dracut-cmdline[344]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52 Dec 16 12:21:58.832503 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:21:58.869990 systemd-resolved[380]: Positive Trust Anchors: Dec 16 12:21:58.870011 systemd-resolved[380]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:21:58.870042 systemd-resolved[380]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:21:58.875621 systemd-resolved[380]: Defaulting to hostname 'linux'. Dec 16 12:21:58.876655 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:21:58.878078 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:21:58.900228 kernel: SCSI subsystem initialized Dec 16 12:21:58.905214 kernel: Loading iSCSI transport class v2.0-870. Dec 16 12:21:58.912224 kernel: iscsi: registered transport (tcp) Dec 16 12:21:58.925224 kernel: iscsi: registered transport (qla4xxx) Dec 16 12:21:58.925243 kernel: QLogic iSCSI HBA Driver Dec 16 12:21:58.942293 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:21:58.967070 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:21:58.968490 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:21:59.014943 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. 
Dec 16 12:21:59.017142 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 12:21:59.079249 kernel: raid6: neonx8 gen() 15761 MB/s Dec 16 12:21:59.096211 kernel: raid6: neonx4 gen() 15751 MB/s Dec 16 12:21:59.113206 kernel: raid6: neonx2 gen() 13215 MB/s Dec 16 12:21:59.130243 kernel: raid6: neonx1 gen() 10461 MB/s Dec 16 12:21:59.147209 kernel: raid6: int64x8 gen() 6897 MB/s Dec 16 12:21:59.164220 kernel: raid6: int64x4 gen() 7363 MB/s Dec 16 12:21:59.181212 kernel: raid6: int64x2 gen() 6089 MB/s Dec 16 12:21:59.198221 kernel: raid6: int64x1 gen() 5040 MB/s Dec 16 12:21:59.198278 kernel: raid6: using algorithm neonx8 gen() 15761 MB/s Dec 16 12:21:59.215261 kernel: raid6: .... xor() 12046 MB/s, rmw enabled Dec 16 12:21:59.215311 kernel: raid6: using neon recovery algorithm Dec 16 12:21:59.220430 kernel: xor: measuring software checksum speed Dec 16 12:21:59.220455 kernel: 8regs : 21658 MB/sec Dec 16 12:21:59.221530 kernel: 32regs : 21693 MB/sec Dec 16 12:21:59.221545 kernel: arm64_neon : 28013 MB/sec Dec 16 12:21:59.221554 kernel: xor: using function: arm64_neon (28013 MB/sec) Dec 16 12:21:59.274231 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 12:21:59.280710 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:21:59.283076 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:21:59.314217 systemd-udevd[567]: Using default interface naming scheme 'v255'. Dec 16 12:21:59.318326 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:21:59.320280 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 12:21:59.348515 dracut-pre-trigger[576]: rd.md=0: removing MD RAID activation Dec 16 12:21:59.373095 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:21:59.375637 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:21:59.454013 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:21:59.456732 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 12:21:59.507223 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Dec 16 12:21:59.519056 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Dec 16 12:21:59.534399 kernel: ACPI: bus type USB registered Dec 16 12:21:59.534456 kernel: usbcore: registered new interface driver usbfs Dec 16 12:21:59.538219 kernel: usbcore: registered new interface driver hub Dec 16 12:21:59.539217 kernel: usbcore: registered new device driver usb Dec 16 12:21:59.557155 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:21:59.557329 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:21:59.560686 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:21:59.565025 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:21:59.565235 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 16 12:21:59.565330 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 12:21:59.566656 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Dec 16 12:21:59.570630 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:21:59.570898 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 16 12:21:59.571022 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 16 12:21:59.571105 kernel: hub 1-0:1.0: USB hub found Dec 16 12:21:59.571223 kernel: hub 1-0:1.0: 4 ports detected Dec 16 12:21:59.572674 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 16 12:21:59.572717 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 12:21:59.574518 kernel: GPT:17805311 != 104857599 Dec 16 12:21:59.574543 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 12:21:59.574553 kernel: GPT:17805311 != 104857599 Dec 16 12:21:59.574562 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 12:21:59.575429 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 12:21:59.576441 kernel: hub 2-0:1.0: USB hub found Dec 16 12:21:59.576675 kernel: hub 2-0:1.0: 4 ports detected Dec 16 12:21:59.590944 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:21:59.632105 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 16 12:21:59.633359 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 12:21:59.641852 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 12:21:59.650062 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 16 12:21:59.656510 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 16 12:21:59.657460 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 16 12:21:59.660306 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:21:59.662262 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:21:59.663891 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:21:59.666220 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 12:21:59.667705 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 12:21:59.688789 disk-uuid[668]: Primary Header is updated. Dec 16 12:21:59.688789 disk-uuid[668]: Secondary Entries is updated. Dec 16 12:21:59.688789 disk-uuid[668]: Secondary Header is updated. Dec 16 12:21:59.691603 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Dec 16 12:21:59.696217 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 12:21:59.810245 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 12:21:59.940658 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 16 12:21:59.940730 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 16 12:21:59.941505 kernel: usbcore: registered new interface driver usbhid Dec 16 12:21:59.942214 kernel: usbhid: USB HID core driver Dec 16 12:22:00.046240 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 16 12:22:00.171233 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 16 12:22:00.223262 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 16 12:22:00.707219 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 12:22:00.707445 disk-uuid[673]: The operation has completed successfully. Dec 16 12:22:00.753923 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 12:22:00.754027 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 12:22:00.773340 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 16 12:22:00.792326 sh[689]: Success Dec 16 12:22:00.805702 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 12:22:00.805782 kernel: device-mapper: uevent: version 1.0.3 Dec 16 12:22:00.807212 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 12:22:00.814224 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 16 12:22:00.864690 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:22:00.867494 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 16 12:22:00.882559 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 16 12:22:00.896225 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (701) Dec 16 12:22:00.900209 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248 Dec 16 12:22:00.900233 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:22:00.913599 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 12:22:00.913659 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 12:22:00.916174 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 16 12:22:00.917285 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:22:00.918274 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 12:22:00.919086 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 12:22:00.921740 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Dec 16 12:22:00.965250 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (735) Dec 16 12:22:00.967733 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 16 12:22:00.967768 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:22:00.973741 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:22:00.973794 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:22:00.978234 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 16 12:22:00.978825 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 12:22:00.981245 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 12:22:01.022916 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:22:01.026089 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:22:01.060940 systemd-networkd[872]: lo: Link UP Dec 16 12:22:01.060953 systemd-networkd[872]: lo: Gained carrier Dec 16 12:22:01.062114 systemd-networkd[872]: Enumeration completed Dec 16 12:22:01.062258 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:22:01.062842 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 12:22:01.062846 systemd-networkd[872]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:22:01.063305 systemd-networkd[872]: eth0: Link UP Dec 16 12:22:01.063322 systemd[1]: Reached target network.target - Network. Dec 16 12:22:01.064385 systemd-networkd[872]: eth0: Gained carrier Dec 16 12:22:01.064396 systemd-networkd[872]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 12:22:01.100490 systemd-networkd[872]: eth0: DHCPv4 address 10.0.23.32/25, gateway 10.0.23.1 acquired from 10.0.23.1 Dec 16 12:22:01.155384 ignition[814]: Ignition 2.22.0 Dec 16 12:22:01.156088 ignition[814]: Stage: fetch-offline Dec 16 12:22:01.156139 ignition[814]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:01.156147 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:01.158186 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:22:01.156252 ignition[814]: parsed url from cmdline: "" Dec 16 12:22:01.156255 ignition[814]: no config URL provided Dec 16 12:22:01.156260 ignition[814]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:22:01.160338 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
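The DHCPv4 lease logged above (10.0.23.32/25 via gateway 10.0.23.1) can be sanity-checked with the standard library alone; this sketch only re-derives the on-link network from the values in the log:

import ipaddress

iface = ipaddress.ip_interface("10.0.23.32/25")   # address as leased above
gateway = ipaddress.ip_address("10.0.23.1")

print(iface.network)             # 10.0.23.0/25, 126 usable host addresses
print(gateway in iface.network)  # True: the gateway is on-link for eth0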
Dec 16 12:22:01.156266 ignition[814]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:22:01.156271 ignition[814]: failed to fetch config: resource requires networking Dec 16 12:22:01.156423 ignition[814]: Ignition finished successfully Dec 16 12:22:01.187769 ignition[889]: Ignition 2.22.0 Dec 16 12:22:01.187791 ignition[889]: Stage: fetch Dec 16 12:22:01.187923 ignition[889]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:01.187932 ignition[889]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:01.188006 ignition[889]: parsed url from cmdline: "" Dec 16 12:22:01.188009 ignition[889]: no config URL provided Dec 16 12:22:01.188014 ignition[889]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:22:01.188020 ignition[889]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:22:01.188226 ignition[889]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 16 12:22:01.188457 ignition[889]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 16 12:22:01.188563 ignition[889]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 16 12:22:01.442334 ignition[889]: GET result: OK Dec 16 12:22:01.442597 ignition[889]: parsing config with SHA512: 830be69322c442c8f43dc04d7fe083503dca3037c42645c50b74bc16ff31e40874cac59bf4c03249a071a43f7e20a1103560f79dcf398a06b7939eef161383c7 Dec 16 12:22:01.447695 unknown[889]: fetched base config from "system" Dec 16 12:22:01.448025 ignition[889]: fetch: fetch complete Dec 16 12:22:01.447705 unknown[889]: fetched base config from "system" Dec 16 12:22:01.448030 ignition[889]: fetch: fetch passed Dec 16 12:22:01.447710 unknown[889]: fetched user config from "openstack" Dec 16 12:22:01.448066 ignition[889]: Ignition finished successfully Dec 16 12:22:01.449986 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 12:22:01.451789 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 12:22:01.489498 ignition[897]: Ignition 2.22.0 Dec 16 12:22:01.489517 ignition[897]: Stage: kargs Dec 16 12:22:01.489650 ignition[897]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:01.489659 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:01.490387 ignition[897]: kargs: kargs passed Dec 16 12:22:01.490433 ignition[897]: Ignition finished successfully Dec 16 12:22:01.493500 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 16 12:22:01.496227 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 12:22:01.528698 ignition[905]: Ignition 2.22.0 Dec 16 12:22:01.528722 ignition[905]: Stage: disks Dec 16 12:22:01.528856 ignition[905]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:01.528865 ignition[905]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:01.529627 ignition[905]: disks: disks passed Dec 16 12:22:01.531573 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 12:22:01.529671 ignition[905]: Ignition finished successfully Dec 16 12:22:01.536332 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 12:22:01.538051 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 12:22:01.540227 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:22:01.540945 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:22:01.542308 systemd[1]: Reached target basic.target - Basic System. 
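The fetch stage above finds neither a config drive nor a cmdline-provided URL, falls back to the OpenStack metadata service, and logs the SHA512 of the bytes it downloaded. A minimal sketch of that request, which only works from inside a guest that can reach the link-local metadata address:

import hashlib
import urllib.request

URL = "http://169.254.169.254/openstack/latest/user_data"

with urllib.request.urlopen(URL, timeout=5) as resp:
    config = resp.read()

# Ignition's "parsing config with SHA512: ..." line is the digest of these bytes.
print(hashlib.sha512(config).hexdigest())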
Dec 16 12:22:01.544502 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 12:22:01.576863 systemd-fsck[914]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 16 12:22:01.580136 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 12:22:01.582132 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 12:22:01.683256 kernel: EXT4-fs (vda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none. Dec 16 12:22:01.684361 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 12:22:01.685467 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 12:22:01.688491 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:22:01.690979 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 12:22:01.691897 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 16 12:22:01.701023 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 16 12:22:01.702015 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 12:22:01.702043 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:22:01.704936 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 12:22:01.707774 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 12:22:01.717221 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (922) Dec 16 12:22:01.719201 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 16 12:22:01.719236 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:22:01.725451 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:22:01.725823 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:22:01.727232 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:22:01.768223 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:01.782182 initrd-setup-root[950]: cut: /sysroot/etc/passwd: No such file or directory Dec 16 12:22:01.786707 initrd-setup-root[957]: cut: /sysroot/etc/group: No such file or directory Dec 16 12:22:01.790666 initrd-setup-root[964]: cut: /sysroot/etc/shadow: No such file or directory Dec 16 12:22:01.795648 initrd-setup-root[971]: cut: /sysroot/etc/gshadow: No such file or directory Dec 16 12:22:01.886271 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 12:22:01.888525 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 12:22:01.889925 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 12:22:01.906758 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 12:22:01.909008 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 16 12:22:01.932693 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Dec 16 12:22:01.939988 ignition[1039]: INFO : Ignition 2.22.0 Dec 16 12:22:01.939988 ignition[1039]: INFO : Stage: mount Dec 16 12:22:01.941285 ignition[1039]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:01.941285 ignition[1039]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:01.941285 ignition[1039]: INFO : mount: mount passed Dec 16 12:22:01.941285 ignition[1039]: INFO : Ignition finished successfully Dec 16 12:22:01.942630 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 12:22:02.687484 systemd-networkd[872]: eth0: Gained IPv6LL Dec 16 12:22:02.805222 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:04.815228 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:08.820228 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:08.824808 coreos-metadata[924]: Dec 16 12:22:08.824 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:22:08.841541 coreos-metadata[924]: Dec 16 12:22:08.841 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:22:09.406393 coreos-metadata[924]: Dec 16 12:22:09.406 INFO Fetch successful Dec 16 12:22:09.407270 coreos-metadata[924]: Dec 16 12:22:09.407 INFO wrote hostname ci-4459-2-2-6-119dd6897d to /sysroot/etc/hostname Dec 16 12:22:09.409115 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 16 12:22:09.409237 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 16 12:22:09.412835 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 12:22:09.431643 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:22:09.448236 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1057) Dec 16 12:22:09.451327 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 16 12:22:09.451381 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:22:09.456939 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:22:09.457000 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:22:09.457229 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
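The hostname agent above polls for a config drive labelled config-2, warns that it never appeared, and queries the metadata service instead before writing the result to /sysroot/etc/hostname. A sketch of that fallback under the same assumptions (the helper name is invented for illustration):

import os
import urllib.request

METADATA_HOSTNAME = "http://169.254.169.254/latest/meta-data/hostname"

def fetch_hostname() -> str:
    # A real agent would mount the config drive and parse its metadata; this
    # sketch only covers the metadata-service fallback taken in the log above.
    if os.path.exists("/dev/disk/by-label/config-2"):
        raise NotImplementedError("config drive present; mount and parse it instead")
    with urllib.request.urlopen(METADATA_HOSTNAME, timeout=5) as resp:
        return resp.read().decode().strip()

# Here this would return "ci-4459-2-2-6-119dd6897d", which the agent then
# writes to /sysroot/etc/hostname.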
Dec 16 12:22:09.487209 ignition[1075]: INFO : Ignition 2.22.0 Dec 16 12:22:09.487209 ignition[1075]: INFO : Stage: files Dec 16 12:22:09.488775 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:09.488775 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:09.488775 ignition[1075]: DEBUG : files: compiled without relabeling support, skipping Dec 16 12:22:09.491499 ignition[1075]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 12:22:09.491499 ignition[1075]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 12:22:09.491499 ignition[1075]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 12:22:09.491499 ignition[1075]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 12:22:09.495794 ignition[1075]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 12:22:09.495794 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:22:09.495794 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 16 12:22:09.491841 unknown[1075]: wrote ssh authorized keys file for user: core Dec 16 12:22:09.559073 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 12:22:09.908403 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:22:09.909998 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:22:09.921873 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:22:09.921873 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:22:09.921873 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 16 12:22:09.921873 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 16 12:22:09.921873 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 16 12:22:09.921873 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Dec 16 12:22:10.183031 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 12:22:10.756401 ignition[1075]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Dec 16 12:22:10.756401 ignition[1075]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 12:22:10.759569 ignition[1075]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:22:10.761648 ignition[1075]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:22:10.761648 ignition[1075]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 12:22:10.765425 ignition[1075]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 16 12:22:10.765425 ignition[1075]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 12:22:10.765425 ignition[1075]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:22:10.765425 ignition[1075]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:22:10.765425 ignition[1075]: INFO : files: files passed Dec 16 12:22:10.765425 ignition[1075]: INFO : Ignition finished successfully Dec 16 12:22:10.764783 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 12:22:10.767261 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 12:22:10.769836 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 12:22:10.777743 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 12:22:10.778516 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 12:22:10.785017 initrd-setup-root-after-ignition[1106]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:22:10.785017 initrd-setup-root-after-ignition[1106]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:22:10.787612 initrd-setup-root-after-ignition[1110]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:22:10.788720 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:22:10.789868 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 12:22:10.792116 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 12:22:10.827103 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 12:22:10.827260 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
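The files stage above is driven entirely by the declarative config fetched earlier; a hypothetical fragment that would produce these file, link, and unit-preset operations could look roughly like this, expressed as a Python dict for brevity (field names follow the Ignition v3 spec, paths and URLs are the ones in the log):

import json

config = {
    "ignition": {"version": "3.4.0"},
    "storage": {
        "files": [{
            "path": "/opt/helm-v3.17.3-linux-arm64.tar.gz",
            "contents": {"source": "https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz"},
        }],
        "links": [{
            "path": "/etc/extensions/kubernetes.raw",
            "target": "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw",
        }],
    },
    "systemd": {
        # Corresponds to op(d): "setting preset to enabled" for the unit.
        "units": [{"name": "prepare-helm.service", "enabled": True}],
    },
}

print(json.dumps(config, indent=2))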
Dec 16 12:22:10.829109 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 12:22:10.830574 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 12:22:10.832099 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 12:22:10.832938 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 12:22:10.856779 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:22:10.859021 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 12:22:10.880300 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:22:10.881274 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:22:10.883669 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 12:22:10.885340 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 12:22:10.885457 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:22:10.887650 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 12:22:10.889107 systemd[1]: Stopped target basic.target - Basic System. Dec 16 12:22:10.890431 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 12:22:10.891776 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:22:10.893630 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 12:22:10.895342 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:22:10.896840 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 12:22:10.898239 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:22:10.899848 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 12:22:10.901333 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 12:22:10.902913 systemd[1]: Stopped target swap.target - Swaps. Dec 16 12:22:10.904098 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 12:22:10.904232 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:22:10.906032 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:22:10.907825 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:22:10.909316 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 12:22:10.910922 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:22:10.911995 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 12:22:10.912100 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 12:22:10.914374 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 12:22:10.914506 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:22:10.916053 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 12:22:10.916153 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 12:22:10.918700 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 12:22:10.920619 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 12:22:10.921962 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Dec 16 12:22:10.922074 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:22:10.923508 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 12:22:10.923603 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:22:10.928058 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 12:22:10.929382 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 12:22:10.939612 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 12:22:10.943603 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 12:22:10.943712 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 12:22:10.945684 ignition[1130]: INFO : Ignition 2.22.0 Dec 16 12:22:10.945684 ignition[1130]: INFO : Stage: umount Dec 16 12:22:10.945684 ignition[1130]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:22:10.945684 ignition[1130]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:22:10.948565 ignition[1130]: INFO : umount: umount passed Dec 16 12:22:10.948565 ignition[1130]: INFO : Ignition finished successfully Dec 16 12:22:10.947609 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 12:22:10.949233 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 12:22:10.950802 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 12:22:10.950842 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 12:22:10.952080 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 12:22:10.952120 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 12:22:10.953870 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 12:22:10.953905 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 12:22:10.955168 systemd[1]: Stopped target network.target - Network. Dec 16 12:22:10.956445 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 12:22:10.956490 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:22:10.957959 systemd[1]: Stopped target paths.target - Path Units. Dec 16 12:22:10.959220 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 12:22:10.963350 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:22:10.964376 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 12:22:10.965714 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 12:22:10.967462 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 12:22:10.967499 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:22:10.968695 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 12:22:10.968724 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:22:10.969907 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 12:22:10.969954 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 12:22:10.971151 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 12:22:10.971188 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 12:22:10.973012 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 12:22:10.973058 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. 
Dec 16 12:22:10.975041 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 12:22:10.976498 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 12:22:10.984239 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 12:22:10.985810 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 12:22:10.988293 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 16 12:22:10.988489 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 12:22:10.988579 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 12:22:10.991814 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 16 12:22:10.992665 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 12:22:10.993588 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 12:22:10.993637 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:22:10.995865 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 12:22:10.997141 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 12:22:10.997364 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:22:10.998817 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 12:22:10.998858 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:22:11.001175 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 12:22:11.001234 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 12:22:11.002737 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 12:22:11.002774 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:22:11.005006 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:22:11.007431 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 16 12:22:11.007488 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 16 12:22:11.012792 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 12:22:11.012943 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:22:11.014711 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 12:22:11.014753 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 12:22:11.016147 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 12:22:11.016177 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:22:11.017713 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 12:22:11.017755 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:22:11.019932 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 12:22:11.019981 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 12:22:11.022022 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 12:22:11.022065 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:22:11.024972 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Dec 16 12:22:11.025887 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 12:22:11.025939 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:22:11.028401 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 12:22:11.028445 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:22:11.031134 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 16 12:22:11.031174 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:22:11.033892 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 12:22:11.033930 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:22:11.035824 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:22:11.035866 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:22:11.039793 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Dec 16 12:22:11.039841 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Dec 16 12:22:11.039867 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Dec 16 12:22:11.039899 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 16 12:22:11.040163 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 12:22:11.040270 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 12:22:11.043938 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 12:22:11.044056 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 12:22:11.045752 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 12:22:11.047763 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 12:22:11.064160 systemd[1]: Switching root. Dec 16 12:22:11.114777 systemd-journald[312]: Journal stopped Dec 16 12:22:12.013160 systemd-journald[312]: Received SIGTERM from PID 1 (systemd). Dec 16 12:22:12.013269 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 12:22:12.013288 kernel: SELinux: policy capability open_perms=1 Dec 16 12:22:12.013302 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 12:22:12.013312 kernel: SELinux: policy capability always_check_network=0 Dec 16 12:22:12.013329 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 12:22:12.013342 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 12:22:12.013352 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 12:22:12.013365 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 12:22:12.013375 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 12:22:12.013389 kernel: audit: type=1403 audit(1765887731.259:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 16 12:22:12.013407 systemd[1]: Successfully loaded SELinux policy in 66.512ms. Dec 16 12:22:12.013428 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.676ms. 
Dec 16 12:22:12.013440 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:22:12.013455 systemd[1]: Detected virtualization kvm. Dec 16 12:22:12.013465 systemd[1]: Detected architecture arm64. Dec 16 12:22:12.013480 systemd[1]: Detected first boot. Dec 16 12:22:12.013490 systemd[1]: Hostname set to <ci-4459-2-2-6-119dd6897d>. Dec 16 12:22:12.013501 systemd[1]: Initializing machine ID from VM UUID. Dec 16 12:22:12.013514 zram_generator::config[1178]: No configuration found. Dec 16 12:22:12.013525 kernel: NET: Registered PF_VSOCK protocol family Dec 16 12:22:12.013535 systemd[1]: Populated /etc with preset unit settings. Dec 16 12:22:12.013546 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 16 12:22:12.013558 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 12:22:12.013569 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 12:22:12.013580 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 12:22:12.013594 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 12:22:12.013604 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 12:22:12.013615 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 12:22:12.013626 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 12:22:12.013640 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 12:22:12.013652 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 12:22:12.013663 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 12:22:12.013673 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 12:22:12.013684 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:22:12.013694 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:22:12.013705 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 12:22:12.013715 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 12:22:12.013726 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 12:22:12.013737 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:22:12.013749 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 16 12:22:12.013760 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:22:12.013771 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:22:12.013782 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 12:22:12.013792 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 12:22:12.013803 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 12:22:12.013815 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 12:22:12.013825 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:22:12.013836 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:22:12.013851 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:22:12.013861 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:22:12.013872 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 16 12:22:12.013882 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 12:22:12.013893 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 12:22:12.013904 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:22:12.013915 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:22:12.013927 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:22:12.013937 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 12:22:12.013948 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 12:22:12.013958 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 12:22:12.013969 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 12:22:12.013979 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 12:22:12.013990 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 12:22:12.014000 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 12:22:12.014012 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 12:22:12.014024 systemd[1]: Reached target machines.target - Containers. Dec 16 12:22:12.014035 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 12:22:12.014045 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:22:12.014056 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:22:12.014066 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 12:22:12.014077 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:22:12.014087 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:22:12.014098 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:22:12.014110 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 12:22:12.014121 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:22:12.014131 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 12:22:12.014142 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 12:22:12.014153 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 12:22:12.014168 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 12:22:12.014179 systemd[1]: Stopped systemd-fsck-usr.service. 
Dec 16 12:22:12.014190 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:22:12.014230 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:22:12.014251 kernel: loop: module loaded Dec 16 12:22:12.014262 kernel: fuse: init (API version 7.41) Dec 16 12:22:12.014272 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:22:12.014283 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:22:12.014297 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 12:22:12.014308 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 12:22:12.014320 kernel: ACPI: bus type drm_connector registered Dec 16 12:22:12.014330 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:22:12.014341 systemd[1]: verity-setup.service: Deactivated successfully. Dec 16 12:22:12.014351 systemd[1]: Stopped verity-setup.service. Dec 16 12:22:12.014363 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 12:22:12.014374 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 12:22:12.014384 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 12:22:12.014395 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 12:22:12.014405 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 12:22:12.014416 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 12:22:12.014455 systemd-journald[1246]: Collecting audit messages is disabled. Dec 16 12:22:12.014482 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:22:12.014494 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 12:22:12.014504 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 12:22:12.014516 systemd-journald[1246]: Journal started Dec 16 12:22:12.014538 systemd-journald[1246]: Runtime Journal (/run/log/journal/a1664cffe8504a639b52ae91a8acd3a0) is 8M, max 319.5M, 311.5M free. Dec 16 12:22:11.800088 systemd[1]: Queued start job for default target multi-user.target. Dec 16 12:22:11.819847 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 16 12:22:11.820294 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 12:22:12.017949 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:22:12.018812 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:22:12.018998 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:22:12.020343 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:22:12.020510 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:22:12.021667 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 12:22:12.022913 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:22:12.023082 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:22:12.024406 systemd[1]: modprobe@fuse.service: Deactivated successfully. 
Dec 16 12:22:12.024560 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 12:22:12.025664 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:22:12.025833 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:22:12.027169 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:22:12.028373 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:22:12.029616 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 12:22:12.031142 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 12:22:12.043672 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:22:12.045953 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 12:22:12.047871 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 12:22:12.048881 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 12:22:12.048911 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:22:12.050745 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 12:22:12.054367 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 12:22:12.055371 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:22:12.058374 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 12:22:12.060358 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 12:22:12.061448 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:22:12.063072 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 12:22:12.064103 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:22:12.067372 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:22:12.069354 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 12:22:12.071705 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:22:12.074274 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 12:22:12.077022 systemd-journald[1246]: Time spent on flushing to /var/log/journal/a1664cffe8504a639b52ae91a8acd3a0 is 30.671ms for 1688 entries. Dec 16 12:22:12.077022 systemd-journald[1246]: System Journal (/var/log/journal/a1664cffe8504a639b52ae91a8acd3a0) is 8M, max 584.8M, 576.8M free. Dec 16 12:22:12.133391 systemd-journald[1246]: Received client request to flush runtime journal. Dec 16 12:22:12.133448 kernel: loop0: detected capacity change from 0 to 119840 Dec 16 12:22:12.133472 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 12:22:12.075384 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 12:22:12.094241 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Dec 16 12:22:12.138056 kernel: loop1: detected capacity change from 0 to 1632 Dec 16 12:22:12.095729 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 12:22:12.098658 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 12:22:12.100955 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 12:22:12.107207 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:22:12.109793 systemd-tmpfiles[1298]: ACLs are not supported, ignoring. Dec 16 12:22:12.109803 systemd-tmpfiles[1298]: ACLs are not supported, ignoring. Dec 16 12:22:12.122754 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:22:12.125120 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 12:22:12.135926 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 12:22:12.153515 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 16 12:22:12.166731 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 12:22:12.170402 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:22:12.172253 kernel: loop2: detected capacity change from 0 to 100632 Dec 16 12:22:12.194148 systemd-tmpfiles[1318]: ACLs are not supported, ignoring. Dec 16 12:22:12.194171 systemd-tmpfiles[1318]: ACLs are not supported, ignoring. Dec 16 12:22:12.197586 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:22:12.211222 kernel: loop3: detected capacity change from 0 to 211168 Dec 16 12:22:12.249254 kernel: loop4: detected capacity change from 0 to 119840 Dec 16 12:22:12.263231 kernel: loop5: detected capacity change from 0 to 1632 Dec 16 12:22:12.269237 kernel: loop6: detected capacity change from 0 to 100632 Dec 16 12:22:12.286241 kernel: loop7: detected capacity change from 0 to 211168 Dec 16 12:22:12.316670 (sd-merge)[1323]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Dec 16 12:22:12.317104 (sd-merge)[1323]: Merged extensions into '/usr'. Dec 16 12:22:12.320619 systemd[1]: Reload requested from client PID 1297 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 12:22:12.320637 systemd[1]: Reloading... Dec 16 12:22:12.365230 zram_generator::config[1345]: No configuration found. Dec 16 12:22:12.518949 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 12:22:12.519355 systemd[1]: Reloading finished in 198 ms. Dec 16 12:22:12.544288 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 12:22:12.545993 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 12:22:12.559411 systemd[1]: Starting ensure-sysext.service... Dec 16 12:22:12.561300 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:22:12.564380 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:22:12.575921 systemd[1]: Reload requested from client PID 1386 ('systemctl') (unit ensure-sysext.service)... Dec 16 12:22:12.575939 systemd[1]: Reloading... Dec 16 12:22:12.577859 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. 
Dec 16 12:22:12.577902 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 12:22:12.578147 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 12:22:12.578384 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 16 12:22:12.579004 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 16 12:22:12.579252 systemd-tmpfiles[1387]: ACLs are not supported, ignoring. Dec 16 12:22:12.579318 systemd-tmpfiles[1387]: ACLs are not supported, ignoring. Dec 16 12:22:12.583023 systemd-tmpfiles[1387]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:22:12.583035 systemd-tmpfiles[1387]: Skipping /boot Dec 16 12:22:12.588956 systemd-tmpfiles[1387]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:22:12.588975 systemd-tmpfiles[1387]: Skipping /boot Dec 16 12:22:12.590565 systemd-udevd[1388]: Using default interface naming scheme 'v255'. Dec 16 12:22:12.632287 zram_generator::config[1422]: No configuration found. Dec 16 12:22:12.790390 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 12:22:12.805887 ldconfig[1292]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 12:22:12.849607 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 12:22:12.850714 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 16 12:22:12.851249 systemd[1]: Reloading finished in 275 ms. Dec 16 12:22:12.863895 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:22:12.866863 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 12:22:12.868989 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:22:12.903503 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Dec 16 12:22:12.903589 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 12:22:12.903607 kernel: [drm] features: -context_init Dec 16 12:22:12.910401 kernel: [drm] number of scanouts: 1 Dec 16 12:22:12.910471 kernel: [drm] number of cap sets: 0 Dec 16 12:22:12.911978 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Dec 16 12:22:12.917244 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 12:22:12.918248 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 12:22:12.925873 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:22:12.940612 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 12:22:12.942723 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 12:22:12.950830 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 12:22:12.954691 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:22:12.958177 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:22:12.961741 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
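The (sd-merge) step earlier overlays the listed extension images onto /usr; systemd-sysext discovers candidates in a fixed set of directories. The directory list below follows the systemd-sysext documentation rather than anything shown in this log, so treat it as an assumption:

from pathlib import Path

SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions",
               "/usr/lib/extensions", "/usr/local/lib/extensions"]

for d in map(Path, SEARCH_DIRS):
    if d.is_dir():
        for entry in sorted(d.iterdir()):
            # Raw disk images (*.raw) and plain directory trees both qualify.
            if entry.suffix == ".raw" or entry.is_dir():
                print(f"{d}: {entry.name}")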
Dec 16 12:22:12.965555 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:22:12.971771 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 12:22:12.978654 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:22:12.980046 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:22:12.982749 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:22:12.986376 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:22:12.987252 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:22:12.987435 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:22:12.989123 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 12:22:12.992111 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:22:12.995247 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:22:12.997060 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:22:12.997312 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:22:12.999985 augenrules[1548]: No rules Dec 16 12:22:13.000205 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 16 12:22:13.000829 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:22:13.000979 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:22:13.002714 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:22:13.002883 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:22:13.004399 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:22:13.004544 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:22:13.007015 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 12:22:13.016093 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:22:13.017366 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:22:13.019328 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:22:13.022437 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:22:13.023278 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:22:13.023404 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:22:13.032119 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 12:22:13.034222 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:22:13.036119 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Dec 16 12:22:13.040302 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 12:22:13.041733 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:22:13.041889 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:22:13.043355 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:22:13.043524 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:22:13.045929 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 12:22:13.050785 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:22:13.050967 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:22:13.064543 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:22:13.066497 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:22:13.067605 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:22:13.070482 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:22:13.079502 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:22:13.082931 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:22:13.085061 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 16 12:22:13.086046 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:22:13.086166 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:22:13.086321 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 12:22:13.089465 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:22:13.089653 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:22:13.095442 systemd[1]: Finished ensure-sysext.service. Dec 16 12:22:13.097258 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:22:13.097442 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:22:13.099277 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:22:13.099462 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:22:13.101392 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:22:13.101531 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:22:13.104796 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:22:13.104870 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:22:13.109807 systemd-networkd[1523]: lo: Link UP Dec 16 12:22:13.109819 systemd-networkd[1523]: lo: Gained carrier Dec 16 12:22:13.112318 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 12:22:13.114308 systemd-networkd[1523]: Enumeration completed Dec 16 12:22:13.114752 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 16 12:22:13.115288 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 16 12:22:13.114721 systemd-resolved[1524]: Positive Trust Anchors: Dec 16 12:22:13.114743 systemd-resolved[1524]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:22:13.114774 systemd-resolved[1524]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:22:13.115078 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:22:13.116367 systemd-networkd[1523]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 12:22:13.116378 systemd-networkd[1523]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:22:13.118107 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 12:22:13.118814 systemd-resolved[1524]: Using system hostname 'ci-4459-2-2-6-119dd6897d'. Dec 16 12:22:13.121261 kernel: PTP clock support registered Dec 16 12:22:13.120207 systemd-networkd[1523]: eth0: Link UP Dec 16 12:22:13.120334 systemd-networkd[1523]: eth0: Gained carrier Dec 16 12:22:13.120352 systemd-networkd[1523]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 12:22:13.120962 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 12:22:13.122145 augenrules[1578]: /sbin/augenrules: No change Dec 16 12:22:13.122396 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 12:22:13.122555 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:22:13.124210 systemd[1]: Reached target network.target - Network. Dec 16 12:22:13.125117 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:22:13.128018 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 16 12:22:13.131757 augenrules[1611]: No rules Dec 16 12:22:13.137440 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 16 12:22:13.139046 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:22:13.139315 systemd-networkd[1523]: eth0: DHCPv4 address 10.0.23.32/25, gateway 10.0.23.1 acquired from 10.0.23.1 Dec 16 12:22:13.139413 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:22:13.153961 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:22:13.155287 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:22:13.156250 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 12:22:13.157150 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
Dec 16 12:22:13.158503 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 12:22:13.159839 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 12:22:13.160869 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 12:22:13.161894 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 12:22:13.161924 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:22:13.162646 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:22:13.164335 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 12:22:13.166420 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 12:22:13.168921 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 12:22:13.170101 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 12:22:13.171209 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 12:22:13.176619 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 12:22:13.177722 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 12:22:13.179374 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 12:22:13.180462 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 12:22:13.181806 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:22:13.182636 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:22:13.183421 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:22:13.183453 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:22:13.185902 systemd[1]: Starting chronyd.service - NTP client/server... Dec 16 12:22:13.187493 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 12:22:13.189300 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 12:22:13.192373 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 12:22:13.195390 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 12:22:13.197241 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:13.197759 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 12:22:13.200424 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 12:22:13.201606 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 12:22:13.202554 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 12:22:13.204238 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 12:22:13.207318 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 12:22:13.208325 jq[1631]: false Dec 16 12:22:13.210753 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 12:22:13.213847 systemd[1]: Starting systemd-logind.service - User Login Management... 
Dec 16 12:22:13.215594 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 12:22:13.215992 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 12:22:13.216895 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 12:22:13.219057 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 12:22:13.222913 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 12:22:13.224323 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 12:22:13.224514 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 12:22:13.230236 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 12:22:13.230431 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 12:22:13.232652 jq[1642]: true Dec 16 12:22:13.244828 extend-filesystems[1632]: Found /dev/vda6 Dec 16 12:22:13.246011 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 12:22:13.246894 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 12:22:13.246994 (ntainerd)[1660]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 16 12:22:13.254206 jq[1656]: true Dec 16 12:22:13.257922 extend-filesystems[1632]: Found /dev/vda9 Dec 16 12:22:13.262579 update_engine[1639]: I20251216 12:22:13.257995 1639 main.cc:92] Flatcar Update Engine starting Dec 16 12:22:13.262793 tar[1646]: linux-arm64/LICENSE Dec 16 12:22:13.262793 tar[1646]: linux-arm64/helm Dec 16 12:22:13.263310 extend-filesystems[1632]: Checking size of /dev/vda9 Dec 16 12:22:13.270066 chronyd[1624]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 16 12:22:13.271073 chronyd[1624]: Loaded seccomp filter (level 2) Dec 16 12:22:13.271377 systemd[1]: Started chronyd.service - NTP client/server. Dec 16 12:22:13.274743 dbus-daemon[1627]: [system] SELinux support is enabled Dec 16 12:22:13.274933 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 12:22:13.278111 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 12:22:13.278148 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 12:22:13.279758 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 12:22:13.279787 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 12:22:13.285184 extend-filesystems[1632]: Resized partition /dev/vda9 Dec 16 12:22:13.286372 systemd[1]: Started update-engine.service - Update Engine. Dec 16 12:22:13.287035 update_engine[1639]: I20251216 12:22:13.286487 1639 update_check_scheduler.cc:74] Next update check in 2m30s Dec 16 12:22:13.288929 extend-filesystems[1674]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 12:22:13.290447 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Dec 16 12:22:13.298217 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 16 12:22:13.324504 systemd-logind[1637]: New seat seat0. Dec 16 12:22:13.336579 systemd-logind[1637]: Watching system buttons on /dev/input/event0 (Power Button) Dec 16 12:22:13.336611 systemd-logind[1637]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 16 12:22:13.338029 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 12:22:13.370406 locksmithd[1676]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 12:22:13.451400 containerd[1660]: time="2025-12-16T12:22:13Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 12:22:13.454845 containerd[1660]: time="2025-12-16T12:22:13.454793240Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 16 12:22:13.464984 containerd[1660]: time="2025-12-16T12:22:13.464842920Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.84µs" Dec 16 12:22:13.469372 containerd[1660]: time="2025-12-16T12:22:13.465062680Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 12:22:13.469372 containerd[1660]: time="2025-12-16T12:22:13.465094200Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 12:22:13.469566 containerd[1660]: time="2025-12-16T12:22:13.469507040Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 12:22:13.469566 containerd[1660]: time="2025-12-16T12:22:13.469557720Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 12:22:13.469698 containerd[1660]: time="2025-12-16T12:22:13.469587440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:22:13.479412 containerd[1660]: time="2025-12-16T12:22:13.479345520Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:22:13.479412 containerd[1660]: time="2025-12-16T12:22:13.479385360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:22:13.480278 bash[1690]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480604040Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480631760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480645000Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480653440Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480746680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480929360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480957160Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.480968120Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 12:22:13.481215 containerd[1660]: time="2025-12-16T12:22:13.481018360Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 12:22:13.481572 containerd[1660]: time="2025-12-16T12:22:13.481552040Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 12:22:13.481690 containerd[1660]: time="2025-12-16T12:22:13.481672960Z" level=info msg="metadata content store policy set" policy=shared Dec 16 12:22:13.481722 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 12:22:13.487851 systemd[1]: Starting sshkeys.service... Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.508991400Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509068400Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509091160Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509104080Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509117480Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509129000Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509140760Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:22:13.509174 containerd[1660]: time="2025-12-16T12:22:13.509152040Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 12:22:13.509479 containerd[1660]: time="2025-12-16T12:22:13.509459720Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:22:13.509535 containerd[1660]: time="2025-12-16T12:22:13.509523000Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:22:13.509591 containerd[1660]: time="2025-12-16T12:22:13.509578120Z" level=info msg="loading plugin" 
id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:22:13.509649 containerd[1660]: time="2025-12-16T12:22:13.509635640Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:22:13.509828 containerd[1660]: time="2025-12-16T12:22:13.509809440Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:22:13.509900 containerd[1660]: time="2025-12-16T12:22:13.509886000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:22:13.509953 containerd[1660]: time="2025-12-16T12:22:13.509940720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:22:13.510013 containerd[1660]: time="2025-12-16T12:22:13.510001480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:22:13.510066 containerd[1660]: time="2025-12-16T12:22:13.510054760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:22:13.510118 containerd[1660]: time="2025-12-16T12:22:13.510106480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:22:13.510167 containerd[1660]: time="2025-12-16T12:22:13.510156680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:22:13.510247 containerd[1660]: time="2025-12-16T12:22:13.510231640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 12:22:13.510315 containerd[1660]: time="2025-12-16T12:22:13.510302000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:22:13.510367 containerd[1660]: time="2025-12-16T12:22:13.510355280Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:22:13.510417 containerd[1660]: time="2025-12-16T12:22:13.510405840Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:22:13.510664 containerd[1660]: time="2025-12-16T12:22:13.510650000Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:22:13.510718 containerd[1660]: time="2025-12-16T12:22:13.510707200Z" level=info msg="Start snapshots syncer" Dec 16 12:22:13.510795 containerd[1660]: time="2025-12-16T12:22:13.510782000Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:22:13.511018 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Dec 16 12:22:13.511232 containerd[1660]: time="2025-12-16T12:22:13.511179480Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511401440Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511478880Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511606680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511637040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511647920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511658640Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511672280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511687960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511698680Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511722080Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511734560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511745720Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511780080Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:22:13.512216 containerd[1660]: time="2025-12-16T12:22:13.511795680Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511804640Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511815400Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511823320Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511833640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511845200Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511937720Z" level=info msg="runtime interface created" Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511942680Z" level=info msg="created NRI interface" Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511951160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511962680Z" level=info msg="Connect containerd service" Dec 16 12:22:13.512495 containerd[1660]: time="2025-12-16T12:22:13.511983800Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:22:13.513327 containerd[1660]: time="2025-12-16T12:22:13.513257520Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:22:13.514603 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 12:22:13.531218 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:13.597373 containerd[1660]: time="2025-12-16T12:22:13.597215960Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:22:13.597628 containerd[1660]: time="2025-12-16T12:22:13.597541120Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Dec 16 12:22:13.597682 containerd[1660]: time="2025-12-16T12:22:13.597272120Z" level=info msg="Start subscribing containerd event" Dec 16 12:22:13.597785 containerd[1660]: time="2025-12-16T12:22:13.597738600Z" level=info msg="Start recovering state" Dec 16 12:22:13.598025 containerd[1660]: time="2025-12-16T12:22:13.598008960Z" level=info msg="Start event monitor" Dec 16 12:22:13.598102 containerd[1660]: time="2025-12-16T12:22:13.598090480Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:22:13.598205 containerd[1660]: time="2025-12-16T12:22:13.598136560Z" level=info msg="Start streaming server" Dec 16 12:22:13.598205 containerd[1660]: time="2025-12-16T12:22:13.598184760Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:22:13.598347 containerd[1660]: time="2025-12-16T12:22:13.598288360Z" level=info msg="runtime interface starting up..." Dec 16 12:22:13.598347 containerd[1660]: time="2025-12-16T12:22:13.598301480Z" level=info msg="starting plugins..." Dec 16 12:22:13.598347 containerd[1660]: time="2025-12-16T12:22:13.598318760Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:22:13.598697 containerd[1660]: time="2025-12-16T12:22:13.598675400Z" level=info msg="containerd successfully booted in 0.147676s" Dec 16 12:22:13.598749 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 12:22:13.644222 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 16 12:22:13.659755 extend-filesystems[1674]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 12:22:13.659755 extend-filesystems[1674]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 12:22:13.659755 extend-filesystems[1674]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 16 12:22:13.663046 extend-filesystems[1632]: Resized filesystem in /dev/vda9 Dec 16 12:22:13.662139 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:22:13.664239 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 12:22:13.749097 tar[1646]: linux-arm64/README.md Dec 16 12:22:13.765661 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 12:22:14.211222 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:14.335421 systemd-networkd[1523]: eth0: Gained IPv6LL Dec 16 12:22:14.338097 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:22:14.339768 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:22:14.342023 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:22:14.344175 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:22:14.373252 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:22:14.428915 sshd_keygen[1662]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 12:22:14.449321 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 12:22:14.452096 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 12:22:14.469698 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 12:22:14.469952 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 12:22:14.473938 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 12:22:14.498249 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Dec 16 12:22:14.501008 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:22:14.503279 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 12:22:14.504532 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:22:14.542216 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:15.277904 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:22:15.281696 (kubelet)[1763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:22:15.819000 kubelet[1763]: E1216 12:22:15.818921 1763 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:22:15.821433 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:22:15.821575 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:22:15.821923 systemd[1]: kubelet.service: Consumed 782ms CPU time, 258.5M memory peak. Dec 16 12:22:16.220237 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:16.554263 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:19.858316 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 12:22:19.859478 systemd[1]: Started sshd@0-10.0.23.32:22-139.178.68.195:60738.service - OpenSSH per-connection server daemon (139.178.68.195:60738). Dec 16 12:22:20.231303 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:20.238223 coreos-metadata[1626]: Dec 16 12:22:20.238 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:22:20.254584 coreos-metadata[1626]: Dec 16 12:22:20.254 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 12:22:20.564510 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:22:20.573692 coreos-metadata[1704]: Dec 16 12:22:20.573 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:22:20.586847 coreos-metadata[1704]: Dec 16 12:22:20.586 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 12:22:20.874498 sshd[1778]: Accepted publickey for core from 139.178.68.195 port 60738 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:20.877531 sshd-session[1778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:20.888325 systemd-logind[1637]: New session 1 of user core. Dec 16 12:22:20.889560 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:22:20.890636 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:22:20.919464 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:22:20.923759 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:22:20.939092 (systemd)[1787]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 12:22:20.941422 systemd-logind[1637]: New session c1 of user core. Dec 16 12:22:21.052981 systemd[1787]: Queued start job for default target default.target. 
Dec 16 12:22:21.063707 systemd[1787]: Created slice app.slice - User Application Slice. Dec 16 12:22:21.063742 systemd[1787]: Reached target paths.target - Paths. Dec 16 12:22:21.063779 systemd[1787]: Reached target timers.target - Timers. Dec 16 12:22:21.064996 systemd[1787]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:22:21.074048 systemd[1787]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:22:21.074115 systemd[1787]: Reached target sockets.target - Sockets. Dec 16 12:22:21.074153 systemd[1787]: Reached target basic.target - Basic System. Dec 16 12:22:21.074180 systemd[1787]: Reached target default.target - Main User Target. Dec 16 12:22:21.074232 systemd[1787]: Startup finished in 127ms. Dec 16 12:22:21.074449 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:22:21.075935 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 12:22:21.294249 coreos-metadata[1704]: Dec 16 12:22:21.294 INFO Fetch successful Dec 16 12:22:21.294249 coreos-metadata[1704]: Dec 16 12:22:21.294 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 12:22:21.380421 coreos-metadata[1626]: Dec 16 12:22:21.380 INFO Fetch successful Dec 16 12:22:21.380875 coreos-metadata[1626]: Dec 16 12:22:21.380 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:22:21.535178 coreos-metadata[1704]: Dec 16 12:22:21.535 INFO Fetch successful Dec 16 12:22:21.544028 unknown[1704]: wrote ssh authorized keys file for user: core Dec 16 12:22:21.545759 coreos-metadata[1626]: Dec 16 12:22:21.545 INFO Fetch successful Dec 16 12:22:21.545759 coreos-metadata[1626]: Dec 16 12:22:21.545 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 12:22:21.588946 update-ssh-keys[1799]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:22:21.589929 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 12:22:21.591559 systemd[1]: Finished sshkeys.service. Dec 16 12:22:21.680337 coreos-metadata[1626]: Dec 16 12:22:21.680 INFO Fetch successful Dec 16 12:22:21.680337 coreos-metadata[1626]: Dec 16 12:22:21.680 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 12:22:21.766567 systemd[1]: Started sshd@1-10.0.23.32:22-139.178.68.195:51628.service - OpenSSH per-connection server daemon (139.178.68.195:51628). Dec 16 12:22:21.810221 coreos-metadata[1626]: Dec 16 12:22:21.810 INFO Fetch successful Dec 16 12:22:21.810221 coreos-metadata[1626]: Dec 16 12:22:21.810 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 12:22:21.935754 coreos-metadata[1626]: Dec 16 12:22:21.935 INFO Fetch successful Dec 16 12:22:21.935754 coreos-metadata[1626]: Dec 16 12:22:21.935 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 12:22:22.062882 coreos-metadata[1626]: Dec 16 12:22:22.062 INFO Fetch successful Dec 16 12:22:22.096215 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:22:22.096658 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 12:22:22.096784 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:22:22.096922 systemd[1]: Startup finished in 2.870s (kernel) + 12.638s (initrd) + 10.907s (userspace) = 26.416s. 
Dec 16 12:22:22.757230 sshd[1803]: Accepted publickey for core from 139.178.68.195 port 51628 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:22.758544 sshd-session[1803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:22.762550 systemd-logind[1637]: New session 2 of user core. Dec 16 12:22:22.774340 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 12:22:23.427580 sshd[1811]: Connection closed by 139.178.68.195 port 51628 Dec 16 12:22:23.428181 sshd-session[1803]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:23.432079 systemd-logind[1637]: Session 2 logged out. Waiting for processes to exit. Dec 16 12:22:23.432338 systemd[1]: sshd@1-10.0.23.32:22-139.178.68.195:51628.service: Deactivated successfully. Dec 16 12:22:23.433719 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 12:22:23.435181 systemd-logind[1637]: Removed session 2. Dec 16 12:22:23.597369 systemd[1]: Started sshd@2-10.0.23.32:22-139.178.68.195:51632.service - OpenSSH per-connection server daemon (139.178.68.195:51632). Dec 16 12:22:24.580150 sshd[1817]: Accepted publickey for core from 139.178.68.195 port 51632 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:24.581391 sshd-session[1817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:24.585806 systemd-logind[1637]: New session 3 of user core. Dec 16 12:22:24.594557 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:22:25.237065 sshd[1820]: Connection closed by 139.178.68.195 port 51632 Dec 16 12:22:25.237577 sshd-session[1817]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:25.240899 systemd[1]: sshd@2-10.0.23.32:22-139.178.68.195:51632.service: Deactivated successfully. Dec 16 12:22:25.242375 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:22:25.243074 systemd-logind[1637]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:22:25.244244 systemd-logind[1637]: Removed session 3. Dec 16 12:22:25.405411 systemd[1]: Started sshd@3-10.0.23.32:22-139.178.68.195:51642.service - OpenSSH per-connection server daemon (139.178.68.195:51642). Dec 16 12:22:26.072295 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:22:26.073653 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:22:26.193554 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:22:26.197392 (kubelet)[1837]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:22:26.245427 kubelet[1837]: E1216 12:22:26.245369 1837 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:22:26.248975 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:22:26.249099 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:22:26.249403 systemd[1]: kubelet.service: Consumed 148ms CPU time, 108.7M memory peak. 
Dec 16 12:22:26.377663 sshd[1826]: Accepted publickey for core from 139.178.68.195 port 51642 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:26.378903 sshd-session[1826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:26.382807 systemd-logind[1637]: New session 4 of user core. Dec 16 12:22:26.391539 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:22:27.039135 sshd[1845]: Connection closed by 139.178.68.195 port 51642 Dec 16 12:22:27.039762 sshd-session[1826]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:27.044615 systemd[1]: sshd@3-10.0.23.32:22-139.178.68.195:51642.service: Deactivated successfully. Dec 16 12:22:27.047708 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:22:27.048461 systemd-logind[1637]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:22:27.050512 systemd-logind[1637]: Removed session 4. Dec 16 12:22:27.208710 systemd[1]: Started sshd@4-10.0.23.32:22-139.178.68.195:51656.service - OpenSSH per-connection server daemon (139.178.68.195:51656). Dec 16 12:22:28.182345 sshd[1851]: Accepted publickey for core from 139.178.68.195 port 51656 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:28.183574 sshd-session[1851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:28.188060 systemd-logind[1637]: New session 5 of user core. Dec 16 12:22:28.203471 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:22:28.706300 sudo[1855]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:22:28.706572 sudo[1855]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:22:28.722356 sudo[1855]: pam_unix(sudo:session): session closed for user root Dec 16 12:22:28.878829 sshd[1854]: Connection closed by 139.178.68.195 port 51656 Dec 16 12:22:28.879363 sshd-session[1851]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:28.882486 systemd[1]: sshd@4-10.0.23.32:22-139.178.68.195:51656.service: Deactivated successfully. Dec 16 12:22:28.883967 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:22:28.885558 systemd-logind[1637]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:22:28.886430 systemd-logind[1637]: Removed session 5. Dec 16 12:22:29.048354 systemd[1]: Started sshd@5-10.0.23.32:22-139.178.68.195:51664.service - OpenSSH per-connection server daemon (139.178.68.195:51664). Dec 16 12:22:30.022804 sshd[1861]: Accepted publickey for core from 139.178.68.195 port 51664 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:30.024081 sshd-session[1861]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:30.027650 systemd-logind[1637]: New session 6 of user core. Dec 16 12:22:30.041394 systemd[1]: Started session-6.scope - Session 6 of User core. 
Dec 16 12:22:30.532299 sudo[1866]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:22:30.532569 sudo[1866]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:22:30.537179 sudo[1866]: pam_unix(sudo:session): session closed for user root Dec 16 12:22:30.542138 sudo[1865]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:22:30.542440 sudo[1865]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:22:30.551236 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:22:30.589680 augenrules[1888]: No rules Dec 16 12:22:30.590789 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:22:30.590991 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:22:30.592371 sudo[1865]: pam_unix(sudo:session): session closed for user root Dec 16 12:22:30.747964 sshd[1864]: Connection closed by 139.178.68.195 port 51664 Dec 16 12:22:30.747866 sshd-session[1861]: pam_unix(sshd:session): session closed for user core Dec 16 12:22:30.751405 systemd[1]: sshd@5-10.0.23.32:22-139.178.68.195:51664.service: Deactivated successfully. Dec 16 12:22:30.752842 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:22:30.754872 systemd-logind[1637]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:22:30.755819 systemd-logind[1637]: Removed session 6. Dec 16 12:22:30.914261 systemd[1]: Started sshd@6-10.0.23.32:22-139.178.68.195:34138.service - OpenSSH per-connection server daemon (139.178.68.195:34138). Dec 16 12:22:31.884929 sshd[1898]: Accepted publickey for core from 139.178.68.195 port 34138 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:22:31.886313 sshd-session[1898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:22:31.890079 systemd-logind[1637]: New session 7 of user core. Dec 16 12:22:31.895381 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 12:22:32.390998 sudo[1902]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:22:32.391293 sudo[1902]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:22:32.730778 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 16 12:22:32.741786 (dockerd)[1923]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:22:32.969330 dockerd[1923]: time="2025-12-16T12:22:32.968503960Z" level=info msg="Starting up" Dec 16 12:22:32.970297 dockerd[1923]: time="2025-12-16T12:22:32.970272160Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:22:32.981509 dockerd[1923]: time="2025-12-16T12:22:32.981403320Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:22:33.022603 dockerd[1923]: time="2025-12-16T12:22:33.022555880Z" level=info msg="Loading containers: start." Dec 16 12:22:33.032218 kernel: Initializing XFRM netlink socket Dec 16 12:22:33.267893 systemd-networkd[1523]: docker0: Link UP Dec 16 12:22:33.273664 dockerd[1923]: time="2025-12-16T12:22:33.273609200Z" level=info msg="Loading containers: done." 
Dec 16 12:22:33.287742 dockerd[1923]: time="2025-12-16T12:22:33.287657400Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:22:33.287742 dockerd[1923]: time="2025-12-16T12:22:33.287754200Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:22:33.287944 dockerd[1923]: time="2025-12-16T12:22:33.287848880Z" level=info msg="Initializing buildkit" Dec 16 12:22:33.315136 dockerd[1923]: time="2025-12-16T12:22:33.315084360Z" level=info msg="Completed buildkit initialization" Dec 16 12:22:33.321708 dockerd[1923]: time="2025-12-16T12:22:33.321663640Z" level=info msg="Daemon has completed initialization" Dec 16 12:22:33.321708 dockerd[1923]: time="2025-12-16T12:22:33.321744680Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:22:33.322929 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 12:22:34.480159 containerd[1660]: time="2025-12-16T12:22:34.479746960Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Dec 16 12:22:35.112656 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2884621179.mount: Deactivated successfully. Dec 16 12:22:36.314646 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:22:36.317477 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:22:36.474387 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:22:36.488648 (kubelet)[2207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:22:36.536140 kubelet[2207]: E1216 12:22:36.536079 2207 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:22:36.539775 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:22:36.539909 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:22:36.540179 systemd[1]: kubelet.service: Consumed 148ms CPU time, 107.7M memory peak. 
Dec 16 12:22:36.685402 containerd[1660]: time="2025-12-16T12:22:36.685231000Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:36.687125 containerd[1660]: time="2025-12-16T12:22:36.687090760Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=27387379"
Dec 16 12:22:36.688424 containerd[1660]: time="2025-12-16T12:22:36.688386680Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:36.700000 containerd[1660]: time="2025-12-16T12:22:36.699940080Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:36.700801 containerd[1660]: time="2025-12-16T12:22:36.700762200Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 2.22096724s"
Dec 16 12:22:36.700801 containerd[1660]: time="2025-12-16T12:22:36.700792840Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\""
Dec 16 12:22:36.702727 containerd[1660]: time="2025-12-16T12:22:36.702689720Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\""
Dec 16 12:22:37.054325 chronyd[1624]: Selected source PHC0
Dec 16 12:22:38.270285 containerd[1660]: time="2025-12-16T12:22:38.270225276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:38.271552 containerd[1660]: time="2025-12-16T12:22:38.271225458Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23553101"
Dec 16 12:22:38.272772 containerd[1660]: time="2025-12-16T12:22:38.272714182Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:38.277920 containerd[1660]: time="2025-12-16T12:22:38.277884265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:38.278911 containerd[1660]: time="2025-12-16T12:22:38.278871967Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.57614365s"
Dec 16 12:22:38.278911 containerd[1660]: time="2025-12-16T12:22:38.278906374Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\""
Dec 16 12:22:38.279368 containerd[1660]: time="2025-12-16T12:22:38.279334769Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\""
Dec 16 12:22:39.498694 containerd[1660]: time="2025-12-16T12:22:39.498595300Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:39.501105 containerd[1660]: time="2025-12-16T12:22:39.501067783Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18298087"
Dec 16 12:22:39.502794 containerd[1660]: time="2025-12-16T12:22:39.502732355Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:39.506358 containerd[1660]: time="2025-12-16T12:22:39.506317123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:39.507440 containerd[1660]: time="2025-12-16T12:22:39.507402889Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.228032065s"
Dec 16 12:22:39.507493 containerd[1660]: time="2025-12-16T12:22:39.507440675Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\""
Dec 16 12:22:39.508139 containerd[1660]: time="2025-12-16T12:22:39.507857328Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\""
Dec 16 12:22:40.600793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount158630461.mount: Deactivated successfully.
Dec 16 12:22:40.889948 containerd[1660]: time="2025-12-16T12:22:40.889359899Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:40.890553 containerd[1660]: time="2025-12-16T12:22:40.890497708Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=28258699"
Dec 16 12:22:40.892462 containerd[1660]: time="2025-12-16T12:22:40.892403041Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:40.896001 containerd[1660]: time="2025-12-16T12:22:40.895928529Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:40.896837 containerd[1660]: time="2025-12-16T12:22:40.896566572Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.388675488s"
Dec 16 12:22:40.896837 containerd[1660]: time="2025-12-16T12:22:40.896604359Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\""
Dec 16 12:22:40.897161 containerd[1660]: time="2025-12-16T12:22:40.897081765Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Dec 16 12:22:41.593761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2179504280.mount: Deactivated successfully.
Dec 16 12:22:42.620971 containerd[1660]: time="2025-12-16T12:22:42.620907506Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:42.622959 containerd[1660]: time="2025-12-16T12:22:42.622907795Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209"
Dec 16 12:22:42.624789 containerd[1660]: time="2025-12-16T12:22:42.624757332Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:42.628456 containerd[1660]: time="2025-12-16T12:22:42.628380212Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:42.629724 containerd[1660]: time="2025-12-16T12:22:42.629481071Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.732367377s"
Dec 16 12:22:42.629724 containerd[1660]: time="2025-12-16T12:22:42.629514989Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Dec 16 12:22:42.629999 containerd[1660]: time="2025-12-16T12:22:42.629911847Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Dec 16 12:22:43.210479 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3154721638.mount: Deactivated successfully.
Dec 16 12:22:43.223657 containerd[1660]: time="2025-12-16T12:22:43.223468513Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:22:43.225289 containerd[1660]: time="2025-12-16T12:22:43.225234935Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Dec 16 12:22:43.226762 containerd[1660]: time="2025-12-16T12:22:43.226703073Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:22:43.229120 containerd[1660]: time="2025-12-16T12:22:43.229070885Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:22:43.230546 containerd[1660]: time="2025-12-16T12:22:43.230424412Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 600.482646ms"
Dec 16 12:22:43.230546 containerd[1660]: time="2025-12-16T12:22:43.230456692Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Dec 16 12:22:43.231021 containerd[1660]: time="2025-12-16T12:22:43.230984535Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Dec 16 12:22:43.904683 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2796091438.mount: Deactivated successfully.
Dec 16 12:22:45.647957 containerd[1660]: time="2025-12-16T12:22:45.646898542Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:45.647957 containerd[1660]: time="2025-12-16T12:22:45.647935546Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=70013713"
Dec 16 12:22:45.649181 containerd[1660]: time="2025-12-16T12:22:45.649133951Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:45.652107 containerd[1660]: time="2025-12-16T12:22:45.652056123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:22:45.653151 containerd[1660]: time="2025-12-16T12:22:45.653029647Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.422001032s"
Dec 16 12:22:45.653151 containerd[1660]: time="2025-12-16T12:22:45.653063488Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\""
Dec 16 12:22:46.565073 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Dec 16 12:22:46.566405 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 12:22:46.712792 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 12:22:46.717070 (kubelet)[2375]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 16 12:22:46.751936 kubelet[2375]: E1216 12:22:46.751882 2375 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 16 12:22:46.754520 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 16 12:22:46.754748 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 16 12:22:46.755139 systemd[1]: kubelet.service: Consumed 138ms CPU time, 105.1M memory peak.
Dec 16 12:22:49.377660 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 12:22:49.378165 systemd[1]: kubelet.service: Consumed 138ms CPU time, 105.1M memory peak.
Dec 16 12:22:49.379963 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 12:22:49.400824 systemd[1]: Reload requested from client PID 2390 ('systemctl') (unit session-7.scope)...
Dec 16 12:22:49.400840 systemd[1]: Reloading...
Dec 16 12:22:49.476535 zram_generator::config[2432]: No configuration found.
Dec 16 12:22:49.637665 systemd[1]: Reloading finished in 236 ms.
Dec 16 12:22:49.701773 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Dec 16 12:22:49.701852 systemd[1]: kubelet.service: Failed with result 'signal'.
Dec 16 12:22:49.702087 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 12:22:49.702136 systemd[1]: kubelet.service: Consumed 89ms CPU time, 95M memory peak.
Dec 16 12:22:49.703606 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 12:22:49.831252 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 12:22:49.835658 (kubelet)[2480]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 16 12:22:49.865350 kubelet[2480]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 12:22:49.865350 kubelet[2480]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 16 12:22:49.865350 kubelet[2480]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 12:22:49.865689 kubelet[2480]: I1216 12:22:49.865388 2480 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 12:22:50.867675 kubelet[2480]: I1216 12:22:50.867625 2480 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Dec 16 12:22:50.867675 kubelet[2480]: I1216 12:22:50.867661 2480 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 12:22:50.867993 kubelet[2480]: I1216 12:22:50.867895 2480 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 16 12:22:50.890281 kubelet[2480]: E1216 12:22:50.890229 2480 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.23.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.23.32:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Dec 16 12:22:50.891871 kubelet[2480]: I1216 12:22:50.891827 2480 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 12:22:50.908293 kubelet[2480]: I1216 12:22:50.908249 2480 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 12:22:50.911189 kubelet[2480]: I1216 12:22:50.911151 2480 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 16 12:22:50.913558 kubelet[2480]: I1216 12:22:50.913009 2480 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 12:22:50.913558 kubelet[2480]: I1216 12:22:50.913053 2480 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-6-119dd6897d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 12:22:50.913558 kubelet[2480]: I1216 12:22:50.913300 2480 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 12:22:50.913558 kubelet[2480]: I1216 12:22:50.913311 2480 container_manager_linux.go:303] "Creating device plugin manager"
Dec 16 12:22:50.914221 kubelet[2480]: I1216 12:22:50.914181 2480 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 12:22:50.918631 kubelet[2480]: I1216 12:22:50.918602 2480 kubelet.go:480] "Attempting to sync node with API server"
Dec 16 12:22:50.918760 kubelet[2480]: I1216 12:22:50.918747 2480 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 12:22:50.918932 kubelet[2480]: I1216 12:22:50.918921 2480 kubelet.go:386] "Adding apiserver pod source"
Dec 16 12:22:50.920851 kubelet[2480]: I1216 12:22:50.920835 2480 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 12:22:50.922270 kubelet[2480]: E1216 12:22:50.921091 2480 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.23.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-6-119dd6897d&limit=500&resourceVersion=0\": dial tcp 10.0.23.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 16 12:22:50.922652 kubelet[2480]: E1216 12:22:50.922570 2480 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.23.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.23.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 16 12:22:50.923546 kubelet[2480]: I1216 12:22:50.923520 2480 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 16 12:22:50.925034 kubelet[2480]: I1216 12:22:50.924996 2480 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 16 12:22:50.925156 kubelet[2480]: W1216 12:22:50.925143 2480 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Dec 16 12:22:50.927700 kubelet[2480]: I1216 12:22:50.927657 2480 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 16 12:22:50.927700 kubelet[2480]: I1216 12:22:50.927703 2480 server.go:1289] "Started kubelet"
Dec 16 12:22:50.928971 kubelet[2480]: I1216 12:22:50.928944 2480 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 12:22:50.931208 kubelet[2480]: I1216 12:22:50.930505 2480 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 12:22:50.931208 kubelet[2480]: I1216 12:22:50.930779 2480 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 12:22:50.931208 kubelet[2480]: I1216 12:22:50.930830 2480 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 12:22:50.931849 kubelet[2480]: I1216 12:22:50.931811 2480 server.go:317] "Adding debug handlers to kubelet server"
Dec 16 12:22:50.932487 kubelet[2480]: I1216 12:22:50.932466 2480 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 16 12:22:50.935580 kubelet[2480]: E1216 12:22:50.935467 2480 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-6-119dd6897d\" not found"
Dec 16 12:22:50.935649 kubelet[2480]: I1216 12:22:50.935589 2480 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 16 12:22:50.939806 kubelet[2480]: I1216 12:22:50.939710 2480 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 16 12:22:50.939806 kubelet[2480]: E1216 12:22:50.933934 2480 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.23.32:6443/api/v1/namespaces/default/events\": dial tcp 10.0.23.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-6-119dd6897d.1881b1925359b220 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-6-119dd6897d,UID:ci-4459-2-2-6-119dd6897d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-6-119dd6897d,},FirstTimestamp:2025-12-16 12:22:50.927673888 +0000 UTC m=+1.088818153,LastTimestamp:2025-12-16 12:22:50.927673888 +0000 UTC m=+1.088818153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-6-119dd6897d,}"
Dec 16 12:22:50.939995 kubelet[2480]: I1216 12:22:50.939909 2480 reconciler.go:26] "Reconciler: start to sync state"
Dec 16 12:22:50.940463 kubelet[2480]: E1216 12:22:50.940428 2480 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.23.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.23.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Dec 16 12:22:50.940558 kubelet[2480]: I1216 12:22:50.940534 2480 factory.go:223] Registration of the systemd container factory successfully
Dec 16 12:22:50.940597 kubelet[2480]: E1216 12:22:50.940585 2480 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 16 12:22:50.940669 kubelet[2480]: I1216 12:22:50.940653 2480 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 16 12:22:50.941023 kubelet[2480]: E1216 12:22:50.940984 2480 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.23.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-6-119dd6897d?timeout=10s\": dial tcp 10.0.23.32:6443: connect: connection refused" interval="200ms"
Dec 16 12:22:50.942143 kubelet[2480]: I1216 12:22:50.941983 2480 factory.go:223] Registration of the containerd container factory successfully
Dec 16 12:22:50.952544 kubelet[2480]: I1216 12:22:50.952486 2480 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 16 12:22:50.952544 kubelet[2480]: I1216 12:22:50.952527 2480 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 16 12:22:50.952677 kubelet[2480]: I1216 12:22:50.952578 2480 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 12:22:50.955420 kubelet[2480]: I1216 12:22:50.955378 2480 policy_none.go:49] "None policy: Start"
Dec 16 12:22:50.955420 kubelet[2480]: I1216 12:22:50.955404 2480 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 16 12:22:50.955420 kubelet[2480]: I1216 12:22:50.955416 2480 state_mem.go:35] "Initializing new in-memory state store"
Dec 16 12:22:50.959232 kubelet[2480]: I1216 12:22:50.959205 2480 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Dec 16 12:22:50.960321 kubelet[2480]: I1216 12:22:50.960287 2480 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Dec 16 12:22:50.960321 kubelet[2480]: I1216 12:22:50.960308 2480 status_manager.go:230] "Starting to sync pod status with apiserver"
Dec 16 12:22:50.960321 kubelet[2480]: I1216 12:22:50.960325 2480 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 16 12:22:50.960439 kubelet[2480]: I1216 12:22:50.960334 2480 kubelet.go:2436] "Starting kubelet main sync loop"
Dec 16 12:22:50.960439 kubelet[2480]: E1216 12:22:50.960367 2480 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 12:22:50.963187 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 16 12:22:50.964264 kubelet[2480]: E1216 12:22:50.963966 2480 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.23.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.23.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Dec 16 12:22:50.977754 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Dec 16 12:22:50.980715 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Dec 16 12:22:50.996759 kubelet[2480]: E1216 12:22:50.996683 2480 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 16 12:22:50.997115 kubelet[2480]: I1216 12:22:50.996895 2480 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 12:22:50.997115 kubelet[2480]: I1216 12:22:50.996950 2480 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 12:22:50.997386 kubelet[2480]: I1216 12:22:50.997347 2480 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 12:22:50.998487 kubelet[2480]: E1216 12:22:50.998467 2480 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 16 12:22:50.998546 kubelet[2480]: E1216 12:22:50.998517 2480 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-6-119dd6897d\" not found"
Dec 16 12:22:51.071303 systemd[1]: Created slice kubepods-burstable-pod8568293b24affd0104e0c8f05f4866b5.slice - libcontainer container kubepods-burstable-pod8568293b24affd0104e0c8f05f4866b5.slice.
Dec 16 12:22:51.090223 kubelet[2480]: E1216 12:22:51.090162 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.094108 systemd[1]: Created slice kubepods-burstable-pod4d42e56df62129c18da75f0f7e0999ca.slice - libcontainer container kubepods-burstable-pod4d42e56df62129c18da75f0f7e0999ca.slice.
Dec 16 12:22:51.098496 kubelet[2480]: I1216 12:22:51.098451 2480 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.098937 kubelet[2480]: E1216 12:22:51.098911 2480 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.23.32:6443/api/v1/nodes\": dial tcp 10.0.23.32:6443: connect: connection refused" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.109931 kubelet[2480]: E1216 12:22:51.109894 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.113454 systemd[1]: Created slice kubepods-burstable-pod7690d397bb2e072a72b2ca91e40bddc9.slice - libcontainer container kubepods-burstable-pod7690d397bb2e072a72b2ca91e40bddc9.slice.
Dec 16 12:22:51.115095 kubelet[2480]: E1216 12:22:51.114941 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.141606 kubelet[2480]: E1216 12:22:51.141518 2480 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.23.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-6-119dd6897d?timeout=10s\": dial tcp 10.0.23.32:6443: connect: connection refused" interval="400ms"
Dec 16 12:22:51.240871 kubelet[2480]: I1216 12:22:51.240749 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.240871 kubelet[2480]: I1216 12:22:51.240805 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.240871 kubelet[2480]: I1216 12:22:51.240841 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4d42e56df62129c18da75f0f7e0999ca-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" (UID: \"4d42e56df62129c18da75f0f7e0999ca\") " pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.241050 kubelet[2480]: I1216 12:22:51.240956 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.241050 kubelet[2480]: I1216 12:22:51.241035 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8568293b24affd0104e0c8f05f4866b5-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-6-119dd6897d\" (UID: \"8568293b24affd0104e0c8f05f4866b5\") " pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.241151 kubelet[2480]: I1216 12:22:51.241084 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4d42e56df62129c18da75f0f7e0999ca-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" (UID: \"4d42e56df62129c18da75f0f7e0999ca\") " pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.241270 kubelet[2480]: I1216 12:22:51.241187 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4d42e56df62129c18da75f0f7e0999ca-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" (UID: \"4d42e56df62129c18da75f0f7e0999ca\") " pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.241344 kubelet[2480]: I1216 12:22:51.241305 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.241402 kubelet[2480]: I1216 12:22:51.241356 2480 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.301416 kubelet[2480]: I1216 12:22:51.301386 2480 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.301746 kubelet[2480]: E1216 12:22:51.301704 2480 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.23.32:6443/api/v1/nodes\": dial tcp 10.0.23.32:6443: connect: connection refused" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.392342 containerd[1660]: time="2025-12-16T12:22:51.392238497Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-6-119dd6897d,Uid:8568293b24affd0104e0c8f05f4866b5,Namespace:kube-system,Attempt:0,}"
Dec 16 12:22:51.411324 containerd[1660]: time="2025-12-16T12:22:51.411283914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-6-119dd6897d,Uid:4d42e56df62129c18da75f0f7e0999ca,Namespace:kube-system,Attempt:0,}"
Dec 16 12:22:51.412897 containerd[1660]: time="2025-12-16T12:22:51.412866883Z" level=info msg="connecting to shim b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3" address="unix:///run/containerd/s/ebcda05243c9a03aabe641d3989fb38b99656500884c4bff4df978158021eab0" namespace=k8s.io protocol=ttrpc version=3
Dec 16 12:22:51.416209 containerd[1660]: time="2025-12-16T12:22:51.416152699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-6-119dd6897d,Uid:7690d397bb2e072a72b2ca91e40bddc9,Namespace:kube-system,Attempt:0,}"
Dec 16 12:22:51.439478 containerd[1660]: time="2025-12-16T12:22:51.439433258Z" level=info msg="connecting to shim 54756e305a97fb24032a2df100fa3cc9bd0cf1b9ddf7f75ca3ae908b8320e36a" address="unix:///run/containerd/s/91a9306cac1080c23fc6270498c4ba8fe9df906718329b8e1f5fb9a8b5b23d6f" namespace=k8s.io protocol=ttrpc version=3
Dec 16 12:22:51.443397 systemd[1]: Started cri-containerd-b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3.scope - libcontainer container b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3.
Dec 16 12:22:51.449791 containerd[1660]: time="2025-12-16T12:22:51.449748871Z" level=info msg="connecting to shim 5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6" address="unix:///run/containerd/s/70289c6275ec55f2a760333f19ef7bf0e8948ba3662c861b223b67a494da611e" namespace=k8s.io protocol=ttrpc version=3
Dec 16 12:22:51.470403 systemd[1]: Started cri-containerd-54756e305a97fb24032a2df100fa3cc9bd0cf1b9ddf7f75ca3ae908b8320e36a.scope - libcontainer container 54756e305a97fb24032a2df100fa3cc9bd0cf1b9ddf7f75ca3ae908b8320e36a.
Dec 16 12:22:51.475871 systemd[1]: Started cri-containerd-5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6.scope - libcontainer container 5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6.
Dec 16 12:22:51.493706 containerd[1660]: time="2025-12-16T12:22:51.493561974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-6-119dd6897d,Uid:8568293b24affd0104e0c8f05f4866b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3\""
Dec 16 12:22:51.503346 containerd[1660]: time="2025-12-16T12:22:51.503309584Z" level=info msg="CreateContainer within sandbox \"b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Dec 16 12:22:51.520584 containerd[1660]: time="2025-12-16T12:22:51.520534912Z" level=info msg="Container abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:22:51.522171 containerd[1660]: time="2025-12-16T12:22:51.521991079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-6-119dd6897d,Uid:4d42e56df62129c18da75f0f7e0999ca,Namespace:kube-system,Attempt:0,} returns sandbox id \"54756e305a97fb24032a2df100fa3cc9bd0cf1b9ddf7f75ca3ae908b8320e36a\""
Dec 16 12:22:51.526729 containerd[1660]: time="2025-12-16T12:22:51.526694823Z" level=info msg="CreateContainer within sandbox \"54756e305a97fb24032a2df100fa3cc9bd0cf1b9ddf7f75ca3ae908b8320e36a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Dec 16 12:22:51.529743 containerd[1660]: time="2025-12-16T12:22:51.529608238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-6-119dd6897d,Uid:7690d397bb2e072a72b2ca91e40bddc9,Namespace:kube-system,Attempt:0,} returns sandbox id \"5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6\""
Dec 16 12:22:51.530807 containerd[1660]: time="2025-12-16T12:22:51.530640083Z" level=info msg="CreateContainer within sandbox \"b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1\""
Dec 16 12:22:51.531328 containerd[1660]: time="2025-12-16T12:22:51.531304847Z" level=info msg="StartContainer for \"abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1\""
Dec 16 12:22:51.532516 containerd[1660]: time="2025-12-16T12:22:51.532494253Z" level=info msg="connecting to shim abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1" address="unix:///run/containerd/s/ebcda05243c9a03aabe641d3989fb38b99656500884c4bff4df978158021eab0" protocol=ttrpc version=3
Dec 16 12:22:51.537013 containerd[1660]: time="2025-12-16T12:22:51.536947715Z" level=info msg="CreateContainer within sandbox \"5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Dec 16 12:22:51.539006 containerd[1660]: time="2025-12-16T12:22:51.538953326Z" level=info msg="Container 9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:22:51.543447 kubelet[2480]: E1216 12:22:51.543404 2480 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.23.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-6-119dd6897d?timeout=10s\": dial tcp 10.0.23.32:6443: connect: connection refused" interval="800ms"
Dec 16 12:22:51.549651 containerd[1660]: time="2025-12-16T12:22:51.549589860Z" level=info msg="Container b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:22:51.555443 systemd[1]: Started cri-containerd-abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1.scope - libcontainer container abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1.
Dec 16 12:22:51.559707 containerd[1660]: time="2025-12-16T12:22:51.559532350Z" level=info msg="CreateContainer within sandbox \"54756e305a97fb24032a2df100fa3cc9bd0cf1b9ddf7f75ca3ae908b8320e36a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd\""
Dec 16 12:22:51.561827 containerd[1660]: time="2025-12-16T12:22:51.561790802Z" level=info msg="StartContainer for \"9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd\""
Dec 16 12:22:51.563151 containerd[1660]: time="2025-12-16T12:22:51.563102209Z" level=info msg="connecting to shim 9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd" address="unix:///run/containerd/s/91a9306cac1080c23fc6270498c4ba8fe9df906718329b8e1f5fb9a8b5b23d6f" protocol=ttrpc version=3
Dec 16 12:22:51.567968 containerd[1660]: time="2025-12-16T12:22:51.567923433Z" level=info msg="CreateContainer within sandbox \"5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3\""
Dec 16 12:22:51.570230 containerd[1660]: time="2025-12-16T12:22:51.568601037Z" level=info msg="StartContainer for \"b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3\""
Dec 16 12:22:51.570230 containerd[1660]: time="2025-12-16T12:22:51.570094604Z" level=info msg="connecting to shim b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3" address="unix:///run/containerd/s/70289c6275ec55f2a760333f19ef7bf0e8948ba3662c861b223b67a494da611e" protocol=ttrpc version=3
Dec 16 12:22:51.589388 systemd[1]: Started cri-containerd-9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd.scope - libcontainer container 9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd.
Dec 16 12:22:51.591926 systemd[1]: Started cri-containerd-b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3.scope - libcontainer container b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3.
Dec 16 12:22:51.608126 containerd[1660]: time="2025-12-16T12:22:51.608085318Z" level=info msg="StartContainer for \"abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1\" returns successfully"
Dec 16 12:22:51.645925 containerd[1660]: time="2025-12-16T12:22:51.645732350Z" level=info msg="StartContainer for \"b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3\" returns successfully"
Dec 16 12:22:51.646772 containerd[1660]: time="2025-12-16T12:22:51.646299273Z" level=info msg="StartContainer for \"9d70aea4caa11bb280e5bf596cf995dcd8701c6fc30a89c77d96ec125f4343bd\" returns successfully"
Dec 16 12:22:51.705458 kubelet[2480]: I1216 12:22:51.705211 2480 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.705550 kubelet[2480]: E1216 12:22:51.705518 2480 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.23.32:6443/api/v1/nodes\": dial tcp 10.0.23.32:6443: connect: connection refused" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.972543 kubelet[2480]: E1216 12:22:51.972313 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.973516 kubelet[2480]: E1216 12:22:51.972721 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:51.976289 kubelet[2480]: E1216 12:22:51.976267 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:52.507802 kubelet[2480]: I1216 12:22:52.507766 2480 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:52.979933 kubelet[2480]: E1216 12:22:52.979903 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:52.981303 kubelet[2480]: E1216 12:22:52.980048 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:52.981303 kubelet[2480]: E1216 12:22:52.980109 2480 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.771661 kubelet[2480]: E1216 12:22:53.771593 2480 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-6-119dd6897d\" not found" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.853362 kubelet[2480]: I1216 12:22:53.853318 2480 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.923678 kubelet[2480]: I1216 12:22:53.923641 2480 apiserver.go:52] "Watching apiserver"
Dec 16 12:22:53.940513 kubelet[2480]: I1216 12:22:53.940467 2480 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 16 12:22:53.942077 kubelet[2480]: I1216 12:22:53.941537 2480 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.947940 kubelet[2480]: E1216 12:22:53.947861 2480 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.947940 kubelet[2480]: I1216 12:22:53.947943 2480 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.950111 kubelet[2480]: E1216 12:22:53.950059 2480 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-6-119dd6897d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.950111 kubelet[2480]: I1216 12:22:53.950091 2480 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:53.952047 kubelet[2480]: E1216 12:22:53.952004 2480 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:56.095970 kubelet[2480]: I1216 12:22:56.095893 2480 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:56.181282 systemd[1]: Reload requested from client PID 2764 ('systemctl') (unit session-7.scope)...
Dec 16 12:22:56.181299 systemd[1]: Reloading...
Dec 16 12:22:56.277226 zram_generator::config[2807]: No configuration found.
Dec 16 12:22:56.450884 systemd[1]: Reloading finished in 269 ms.
Dec 16 12:22:56.476258 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 12:22:56.498528 systemd[1]: kubelet.service: Deactivated successfully.
Dec 16 12:22:56.499509 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 12:22:56.500366 systemd[1]: kubelet.service: Consumed 1.510s CPU time, 127.5M memory peak.
Dec 16 12:22:56.502980 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 12:22:56.663122 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 12:22:56.681009 (kubelet)[2852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 16 12:22:57.073262 kubelet[2852]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 12:22:57.073262 kubelet[2852]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 16 12:22:57.073262 kubelet[2852]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 12:22:57.073262 kubelet[2852]: I1216 12:22:56.711777 2852 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 12:22:57.073262 kubelet[2852]: I1216 12:22:56.718276 2852 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Dec 16 12:22:57.073262 kubelet[2852]: I1216 12:22:56.718302 2852 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 12:22:57.073262 kubelet[2852]: I1216 12:22:56.718504 2852 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 16 12:22:57.074866 kubelet[2852]: I1216 12:22:57.074846 2852 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Dec 16 12:22:57.077569 kubelet[2852]: I1216 12:22:57.077520 2852 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 12:22:57.081563 kubelet[2852]: I1216 12:22:57.081544 2852 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 12:22:57.084136 kubelet[2852]: I1216 12:22:57.084121 2852 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 16 12:22:57.084339 kubelet[2852]: I1216 12:22:57.084318 2852 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 12:22:57.084480 kubelet[2852]: I1216 12:22:57.084341 2852 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-6-119dd6897d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 12:22:57.084611 kubelet[2852]: I1216 12:22:57.084490 2852 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 12:22:57.084611 kubelet[2852]: I1216 12:22:57.084499 2852 container_manager_linux.go:303] "Creating device plugin manager"
Dec 16 12:22:57.084611 kubelet[2852]: I1216 12:22:57.084557 2852 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 12:22:57.084705 kubelet[2852]: I1216 12:22:57.084696 2852 kubelet.go:480] "Attempting to sync node with API server"
Dec 16 12:22:57.084730 kubelet[2852]: I1216 12:22:57.084709 2852 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 12:22:57.084824 kubelet[2852]: I1216 12:22:57.084731 2852 kubelet.go:386] "Adding apiserver pod source"
Dec 16 12:22:57.084824 kubelet[2852]: I1216 12:22:57.084743 2852 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 12:22:57.087492 kubelet[2852]: I1216 12:22:57.087457 2852 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 16 12:22:57.088581 kubelet[2852]: I1216 12:22:57.088482 2852 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 16 12:22:57.091549 kubelet[2852]: I1216 12:22:57.091526 2852 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 16 12:22:57.092275 kubelet[2852]: I1216 12:22:57.091578 2852 server.go:1289] "Started kubelet"
Dec 16 12:22:57.092520 kubelet[2852]: I1216 12:22:57.092499 2852 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 12:22:57.092806 kubelet[2852]: I1216 12:22:57.092424 2852 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 12:22:57.093634 kubelet[2852]: I1216 12:22:57.093605 2852 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 12:22:57.094354 kubelet[2852]: I1216 12:22:57.094323 2852 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 12:22:57.094883 kubelet[2852]: I1216 12:22:57.094851 2852 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 16 12:22:57.095373 kubelet[2852]: I1216 12:22:57.095268 2852 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 16 12:22:57.095701 kubelet[2852]: I1216 12:22:57.095646 2852 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 16 12:22:57.095866 kubelet[2852]: I1216 12:22:57.095856 2852 reconciler.go:26] "Reconciler: start to sync state"
Dec 16 12:22:57.101276 kubelet[2852]: E1216 12:22:57.101244 2852 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-6-119dd6897d\" not found"
Dec 16 12:22:57.103040 kubelet[2852]: I1216 12:22:57.103005 2852 server.go:317] "Adding debug handlers to kubelet server"
Dec 16 12:22:57.106111 kubelet[2852]: I1216 12:22:57.105313 2852 factory.go:223] Registration of the systemd container factory successfully
Dec 16 12:22:57.106233 kubelet[2852]: I1216 12:22:57.106211 2852 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 16 12:22:57.108137 kubelet[2852]: I1216 12:22:57.108111 2852 factory.go:223] Registration of the containerd container factory successfully
Dec 16 12:22:57.129015 kubelet[2852]: I1216 12:22:57.128960 2852 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Dec 16 12:22:57.130237 kubelet[2852]: I1216 12:22:57.129911 2852 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Dec 16 12:22:57.130237 kubelet[2852]: I1216 12:22:57.129953 2852 status_manager.go:230] "Starting to sync pod status with apiserver"
Dec 16 12:22:57.130237 kubelet[2852]: I1216 12:22:57.129976 2852 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 16 12:22:57.130237 kubelet[2852]: I1216 12:22:57.129982 2852 kubelet.go:2436] "Starting kubelet main sync loop"
Dec 16 12:22:57.130237 kubelet[2852]: E1216 12:22:57.130038 2852 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 12:22:57.149806 kubelet[2852]: I1216 12:22:57.149781 2852 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 16 12:22:57.149806 kubelet[2852]: I1216 12:22:57.149797 2852 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 16 12:22:57.149806 kubelet[2852]: I1216 12:22:57.149816 2852 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 12:22:57.149979 kubelet[2852]: I1216 12:22:57.149928 2852 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Dec 16 12:22:57.149979 kubelet[2852]: I1216 12:22:57.149938 2852 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Dec 16 12:22:57.149979 kubelet[2852]: I1216 12:22:57.149953 2852 policy_none.go:49] "None policy: Start"
Dec 16 12:22:57.149979 kubelet[2852]: I1216 12:22:57.149969 2852 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 16 12:22:57.149979 kubelet[2852]: I1216 12:22:57.149978 2852 state_mem.go:35] "Initializing new in-memory state store"
Dec 16 12:22:57.150077 kubelet[2852]: I1216 12:22:57.150054 2852 state_mem.go:75] "Updated machine memory state"
Dec 16 12:22:57.154590 kubelet[2852]: E1216 12:22:57.154562 2852 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 16 12:22:57.154741 kubelet[2852]: I1216 12:22:57.154728 2852 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 12:22:57.154771 kubelet[2852]: I1216 12:22:57.154742 2852 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 12:22:57.155213 kubelet[2852]: I1216 12:22:57.155142 2852 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 12:22:57.155919 kubelet[2852]: E1216 12:22:57.155901 2852 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 16 12:22:57.230726 kubelet[2852]: I1216 12:22:57.230688 2852 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.230726 kubelet[2852]: I1216 12:22:57.230874 2852 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.230726 kubelet[2852]: I1216 12:22:57.230888 2852 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.239863 kubelet[2852]: E1216 12:22:57.239832 2852 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.258063 kubelet[2852]: I1216 12:22:57.258040 2852 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.269899 kubelet[2852]: I1216 12:22:57.269868 2852 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.270035 kubelet[2852]: I1216 12:22:57.269952 2852 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.297741 kubelet[2852]: I1216 12:22:57.297691 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.297741 kubelet[2852]: I1216 12:22:57.297737 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8568293b24affd0104e0c8f05f4866b5-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-6-119dd6897d\" (UID: \"8568293b24affd0104e0c8f05f4866b5\") " pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.297741 kubelet[2852]: I1216 12:22:57.297758 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4d42e56df62129c18da75f0f7e0999ca-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" (UID: \"4d42e56df62129c18da75f0f7e0999ca\") " pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.298063 kubelet[2852]: I1216 12:22:57.297774 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4d42e56df62129c18da75f0f7e0999ca-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" (UID: \"4d42e56df62129c18da75f0f7e0999ca\") " pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.298063 kubelet[2852]: I1216 12:22:57.297791 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4d42e56df62129c18da75f0f7e0999ca-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" (UID: \"4d42e56df62129c18da75f0f7e0999ca\") " pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.298063 kubelet[2852]: I1216 12:22:57.297808 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.298063 kubelet[2852]: I1216 12:22:57.297825 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.298063 kubelet[2852]: I1216 12:22:57.297843 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:57.298445 kubelet[2852]: I1216 12:22:57.297887 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7690d397bb2e072a72b2ca91e40bddc9-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" (UID: \"7690d397bb2e072a72b2ca91e40bddc9\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.090228 kubelet[2852]: I1216 12:22:58.086500 2852 apiserver.go:52] "Watching apiserver"
Dec 16 12:22:58.096375 kubelet[2852]: I1216 12:22:58.096317 2852 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 16 12:22:58.148440 kubelet[2852]: I1216 12:22:58.147997 2852 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.148440 kubelet[2852]: I1216 12:22:58.148188 2852 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.148440 kubelet[2852]: I1216 12:22:58.148398 2852 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.156462 kubelet[2852]: E1216 12:22:58.156420 2852 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-6-119dd6897d\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.156697 kubelet[2852]: E1216 12:22:58.156429 2852 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-6-119dd6897d\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.157207 kubelet[2852]: E1216 12:22:58.157083 2852 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-6-119dd6897d\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d"
Dec 16 12:22:58.181851 kubelet[2852]: I1216 12:22:58.181775 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-6-119dd6897d" podStartSLOduration=1.181755442 podStartE2EDuration="1.181755442s" podCreationTimestamp="2025-12-16 12:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC"
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:22:58.170822826 +0000 UTC m=+1.486427553" watchObservedRunningTime="2025-12-16 12:22:58.181755442 +0000 UTC m=+1.497360169" Dec 16 12:22:58.192241 kubelet[2852]: I1216 12:22:58.192167 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-6-119dd6897d" podStartSLOduration=1.192149495 podStartE2EDuration="1.192149495s" podCreationTimestamp="2025-12-16 12:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:22:58.181927483 +0000 UTC m=+1.497532210" watchObservedRunningTime="2025-12-16 12:22:58.192149495 +0000 UTC m=+1.507754222" Dec 16 12:22:58.202606 kubelet[2852]: I1216 12:22:58.202555 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-6-119dd6897d" podStartSLOduration=2.202538188 podStartE2EDuration="2.202538188s" podCreationTimestamp="2025-12-16 12:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:22:58.192033814 +0000 UTC m=+1.507638541" watchObservedRunningTime="2025-12-16 12:22:58.202538188 +0000 UTC m=+1.518142915" Dec 16 12:22:58.681844 update_engine[1639]: I20251216 12:22:58.681762 1639 update_attempter.cc:509] Updating boot flags... Dec 16 12:23:02.631937 kubelet[2852]: I1216 12:23:02.631896 2852 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:23:02.632766 kubelet[2852]: I1216 12:23:02.632374 2852 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:23:02.632797 containerd[1660]: time="2025-12-16T12:23:02.632182460Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:23:03.407009 systemd[1]: Created slice kubepods-besteffort-pod91a84111_74ac_4118_970c_9ca9a6aead47.slice - libcontainer container kubepods-besteffort-pod91a84111_74ac_4118_970c_9ca9a6aead47.slice. 
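[Annotation] Context for the kuberuntime_manager.go/kubelet_network.go entries above: at 12:23:02 the kubelet pushes the node's pod CIDR (192.168.0.0/24) to the container runtime, and containerd replies that no CNI config exists yet, so it waits for a network component to drop one into its config directory. The sketch below only illustrates the kind of conflist such a component would write; the bridge/host-local plugin pair, the "podnet" name, and the 10-podnet.conflist filename are assumptions borrowed from the CNI reference plugins, not what this node ends up with (the Calico pods started further down install their own config).

```go
package main

import (
	"encoding/json"
	"log"
	"os"
)

func main() {
	// Pod CIDR taken from the kubelet_network.go entry above.
	podCIDR := "192.168.0.0/24"

	// Minimal CNI 1.0 conflist using the reference bridge + host-local
	// plugins; a real network addon (here: Calico) writes its own file.
	conf := map[string]any{
		"cniVersion": "1.0.0",
		"name":       "podnet", // illustrative network name
		"plugins": []any{
			map[string]any{
				"type":      "bridge",
				"bridge":    "cni0",
				"isGateway": true,
				"ipMasq":    true,
				"ipam": map[string]any{
					"type":   "host-local",
					"ranges": []any{[]any{map[string]string{"subnet": podCIDR}}},
					"routes": []any{map[string]string{"dst": "0.0.0.0/0"}},
				},
			},
		},
	}

	out, err := json.MarshalIndent(conf, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	// Default containerd CNI config dir; the filename is hypothetical.
	if err := os.WriteFile("/etc/cni/net.d/10-podnet.conflist", append(out, '\n'), 0o644); err != nil {
		log.Fatal(err)
	}
}
```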
Dec 16 12:23:03.439277 kubelet[2852]: I1216 12:23:03.439191 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/91a84111-74ac-4118-970c-9ca9a6aead47-kube-proxy\") pod \"kube-proxy-gjpzk\" (UID: \"91a84111-74ac-4118-970c-9ca9a6aead47\") " pod="kube-system/kube-proxy-gjpzk" Dec 16 12:23:03.439277 kubelet[2852]: I1216 12:23:03.439250 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/91a84111-74ac-4118-970c-9ca9a6aead47-xtables-lock\") pod \"kube-proxy-gjpzk\" (UID: \"91a84111-74ac-4118-970c-9ca9a6aead47\") " pod="kube-system/kube-proxy-gjpzk" Dec 16 12:23:03.439449 kubelet[2852]: I1216 12:23:03.439291 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91a84111-74ac-4118-970c-9ca9a6aead47-lib-modules\") pod \"kube-proxy-gjpzk\" (UID: \"91a84111-74ac-4118-970c-9ca9a6aead47\") " pod="kube-system/kube-proxy-gjpzk" Dec 16 12:23:03.439449 kubelet[2852]: I1216 12:23:03.439308 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccnmv\" (UniqueName: \"kubernetes.io/projected/91a84111-74ac-4118-970c-9ca9a6aead47-kube-api-access-ccnmv\") pod \"kube-proxy-gjpzk\" (UID: \"91a84111-74ac-4118-970c-9ca9a6aead47\") " pod="kube-system/kube-proxy-gjpzk" Dec 16 12:23:03.719683 containerd[1660]: time="2025-12-16T12:23:03.719579685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gjpzk,Uid:91a84111-74ac-4118-970c-9ca9a6aead47,Namespace:kube-system,Attempt:0,}" Dec 16 12:23:03.738030 containerd[1660]: time="2025-12-16T12:23:03.737975419Z" level=info msg="connecting to shim 62831b8de2ff4f1ddedf4b6c475789fab505f5124a12de0484a4f2a436b8bd35" address="unix:///run/containerd/s/bc4b390f25ac734107217a36e384b447a29812458ed6c74f4d37e438cab7a5fd" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:03.767531 systemd[1]: Started cri-containerd-62831b8de2ff4f1ddedf4b6c475789fab505f5124a12de0484a4f2a436b8bd35.scope - libcontainer container 62831b8de2ff4f1ddedf4b6c475789fab505f5124a12de0484a4f2a436b8bd35. Dec 16 12:23:03.800795 containerd[1660]: time="2025-12-16T12:23:03.800736659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gjpzk,Uid:91a84111-74ac-4118-970c-9ca9a6aead47,Namespace:kube-system,Attempt:0,} returns sandbox id \"62831b8de2ff4f1ddedf4b6c475789fab505f5124a12de0484a4f2a436b8bd35\"" Dec 16 12:23:03.812387 containerd[1660]: time="2025-12-16T12:23:03.812333758Z" level=info msg="CreateContainer within sandbox \"62831b8de2ff4f1ddedf4b6c475789fab505f5124a12de0484a4f2a436b8bd35\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:23:03.833037 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1388852187.mount: Deactivated successfully. 
Dec 16 12:23:03.835972 containerd[1660]: time="2025-12-16T12:23:03.835918719Z" level=info msg="Container 008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:03.841624 kubelet[2852]: I1216 12:23:03.841567 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e14e5b12-4666-48e8-9acb-a2b529d7ed68-var-lib-calico\") pod \"tigera-operator-7dcd859c48-ttqrb\" (UID: \"e14e5b12-4666-48e8-9acb-a2b529d7ed68\") " pod="tigera-operator/tigera-operator-7dcd859c48-ttqrb" Dec 16 12:23:03.841624 kubelet[2852]: I1216 12:23:03.841624 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4mr\" (UniqueName: \"kubernetes.io/projected/e14e5b12-4666-48e8-9acb-a2b529d7ed68-kube-api-access-rr4mr\") pod \"tigera-operator-7dcd859c48-ttqrb\" (UID: \"e14e5b12-4666-48e8-9acb-a2b529d7ed68\") " pod="tigera-operator/tigera-operator-7dcd859c48-ttqrb" Dec 16 12:23:03.843058 systemd[1]: Created slice kubepods-besteffort-pode14e5b12_4666_48e8_9acb_a2b529d7ed68.slice - libcontainer container kubepods-besteffort-pode14e5b12_4666_48e8_9acb_a2b529d7ed68.slice. Dec 16 12:23:03.853330 containerd[1660]: time="2025-12-16T12:23:03.852528563Z" level=info msg="CreateContainer within sandbox \"62831b8de2ff4f1ddedf4b6c475789fab505f5124a12de0484a4f2a436b8bd35\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d\"" Dec 16 12:23:03.854961 containerd[1660]: time="2025-12-16T12:23:03.854925776Z" level=info msg="StartContainer for \"008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d\"" Dec 16 12:23:03.862580 containerd[1660]: time="2025-12-16T12:23:03.862535934Z" level=info msg="connecting to shim 008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d" address="unix:///run/containerd/s/bc4b390f25ac734107217a36e384b447a29812458ed6c74f4d37e438cab7a5fd" protocol=ttrpc version=3 Dec 16 12:23:03.881502 systemd[1]: Started cri-containerd-008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d.scope - libcontainer container 008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d. 
Dec 16 12:23:03.969609 containerd[1660]: time="2025-12-16T12:23:03.969525800Z" level=info msg="StartContainer for \"008a38dc30794fe60d60231b42d6200290812505592181b3871e597eb35d3c0d\" returns successfully" Dec 16 12:23:04.151298 containerd[1660]: time="2025-12-16T12:23:04.150416602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-ttqrb,Uid:e14e5b12-4666-48e8-9acb-a2b529d7ed68,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:23:04.170309 containerd[1660]: time="2025-12-16T12:23:04.170233463Z" level=info msg="connecting to shim a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61" address="unix:///run/containerd/s/f7d6197acf7bb0c99701516aa082bef04efda8773f8114603ec7002d9252426f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:04.173077 kubelet[2852]: I1216 12:23:04.173017 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gjpzk" podStartSLOduration=1.172999918 podStartE2EDuration="1.172999918s" podCreationTimestamp="2025-12-16 12:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:23:04.172832277 +0000 UTC m=+7.488437004" watchObservedRunningTime="2025-12-16 12:23:04.172999918 +0000 UTC m=+7.488604605" Dec 16 12:23:04.207441 systemd[1]: Started cri-containerd-a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61.scope - libcontainer container a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61. Dec 16 12:23:04.239375 containerd[1660]: time="2025-12-16T12:23:04.239323976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-ttqrb,Uid:e14e5b12-4666-48e8-9acb-a2b529d7ed68,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61\"" Dec 16 12:23:04.241055 containerd[1660]: time="2025-12-16T12:23:04.241024944Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:23:06.016886 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount57989548.mount: Deactivated successfully. 
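[Annotation] The PullImage entry above starts a fetch of quay.io/tigera/operator:v1.38.7 through containerd, in the same "k8s.io" namespace the shim messages mention. For illustration only, an equivalent pull through the containerd 1.x Go client looks roughly like this (module paths and options as in the v1 client API; containerd 2.x moved the client package under a /v2 path):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Same socket the CRI uses; all Kubernetes-managed images and
	// containers live in the "k8s.io" containerd namespace, which is
	// why the shim log lines above carry namespace=k8s.io.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull and unpack the operator image named in the PullImage entry.
	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.38.7", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), "digest", img.Target().Digest)
}
```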
Dec 16 12:23:06.845607 containerd[1660]: time="2025-12-16T12:23:06.845523226Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:06.846500 containerd[1660]: time="2025-12-16T12:23:06.846473991Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 16 12:23:06.847711 containerd[1660]: time="2025-12-16T12:23:06.847651997Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:06.850141 containerd[1660]: time="2025-12-16T12:23:06.850087849Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:06.850849 containerd[1660]: time="2025-12-16T12:23:06.850814533Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.609755788s" Dec 16 12:23:06.850849 containerd[1660]: time="2025-12-16T12:23:06.850847973Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 16 12:23:06.856543 containerd[1660]: time="2025-12-16T12:23:06.856513802Z" level=info msg="CreateContainer within sandbox \"a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:23:06.864738 containerd[1660]: time="2025-12-16T12:23:06.864699283Z" level=info msg="Container 6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:06.872172 containerd[1660]: time="2025-12-16T12:23:06.872108721Z" level=info msg="CreateContainer within sandbox \"a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\"" Dec 16 12:23:06.872773 containerd[1660]: time="2025-12-16T12:23:06.872616564Z" level=info msg="StartContainer for \"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\"" Dec 16 12:23:06.875230 containerd[1660]: time="2025-12-16T12:23:06.875169737Z" level=info msg="connecting to shim 6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d" address="unix:///run/containerd/s/f7d6197acf7bb0c99701516aa082bef04efda8773f8114603ec7002d9252426f" protocol=ttrpc version=3 Dec 16 12:23:06.896514 systemd[1]: Started cri-containerd-6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d.scope - libcontainer container 6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d. 
Dec 16 12:23:06.922975 containerd[1660]: time="2025-12-16T12:23:06.922936140Z" level=info msg="StartContainer for \"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\" returns successfully" Dec 16 12:23:07.191301 kubelet[2852]: I1216 12:23:07.190477 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-ttqrb" podStartSLOduration=1.579482031 podStartE2EDuration="4.190462105s" podCreationTimestamp="2025-12-16 12:23:03 +0000 UTC" firstStartedPulling="2025-12-16 12:23:04.240675343 +0000 UTC m=+7.556280070" lastFinishedPulling="2025-12-16 12:23:06.851655377 +0000 UTC m=+10.167260144" observedRunningTime="2025-12-16 12:23:07.178706085 +0000 UTC m=+10.494310772" watchObservedRunningTime="2025-12-16 12:23:07.190462105 +0000 UTC m=+10.506066832" Dec 16 12:23:12.492141 sudo[1902]: pam_unix(sudo:session): session closed for user root Dec 16 12:23:12.663221 sshd[1901]: Connection closed by 139.178.68.195 port 34138 Dec 16 12:23:12.663408 sshd-session[1898]: pam_unix(sshd:session): session closed for user core Dec 16 12:23:12.668458 systemd-logind[1637]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:23:12.668635 systemd[1]: sshd@6-10.0.23.32:22-139.178.68.195:34138.service: Deactivated successfully. Dec 16 12:23:12.672045 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:23:12.672350 systemd[1]: session-7.scope: Consumed 5.320s CPU time, 226.1M memory peak. Dec 16 12:23:12.674999 systemd-logind[1637]: Removed session 7. Dec 16 12:23:19.563548 systemd[1]: Created slice kubepods-besteffort-pod078f3197_47e9_4264_8eaa_810c9417ed45.slice - libcontainer container kubepods-besteffort-pod078f3197_47e9_4264_8eaa_810c9417ed45.slice. Dec 16 12:23:19.629311 kubelet[2852]: I1216 12:23:19.629246 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888xm\" (UniqueName: \"kubernetes.io/projected/078f3197-47e9-4264-8eaa-810c9417ed45-kube-api-access-888xm\") pod \"calico-typha-65f777d876-vhlmj\" (UID: \"078f3197-47e9-4264-8eaa-810c9417ed45\") " pod="calico-system/calico-typha-65f777d876-vhlmj" Dec 16 12:23:19.629311 kubelet[2852]: I1216 12:23:19.629302 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/078f3197-47e9-4264-8eaa-810c9417ed45-typha-certs\") pod \"calico-typha-65f777d876-vhlmj\" (UID: \"078f3197-47e9-4264-8eaa-810c9417ed45\") " pod="calico-system/calico-typha-65f777d876-vhlmj" Dec 16 12:23:19.629823 kubelet[2852]: I1216 12:23:19.629327 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/078f3197-47e9-4264-8eaa-810c9417ed45-tigera-ca-bundle\") pod \"calico-typha-65f777d876-vhlmj\" (UID: \"078f3197-47e9-4264-8eaa-810c9417ed45\") " pod="calico-system/calico-typha-65f777d876-vhlmj" Dec 16 12:23:19.757795 systemd[1]: Created slice kubepods-besteffort-pod17c31053_b610_43b2_8615_0933aa8b4a6a.slice - libcontainer container kubepods-besteffort-pod17c31053_b610_43b2_8615_0933aa8b4a6a.slice. 
Dec 16 12:23:19.868326 containerd[1660]: time="2025-12-16T12:23:19.868218634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65f777d876-vhlmj,Uid:078f3197-47e9-4264-8eaa-810c9417ed45,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:19.887825 containerd[1660]: time="2025-12-16T12:23:19.887343332Z" level=info msg="connecting to shim 917807dda1ea4dc35821fee18a8625da78b516bb3f25fb888b5eea3de53ebc6a" address="unix:///run/containerd/s/67da50e14d25f6a0ff0a2c8f63bcaab9ef76255a0529a9243a6b1617f28d12ed" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:19.913636 systemd[1]: Started cri-containerd-917807dda1ea4dc35821fee18a8625da78b516bb3f25fb888b5eea3de53ebc6a.scope - libcontainer container 917807dda1ea4dc35821fee18a8625da78b516bb3f25fb888b5eea3de53ebc6a. Dec 16 12:23:19.927524 kubelet[2852]: E1216 12:23:19.927476 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:19.930362 kubelet[2852]: I1216 12:23:19.930003 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17c31053-b610-43b2-8615-0933aa8b4a6a-tigera-ca-bundle\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930362 kubelet[2852]: I1216 12:23:19.930045 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-xtables-lock\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930362 kubelet[2852]: I1216 12:23:19.930063 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-cni-bin-dir\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930362 kubelet[2852]: I1216 12:23:19.930088 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-cni-net-dir\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930362 kubelet[2852]: I1216 12:23:19.930145 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqmgq\" (UniqueName: \"kubernetes.io/projected/17c31053-b610-43b2-8615-0933aa8b4a6a-kube-api-access-gqmgq\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930650 kubelet[2852]: I1216 12:23:19.930165 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-flexvol-driver-host\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930650 kubelet[2852]: I1216 
12:23:19.930184 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-var-run-calico\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930650 kubelet[2852]: I1216 12:23:19.930215 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/17c31053-b610-43b2-8615-0933aa8b4a6a-node-certs\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930650 kubelet[2852]: I1216 12:23:19.930244 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-lib-modules\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930650 kubelet[2852]: I1216 12:23:19.930260 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-cni-log-dir\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930799 kubelet[2852]: I1216 12:23:19.930274 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-policysync\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.930799 kubelet[2852]: I1216 12:23:19.930293 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/17c31053-b610-43b2-8615-0933aa8b4a6a-var-lib-calico\") pod \"calico-node-29tl8\" (UID: \"17c31053-b610-43b2-8615-0933aa8b4a6a\") " pod="calico-system/calico-node-29tl8" Dec 16 12:23:19.970615 containerd[1660]: time="2025-12-16T12:23:19.970568476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65f777d876-vhlmj,Uid:078f3197-47e9-4264-8eaa-810c9417ed45,Namespace:calico-system,Attempt:0,} returns sandbox id \"917807dda1ea4dc35821fee18a8625da78b516bb3f25fb888b5eea3de53ebc6a\"" Dec 16 12:23:19.972993 containerd[1660]: time="2025-12-16T12:23:19.972834128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:23:20.030702 kubelet[2852]: I1216 12:23:20.030647 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/426e2a8a-5d6e-4966-b145-015ce9ecbfa0-kubelet-dir\") pod \"csi-node-driver-w78fj\" (UID: \"426e2a8a-5d6e-4966-b145-015ce9ecbfa0\") " pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:20.030702 kubelet[2852]: I1216 12:23:20.030701 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/426e2a8a-5d6e-4966-b145-015ce9ecbfa0-varrun\") pod \"csi-node-driver-w78fj\" (UID: \"426e2a8a-5d6e-4966-b145-015ce9ecbfa0\") " pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:20.030853 kubelet[2852]: I1216 
12:23:20.030782 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/426e2a8a-5d6e-4966-b145-015ce9ecbfa0-registration-dir\") pod \"csi-node-driver-w78fj\" (UID: \"426e2a8a-5d6e-4966-b145-015ce9ecbfa0\") " pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:20.030853 kubelet[2852]: I1216 12:23:20.030801 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhdf\" (UniqueName: \"kubernetes.io/projected/426e2a8a-5d6e-4966-b145-015ce9ecbfa0-kube-api-access-tmhdf\") pod \"csi-node-driver-w78fj\" (UID: \"426e2a8a-5d6e-4966-b145-015ce9ecbfa0\") " pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:20.030853 kubelet[2852]: I1216 12:23:20.030847 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/426e2a8a-5d6e-4966-b145-015ce9ecbfa0-socket-dir\") pod \"csi-node-driver-w78fj\" (UID: \"426e2a8a-5d6e-4966-b145-015ce9ecbfa0\") " pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:20.032817 kubelet[2852]: E1216 12:23:20.032675 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:20.032817 kubelet[2852]: W1216 12:23:20.032698 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:20.032817 kubelet[2852]: E1216 12:23:20.032728 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:20.062869 containerd[1660]: time="2025-12-16T12:23:20.062771786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-29tl8,Uid:17c31053-b610-43b2-8615-0933aa8b4a6a,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:20.083736 containerd[1660]: time="2025-12-16T12:23:20.083681413Z" level=info msg="connecting to shim b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18" address="unix:///run/containerd/s/fd846a2deafb2219871e56fd5afe0f157b1d30a926d5e63b51f6fae54cb89ff7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:20.119510 systemd[1]: Started cri-containerd-b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18.scope - libcontainer container b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18. Dec 16 12:23:20.146666 containerd[1660]: time="2025-12-16T12:23:20.146623614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-29tl8,Uid:17c31053-b610-43b2-8615-0933aa8b4a6a,Namespace:calico-system,Attempt:0,} returns sandbox id \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\"" Dec 16 12:23:21.515588 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3246155077.mount: Deactivated successfully.
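[Annotation] The driver-call.go/plugins.go error triplet above is emitted every time the kubelet's FlexVolume prober executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init: the binary is not installed yet, so the prober reads empty output and its JSON decode fails with "unexpected end of JSON input". The calico-node pod mounts flexvol-driver-host for exactly this reason: its init container installs the driver into that host directory, after which the probe succeeds. As a rough sketch of the handshake the prober expects (not Calico's actual driver), a FlexVolume binary answers init with a small JSON status object on stdout:

```go
package main

import (
	"encoding/json"
	"os"
)

// driverStatus mirrors the reply shape the kubelet's FlexVolume prober
// parses; an empty reply is what produces "unexpected end of JSON input".
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 || os.Args[1] != "init" {
		// This sketch handles only the init probe.
		json.NewEncoder(os.Stdout).Encode(driverStatus{Status: "Not supported"})
		os.Exit(1)
	}
	// Minimal successful init reply: the driver exists and does not
	// implement attach/detach (the nodeagent~uds driver only hands a
	// Unix domain socket to pods).
	json.NewEncoder(os.Stdout).Encode(driverStatus{
		Status:       "Success",
		Capabilities: map[string]bool{"attach": false},
	})
}
```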
Dec 16 12:23:22.131147 kubelet[2852]: E1216 12:23:22.131100 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:22.435555 containerd[1660]: time="2025-12-16T12:23:22.435402566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:22.437139 containerd[1660]: time="2025-12-16T12:23:22.436894093Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 16 12:23:22.438007 containerd[1660]: time="2025-12-16T12:23:22.437961939Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:22.440621 containerd[1660]: time="2025-12-16T12:23:22.440577872Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:22.441174 containerd[1660]: time="2025-12-16T12:23:22.441138835Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.468113666s" Dec 16 12:23:22.441174 containerd[1660]: time="2025-12-16T12:23:22.441171195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 12:23:22.442612 containerd[1660]: time="2025-12-16T12:23:22.442505122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:23:22.454803 containerd[1660]: time="2025-12-16T12:23:22.454765465Z" level=info msg="CreateContainer within sandbox \"917807dda1ea4dc35821fee18a8625da78b516bb3f25fb888b5eea3de53ebc6a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:23:22.461230 containerd[1660]: time="2025-12-16T12:23:22.460974096Z" level=info msg="Container 627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:22.469378 containerd[1660]: time="2025-12-16T12:23:22.469332659Z" level=info msg="CreateContainer within sandbox \"917807dda1ea4dc35821fee18a8625da78b516bb3f25fb888b5eea3de53ebc6a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85\"" Dec 16 12:23:22.471330 containerd[1660]: time="2025-12-16T12:23:22.471289669Z" level=info msg="StartContainer for \"627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85\"" Dec 16 12:23:22.472412 containerd[1660]: time="2025-12-16T12:23:22.472384874Z" level=info msg="connecting to shim 627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85" address="unix:///run/containerd/s/67da50e14d25f6a0ff0a2c8f63bcaab9ef76255a0529a9243a6b1617f28d12ed" protocol=ttrpc version=3 Dec 16 12:23:22.491368 systemd[1]: Started 
cri-containerd-627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85.scope - libcontainer container 627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85. Dec 16 12:23:22.529494 containerd[1660]: time="2025-12-16T12:23:22.529452685Z" level=info msg="StartContainer for \"627582e3a7fdd63aa65f18fe3c41a2419ce506a35de8256dd3facfe4cb8bef85\" returns successfully" Dec 16 12:23:23.217993 kubelet[2852]: I1216 12:23:23.217914 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-65f777d876-vhlmj" podStartSLOduration=1.74806488 podStartE2EDuration="4.217898116s" podCreationTimestamp="2025-12-16 12:23:19 +0000 UTC" firstStartedPulling="2025-12-16 12:23:19.972178524 +0000 UTC m=+23.287783251" lastFinishedPulling="2025-12-16 12:23:22.44201176 +0000 UTC m=+25.757616487" observedRunningTime="2025-12-16 12:23:23.217718315 +0000 UTC m=+26.533323042" watchObservedRunningTime="2025-12-16 12:23:23.217898116 +0000 UTC m=+26.533502803" Dec 16 12:23:23.253030 kubelet[2852]: E1216 12:23:23.252938 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.253030 kubelet[2852]: W1216 12:23:23.252966 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.253030 kubelet[2852]: E1216 12:23:23.253004 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.253260 kubelet[2852]: E1216 12:23:23.253243 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.253299 kubelet[2852]: W1216 12:23:23.253255 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.253325 kubelet[2852]: E1216 12:23:23.253301 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.253449 kubelet[2852]: E1216 12:23:23.253437 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.253478 kubelet[2852]: W1216 12:23:23.253447 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.253478 kubelet[2852]: E1216 12:23:23.253469 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.253622 kubelet[2852]: E1216 12:23:23.253603 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.253647 kubelet[2852]: W1216 12:23:23.253622 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.253647 kubelet[2852]: E1216 12:23:23.253631 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.253792 kubelet[2852]: E1216 12:23:23.253782 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.253814 kubelet[2852]: W1216 12:23:23.253792 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.253814 kubelet[2852]: E1216 12:23:23.253800 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.253941 kubelet[2852]: E1216 12:23:23.253931 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.253941 kubelet[2852]: W1216 12:23:23.253940 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.253979 kubelet[2852]: E1216 12:23:23.253948 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.254078 kubelet[2852]: E1216 12:23:23.254064 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.254107 kubelet[2852]: W1216 12:23:23.254085 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.254107 kubelet[2852]: E1216 12:23:23.254094 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.254255 kubelet[2852]: E1216 12:23:23.254245 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.254293 kubelet[2852]: W1216 12:23:23.254255 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.254293 kubelet[2852]: E1216 12:23:23.254264 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.254419 kubelet[2852]: E1216 12:23:23.254409 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.254442 kubelet[2852]: W1216 12:23:23.254419 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.254442 kubelet[2852]: E1216 12:23:23.254427 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.254560 kubelet[2852]: E1216 12:23:23.254551 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.254586 kubelet[2852]: W1216 12:23:23.254560 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.254586 kubelet[2852]: E1216 12:23:23.254568 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.254708 kubelet[2852]: E1216 12:23:23.254698 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.254735 kubelet[2852]: W1216 12:23:23.254708 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.254735 kubelet[2852]: E1216 12:23:23.254715 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.254854 kubelet[2852]: E1216 12:23:23.254844 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.254876 kubelet[2852]: W1216 12:23:23.254854 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.254876 kubelet[2852]: E1216 12:23:23.254863 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.255009 kubelet[2852]: E1216 12:23:23.254999 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.255031 kubelet[2852]: W1216 12:23:23.255009 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.255031 kubelet[2852]: E1216 12:23:23.255017 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.255164 kubelet[2852]: E1216 12:23:23.255154 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.255186 kubelet[2852]: W1216 12:23:23.255164 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.255186 kubelet[2852]: E1216 12:23:23.255172 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.255342 kubelet[2852]: E1216 12:23:23.255332 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.255367 kubelet[2852]: W1216 12:23:23.255342 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.255367 kubelet[2852]: E1216 12:23:23.255350 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.258980 kubelet[2852]: E1216 12:23:23.258952 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.258980 kubelet[2852]: W1216 12:23:23.258969 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.258980 kubelet[2852]: E1216 12:23:23.258981 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.259165 kubelet[2852]: E1216 12:23:23.259139 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.259165 kubelet[2852]: W1216 12:23:23.259150 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.259165 kubelet[2852]: E1216 12:23:23.259158 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.259506 kubelet[2852]: E1216 12:23:23.259447 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.259506 kubelet[2852]: W1216 12:23:23.259460 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.259506 kubelet[2852]: E1216 12:23:23.259469 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.259743 kubelet[2852]: E1216 12:23:23.259713 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.259743 kubelet[2852]: W1216 12:23:23.259733 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.259790 kubelet[2852]: E1216 12:23:23.259746 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.259899 kubelet[2852]: E1216 12:23:23.259888 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.259899 kubelet[2852]: W1216 12:23:23.259898 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.259951 kubelet[2852]: E1216 12:23:23.259907 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.260048 kubelet[2852]: E1216 12:23:23.260037 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.260074 kubelet[2852]: W1216 12:23:23.260047 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.260074 kubelet[2852]: E1216 12:23:23.260056 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.260234 kubelet[2852]: E1216 12:23:23.260224 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.260258 kubelet[2852]: W1216 12:23:23.260234 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.260258 kubelet[2852]: E1216 12:23:23.260243 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.260508 kubelet[2852]: E1216 12:23:23.260476 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.260508 kubelet[2852]: W1216 12:23:23.260495 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.260508 kubelet[2852]: E1216 12:23:23.260506 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.260661 kubelet[2852]: E1216 12:23:23.260650 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.260661 kubelet[2852]: W1216 12:23:23.260660 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.260708 kubelet[2852]: E1216 12:23:23.260669 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.260826 kubelet[2852]: E1216 12:23:23.260816 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.260849 kubelet[2852]: W1216 12:23:23.260827 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.260849 kubelet[2852]: E1216 12:23:23.260837 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.261011 kubelet[2852]: E1216 12:23:23.260999 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.261037 kubelet[2852]: W1216 12:23:23.261010 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.261037 kubelet[2852]: E1216 12:23:23.261019 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.261200 kubelet[2852]: E1216 12:23:23.261183 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.261243 kubelet[2852]: W1216 12:23:23.261232 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.261271 kubelet[2852]: E1216 12:23:23.261245 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.261435 kubelet[2852]: E1216 12:23:23.261423 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.261459 kubelet[2852]: W1216 12:23:23.261435 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.261459 kubelet[2852]: E1216 12:23:23.261443 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.261675 kubelet[2852]: E1216 12:23:23.261663 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.261699 kubelet[2852]: W1216 12:23:23.261675 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.261699 kubelet[2852]: E1216 12:23:23.261686 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.261833 kubelet[2852]: E1216 12:23:23.261825 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.261855 kubelet[2852]: W1216 12:23:23.261834 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.261855 kubelet[2852]: E1216 12:23:23.261841 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.261990 kubelet[2852]: E1216 12:23:23.261980 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.262015 kubelet[2852]: W1216 12:23:23.261991 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.262015 kubelet[2852]: E1216 12:23:23.261998 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.262227 kubelet[2852]: E1216 12:23:23.262210 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.262253 kubelet[2852]: W1216 12:23:23.262227 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.262253 kubelet[2852]: E1216 12:23:23.262240 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:23:23.262437 kubelet[2852]: E1216 12:23:23.262424 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:23:23.262461 kubelet[2852]: W1216 12:23:23.262436 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:23:23.262461 kubelet[2852]: E1216 12:23:23.262446 2852 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:23:23.969105 containerd[1660]: time="2025-12-16T12:23:23.968243903Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:23.969105 containerd[1660]: time="2025-12-16T12:23:23.968886626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 16 12:23:23.969976 containerd[1660]: time="2025-12-16T12:23:23.969947311Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:23.974986 containerd[1660]: time="2025-12-16T12:23:23.974935697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:23.976289 containerd[1660]: time="2025-12-16T12:23:23.976258744Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.53348686s" Dec 16 12:23:23.976289 containerd[1660]: time="2025-12-16T12:23:23.976287704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 12:23:23.980175 containerd[1660]: time="2025-12-16T12:23:23.980140243Z" level=info msg="CreateContainer within sandbox \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:23:23.989884 containerd[1660]: time="2025-12-16T12:23:23.989518931Z" level=info msg="Container 5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:24.001667 containerd[1660]: time="2025-12-16T12:23:24.001614833Z" level=info msg="CreateContainer within sandbox \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048\"" Dec 16 12:23:24.002544 containerd[1660]: time="2025-12-16T12:23:24.002511997Z" level=info msg="StartContainer for \"5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048\"" Dec 16 12:23:24.004503 containerd[1660]: time="2025-12-16T12:23:24.004068725Z" level=info msg="connecting to shim 5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048" address="unix:///run/containerd/s/fd846a2deafb2219871e56fd5afe0f157b1d30a926d5e63b51f6fae54cb89ff7" protocol=ttrpc version=3 Dec 16 12:23:24.034781 systemd[1]: Started cri-containerd-5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048.scope - libcontainer container 5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048. 
Dec 16 12:23:24.106579 containerd[1660]: time="2025-12-16T12:23:24.106468528Z" level=info msg="StartContainer for \"5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048\" returns successfully" Dec 16 12:23:24.119538 systemd[1]: cri-containerd-5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048.scope: Deactivated successfully. Dec 16 12:23:24.122746 containerd[1660]: time="2025-12-16T12:23:24.122712210Z" level=info msg="received container exit event container_id:\"5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048\" id:\"5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048\" pid:3519 exited_at:{seconds:1765887804 nanos:121963767}" Dec 16 12:23:24.130940 kubelet[2852]: E1216 12:23:24.130891 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:24.144153 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5ac42fbdd384bb1fc528f11bc1b353f2a91c7edb7ba207d080a07224e667e048-rootfs.mount: Deactivated successfully. Dec 16 12:23:24.209811 kubelet[2852]: I1216 12:23:24.209777 2852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:23:26.131067 kubelet[2852]: E1216 12:23:26.130998 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:28.131911 kubelet[2852]: E1216 12:23:28.131404 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:28.222812 containerd[1660]: time="2025-12-16T12:23:28.222755480Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:23:30.130702 kubelet[2852]: E1216 12:23:30.130647 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:30.651118 containerd[1660]: time="2025-12-16T12:23:30.650636541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:30.651869 containerd[1660]: time="2025-12-16T12:23:30.651814867Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 16 12:23:30.653114 containerd[1660]: time="2025-12-16T12:23:30.653065674Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:30.655938 containerd[1660]: time="2025-12-16T12:23:30.655908768Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:30.656865 containerd[1660]: time="2025-12-16T12:23:30.656582212Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.433780532s" Dec 16 12:23:30.656865 containerd[1660]: time="2025-12-16T12:23:30.656617052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 16 12:23:30.660840 containerd[1660]: time="2025-12-16T12:23:30.660780593Z" level=info msg="CreateContainer within sandbox \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:23:30.673672 containerd[1660]: time="2025-12-16T12:23:30.673352417Z" level=info msg="Container 49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:30.687182 containerd[1660]: time="2025-12-16T12:23:30.687137288Z" level=info msg="CreateContainer within sandbox \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278\"" Dec 16 12:23:30.687654 containerd[1660]: time="2025-12-16T12:23:30.687623690Z" level=info msg="StartContainer for \"49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278\"" Dec 16 12:23:30.689773 containerd[1660]: time="2025-12-16T12:23:30.689737341Z" level=info msg="connecting to shim 49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278" address="unix:///run/containerd/s/fd846a2deafb2219871e56fd5afe0f157b1d30a926d5e63b51f6fae54cb89ff7" protocol=ttrpc version=3 Dec 16 12:23:30.713413 systemd[1]: Started cri-containerd-49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278.scope - libcontainer container 49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278. Dec 16 12:23:30.802038 containerd[1660]: time="2025-12-16T12:23:30.801981473Z" level=info msg="StartContainer for \"49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278\" returns successfully" Dec 16 12:23:32.063546 systemd[1]: cri-containerd-49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278.scope: Deactivated successfully. Dec 16 12:23:32.064289 systemd[1]: cri-containerd-49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278.scope: Consumed 458ms CPU time, 188.3M memory peak, 165.9M written to disk. Dec 16 12:23:32.065388 containerd[1660]: time="2025-12-16T12:23:32.064889634Z" level=info msg="received container exit event container_id:\"49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278\" id:\"49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278\" pid:3583 exited_at:{seconds:1765887812 nanos:64546512}" Dec 16 12:23:32.086830 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-49752c74e964530b5bff35bc77ae2d3e7b799e68e008f78494cce19a7c444278-rootfs.mount: Deactivated successfully. 
Dec 16 12:23:32.115295 kubelet[2852]: I1216 12:23:32.115035 2852 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 16 12:23:32.136605 systemd[1]: Created slice kubepods-besteffort-pod426e2a8a_5d6e_4966_b145_015ce9ecbfa0.slice - libcontainer container kubepods-besteffort-pod426e2a8a_5d6e_4966_b145_015ce9ecbfa0.slice. Dec 16 12:23:33.440800 containerd[1660]: time="2025-12-16T12:23:33.440692050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w78fj,Uid:426e2a8a-5d6e-4966-b145-015ce9ecbfa0,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:33.493821 systemd[1]: Created slice kubepods-burstable-pod0507b616_0c85_4998_9127_f71b3f5478c8.slice - libcontainer container kubepods-burstable-pod0507b616_0c85_4998_9127_f71b3f5478c8.slice. Dec 16 12:23:33.527358 systemd[1]: Created slice kubepods-burstable-pod13192563_a3ab_4770_a7c2_f05fdc031eaf.slice - libcontainer container kubepods-burstable-pod13192563_a3ab_4770_a7c2_f05fdc031eaf.slice. Dec 16 12:23:33.539704 systemd[1]: Created slice kubepods-besteffort-pod533a88a9_9b7b_4454_afaf_2a43ad5efc0e.slice - libcontainer container kubepods-besteffort-pod533a88a9_9b7b_4454_afaf_2a43ad5efc0e.slice. Dec 16 12:23:33.552695 kubelet[2852]: I1216 12:23:33.552623 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6czrg\" (UniqueName: \"kubernetes.io/projected/8d05b967-cc40-4074-a4a8-7c06f72a502c-kube-api-access-6czrg\") pod \"calico-kube-controllers-6957f646c-stmvl\" (UID: \"8d05b967-cc40-4074-a4a8-7c06f72a502c\") " pod="calico-system/calico-kube-controllers-6957f646c-stmvl" Dec 16 12:23:33.553796 kubelet[2852]: I1216 12:23:33.553684 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d05b967-cc40-4074-a4a8-7c06f72a502c-tigera-ca-bundle\") pod \"calico-kube-controllers-6957f646c-stmvl\" (UID: \"8d05b967-cc40-4074-a4a8-7c06f72a502c\") " pod="calico-system/calico-kube-controllers-6957f646c-stmvl" Dec 16 12:23:33.555562 systemd[1]: Created slice kubepods-besteffort-pod8d05b967_cc40_4074_a4a8_7c06f72a502c.slice - libcontainer container kubepods-besteffort-pod8d05b967_cc40_4074_a4a8_7c06f72a502c.slice. Dec 16 12:23:33.562299 systemd[1]: Created slice kubepods-besteffort-pod93e150fa_0b6c_4bf6_aa66_e7e32f1f3161.slice - libcontainer container kubepods-besteffort-pod93e150fa_0b6c_4bf6_aa66_e7e32f1f3161.slice. Dec 16 12:23:33.568257 systemd[1]: Created slice kubepods-besteffort-pod9ac6813a_66c7_4844_b378_32ad1d5e7849.slice - libcontainer container kubepods-besteffort-pod9ac6813a_66c7_4844_b378_32ad1d5e7849.slice. Dec 16 12:23:33.578116 systemd[1]: Created slice kubepods-besteffort-pod5e6ae6fc_592f_4274_8d1a_ed030be2d900.slice - libcontainer container kubepods-besteffort-pod5e6ae6fc_592f_4274_8d1a_ed030be2d900.slice. 
Dec 16 12:23:33.608681 containerd[1660]: time="2025-12-16T12:23:33.608622986Z" level=error msg="Failed to destroy network for sandbox \"c89e88aff1216f557307a89abb872ad1fd615c32ffb34fa6cfe0476da9cc57da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.613338 containerd[1660]: time="2025-12-16T12:23:33.612185285Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w78fj,Uid:426e2a8a-5d6e-4966-b145-015ce9ecbfa0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c89e88aff1216f557307a89abb872ad1fd615c32ffb34fa6cfe0476da9cc57da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.613419 kubelet[2852]: E1216 12:23:33.612404 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c89e88aff1216f557307a89abb872ad1fd615c32ffb34fa6cfe0476da9cc57da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.613419 kubelet[2852]: E1216 12:23:33.612497 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c89e88aff1216f557307a89abb872ad1fd615c32ffb34fa6cfe0476da9cc57da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:33.613419 kubelet[2852]: E1216 12:23:33.612514 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c89e88aff1216f557307a89abb872ad1fd615c32ffb34fa6cfe0476da9cc57da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-w78fj" Dec 16 12:23:33.610667 systemd[1]: run-netns-cni\x2d10579a99\x2d4587\x2dd68d\x2dddb0\x2dcfd7c3fc8c09.mount: Deactivated successfully. 
Dec 16 12:23:33.613579 kubelet[2852]: E1216 12:23:33.612557 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c89e88aff1216f557307a89abb872ad1fd615c32ffb34fa6cfe0476da9cc57da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:33.654744 kubelet[2852]: I1216 12:23:33.654632 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-backend-key-pair\") pod \"whisker-7f94b8b664-bcw4w\" (UID: \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\") " pod="calico-system/whisker-7f94b8b664-bcw4w" Dec 16 12:23:33.654744 kubelet[2852]: I1216 12:23:33.654672 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg69n\" (UniqueName: \"kubernetes.io/projected/0507b616-0c85-4998-9127-f71b3f5478c8-kube-api-access-wg69n\") pod \"coredns-674b8bbfcf-rnf4t\" (UID: \"0507b616-0c85-4998-9127-f71b3f5478c8\") " pod="kube-system/coredns-674b8bbfcf-rnf4t" Dec 16 12:23:33.654744 kubelet[2852]: I1216 12:23:33.654737 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxqv\" (UniqueName: \"kubernetes.io/projected/533a88a9-9b7b-4454-afaf-2a43ad5efc0e-kube-api-access-9sxqv\") pod \"calico-apiserver-84f977f475-ns4mb\" (UID: \"533a88a9-9b7b-4454-afaf-2a43ad5efc0e\") " pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" Dec 16 12:23:33.654744 kubelet[2852]: I1216 12:23:33.654768 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv8k2\" (UniqueName: \"kubernetes.io/projected/9ac6813a-66c7-4844-b378-32ad1d5e7849-kube-api-access-jv8k2\") pod \"goldmane-666569f655-vhc9k\" (UID: \"9ac6813a-66c7-4844-b378-32ad1d5e7849\") " pod="calico-system/goldmane-666569f655-vhc9k" Dec 16 12:23:33.655496 kubelet[2852]: I1216 12:23:33.655098 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-ca-bundle\") pod \"whisker-7f94b8b664-bcw4w\" (UID: \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\") " pod="calico-system/whisker-7f94b8b664-bcw4w" Dec 16 12:23:33.655496 kubelet[2852]: I1216 12:23:33.655127 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzgh\" (UniqueName: \"kubernetes.io/projected/5e6ae6fc-592f-4274-8d1a-ed030be2d900-kube-api-access-6pzgh\") pod \"whisker-7f94b8b664-bcw4w\" (UID: \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\") " pod="calico-system/whisker-7f94b8b664-bcw4w" Dec 16 12:23:33.655496 kubelet[2852]: I1216 12:23:33.655146 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9ac6813a-66c7-4844-b378-32ad1d5e7849-config\") pod \"goldmane-666569f655-vhc9k\" (UID: \"9ac6813a-66c7-4844-b378-32ad1d5e7849\") " pod="calico-system/goldmane-666569f655-vhc9k" Dec 16 12:23:33.655496 kubelet[2852]: I1216 12:23:33.655165 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkt4q\" (UniqueName: \"kubernetes.io/projected/13192563-a3ab-4770-a7c2-f05fdc031eaf-kube-api-access-pkt4q\") pod \"coredns-674b8bbfcf-kk9ss\" (UID: \"13192563-a3ab-4770-a7c2-f05fdc031eaf\") " pod="kube-system/coredns-674b8bbfcf-kk9ss" Dec 16 12:23:33.655496 kubelet[2852]: I1216 12:23:33.655187 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwpv2\" (UniqueName: \"kubernetes.io/projected/93e150fa-0b6c-4bf6-aa66-e7e32f1f3161-kube-api-access-kwpv2\") pod \"calico-apiserver-84f977f475-gngqq\" (UID: \"93e150fa-0b6c-4bf6-aa66-e7e32f1f3161\") " pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" Dec 16 12:23:33.655615 kubelet[2852]: I1216 12:23:33.655230 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ac6813a-66c7-4844-b378-32ad1d5e7849-goldmane-ca-bundle\") pod \"goldmane-666569f655-vhc9k\" (UID: \"9ac6813a-66c7-4844-b378-32ad1d5e7849\") " pod="calico-system/goldmane-666569f655-vhc9k" Dec 16 12:23:33.655615 kubelet[2852]: I1216 12:23:33.655247 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9ac6813a-66c7-4844-b378-32ad1d5e7849-goldmane-key-pair\") pod \"goldmane-666569f655-vhc9k\" (UID: \"9ac6813a-66c7-4844-b378-32ad1d5e7849\") " pod="calico-system/goldmane-666569f655-vhc9k" Dec 16 12:23:33.655615 kubelet[2852]: I1216 12:23:33.655264 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13192563-a3ab-4770-a7c2-f05fdc031eaf-config-volume\") pod \"coredns-674b8bbfcf-kk9ss\" (UID: \"13192563-a3ab-4770-a7c2-f05fdc031eaf\") " pod="kube-system/coredns-674b8bbfcf-kk9ss" Dec 16 12:23:33.655615 kubelet[2852]: I1216 12:23:33.655280 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/93e150fa-0b6c-4bf6-aa66-e7e32f1f3161-calico-apiserver-certs\") pod \"calico-apiserver-84f977f475-gngqq\" (UID: \"93e150fa-0b6c-4bf6-aa66-e7e32f1f3161\") " pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" Dec 16 12:23:33.655615 kubelet[2852]: I1216 12:23:33.655298 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0507b616-0c85-4998-9127-f71b3f5478c8-config-volume\") pod \"coredns-674b8bbfcf-rnf4t\" (UID: \"0507b616-0c85-4998-9127-f71b3f5478c8\") " pod="kube-system/coredns-674b8bbfcf-rnf4t" Dec 16 12:23:33.655719 kubelet[2852]: I1216 12:23:33.655314 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/533a88a9-9b7b-4454-afaf-2a43ad5efc0e-calico-apiserver-certs\") pod \"calico-apiserver-84f977f475-ns4mb\" (UID: \"533a88a9-9b7b-4454-afaf-2a43ad5efc0e\") " pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" Dec 16 
12:23:33.816030 containerd[1660]: time="2025-12-16T12:23:33.815993644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rnf4t,Uid:0507b616-0c85-4998-9127-f71b3f5478c8,Namespace:kube-system,Attempt:0,}" Dec 16 12:23:33.831904 containerd[1660]: time="2025-12-16T12:23:33.831867245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-kk9ss,Uid:13192563-a3ab-4770-a7c2-f05fdc031eaf,Namespace:kube-system,Attempt:0,}" Dec 16 12:23:33.849469 containerd[1660]: time="2025-12-16T12:23:33.849430254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-ns4mb,Uid:533a88a9-9b7b-4454-afaf-2a43ad5efc0e,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:23:33.860543 containerd[1660]: time="2025-12-16T12:23:33.860146069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6957f646c-stmvl,Uid:8d05b967-cc40-4074-a4a8-7c06f72a502c,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:33.864702 containerd[1660]: time="2025-12-16T12:23:33.864664012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-gngqq,Uid:93e150fa-0b6c-4bf6-aa66-e7e32f1f3161,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:23:33.873140 containerd[1660]: time="2025-12-16T12:23:33.873087695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-vhc9k,Uid:9ac6813a-66c7-4844-b378-32ad1d5e7849,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:33.881292 containerd[1660]: time="2025-12-16T12:23:33.881247337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f94b8b664-bcw4w,Uid:5e6ae6fc-592f-4274-8d1a-ed030be2d900,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:33.885464 containerd[1660]: time="2025-12-16T12:23:33.885406878Z" level=error msg="Failed to destroy network for sandbox \"395a02b6f746cd05022283074c30553c14f00144acd3c4baf7b8789c7e331c32\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.890442 containerd[1660]: time="2025-12-16T12:23:33.889995461Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rnf4t,Uid:0507b616-0c85-4998-9127-f71b3f5478c8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"395a02b6f746cd05022283074c30553c14f00144acd3c4baf7b8789c7e331c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.890842 kubelet[2852]: E1216 12:23:33.890798 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"395a02b6f746cd05022283074c30553c14f00144acd3c4baf7b8789c7e331c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.890914 kubelet[2852]: E1216 12:23:33.890859 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"395a02b6f746cd05022283074c30553c14f00144acd3c4baf7b8789c7e331c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-rnf4t" Dec 16 12:23:33.890914 kubelet[2852]: E1216 12:23:33.890880 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"395a02b6f746cd05022283074c30553c14f00144acd3c4baf7b8789c7e331c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-rnf4t" Dec 16 12:23:33.890973 kubelet[2852]: E1216 12:23:33.890929 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-rnf4t_kube-system(0507b616-0c85-4998-9127-f71b3f5478c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-rnf4t_kube-system(0507b616-0c85-4998-9127-f71b3f5478c8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"395a02b6f746cd05022283074c30553c14f00144acd3c4baf7b8789c7e331c32\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-rnf4t" podUID="0507b616-0c85-4998-9127-f71b3f5478c8" Dec 16 12:23:33.943392 containerd[1660]: time="2025-12-16T12:23:33.943268413Z" level=error msg="Failed to destroy network for sandbox \"9cf935ae5d4bf41aab0ec0c5d64e291a6ab007fed8735322674127b766b6b8fd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.946477 containerd[1660]: time="2025-12-16T12:23:33.946392309Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-kk9ss,Uid:13192563-a3ab-4770-a7c2-f05fdc031eaf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cf935ae5d4bf41aab0ec0c5d64e291a6ab007fed8735322674127b766b6b8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.947158 kubelet[2852]: E1216 12:23:33.946916 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cf935ae5d4bf41aab0ec0c5d64e291a6ab007fed8735322674127b766b6b8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.947158 kubelet[2852]: E1216 12:23:33.946981 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cf935ae5d4bf41aab0ec0c5d64e291a6ab007fed8735322674127b766b6b8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-kk9ss" Dec 16 12:23:33.947158 kubelet[2852]: E1216 12:23:33.947002 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cf935ae5d4bf41aab0ec0c5d64e291a6ab007fed8735322674127b766b6b8fd\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-kk9ss" Dec 16 12:23:33.947387 kubelet[2852]: E1216 12:23:33.947052 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-kk9ss_kube-system(13192563-a3ab-4770-a7c2-f05fdc031eaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-kk9ss_kube-system(13192563-a3ab-4770-a7c2-f05fdc031eaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9cf935ae5d4bf41aab0ec0c5d64e291a6ab007fed8735322674127b766b6b8fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-kk9ss" podUID="13192563-a3ab-4770-a7c2-f05fdc031eaf" Dec 16 12:23:33.949330 containerd[1660]: time="2025-12-16T12:23:33.949252324Z" level=error msg="Failed to destroy network for sandbox \"f573f826b8bc1b7dbba27d527022c513206a15722a65501eb5270383d1321885\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.950706 containerd[1660]: time="2025-12-16T12:23:33.950654091Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6957f646c-stmvl,Uid:8d05b967-cc40-4074-a4a8-7c06f72a502c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f573f826b8bc1b7dbba27d527022c513206a15722a65501eb5270383d1321885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.951404 kubelet[2852]: E1216 12:23:33.950932 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f573f826b8bc1b7dbba27d527022c513206a15722a65501eb5270383d1321885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.951404 kubelet[2852]: E1216 12:23:33.951075 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f573f826b8bc1b7dbba27d527022c513206a15722a65501eb5270383d1321885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" Dec 16 12:23:33.951404 kubelet[2852]: E1216 12:23:33.951096 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f573f826b8bc1b7dbba27d527022c513206a15722a65501eb5270383d1321885\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" Dec 16 12:23:33.951642 kubelet[2852]: E1216 12:23:33.951315 2852 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f573f826b8bc1b7dbba27d527022c513206a15722a65501eb5270383d1321885\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:23:33.953370 containerd[1660]: time="2025-12-16T12:23:33.953319064Z" level=error msg="Failed to destroy network for sandbox \"4ca5a731a945ee3933778ec32bd8f10e5ad8b03a2f346ba4731846e116ea8d36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.954809 containerd[1660]: time="2025-12-16T12:23:33.954723511Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-gngqq,Uid:93e150fa-0b6c-4bf6-aa66-e7e32f1f3161,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ca5a731a945ee3933778ec32bd8f10e5ad8b03a2f346ba4731846e116ea8d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.955031 kubelet[2852]: E1216 12:23:33.954998 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ca5a731a945ee3933778ec32bd8f10e5ad8b03a2f346ba4731846e116ea8d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.955088 kubelet[2852]: E1216 12:23:33.955049 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ca5a731a945ee3933778ec32bd8f10e5ad8b03a2f346ba4731846e116ea8d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" Dec 16 12:23:33.955088 kubelet[2852]: E1216 12:23:33.955076 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ca5a731a945ee3933778ec32bd8f10e5ad8b03a2f346ba4731846e116ea8d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" Dec 16 12:23:33.955157 kubelet[2852]: E1216 12:23:33.955122 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ca5a731a945ee3933778ec32bd8f10e5ad8b03a2f346ba4731846e116ea8d36\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:23:33.963216 containerd[1660]: time="2025-12-16T12:23:33.963031874Z" level=error msg="Failed to destroy network for sandbox \"1ed98bf2657380ff3afd387be0f24e3d8a6ae0ed31d24a806837e1ce2a54553f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.963942 containerd[1660]: time="2025-12-16T12:23:33.963905278Z" level=error msg="Failed to destroy network for sandbox \"8e06f4f73776f3a11433e4c96f835a32de9d9a6ab9aacb65835defeaf235e3bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.968360 containerd[1660]: time="2025-12-16T12:23:33.965506526Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-ns4mb,Uid:533a88a9-9b7b-4454-afaf-2a43ad5efc0e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ed98bf2657380ff3afd387be0f24e3d8a6ae0ed31d24a806837e1ce2a54553f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.968561 kubelet[2852]: E1216 12:23:33.968160 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ed98bf2657380ff3afd387be0f24e3d8a6ae0ed31d24a806837e1ce2a54553f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.968561 kubelet[2852]: E1216 12:23:33.968262 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ed98bf2657380ff3afd387be0f24e3d8a6ae0ed31d24a806837e1ce2a54553f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" Dec 16 12:23:33.968561 kubelet[2852]: E1216 12:23:33.968286 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ed98bf2657380ff3afd387be0f24e3d8a6ae0ed31d24a806837e1ce2a54553f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" Dec 16 12:23:33.968687 kubelet[2852]: E1216 12:23:33.968361 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ed98bf2657380ff3afd387be0f24e3d8a6ae0ed31d24a806837e1ce2a54553f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:23:33.969503 containerd[1660]: time="2025-12-16T12:23:33.969456547Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-vhc9k,Uid:9ac6813a-66c7-4844-b378-32ad1d5e7849,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e06f4f73776f3a11433e4c96f835a32de9d9a6ab9aacb65835defeaf235e3bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.969702 kubelet[2852]: E1216 12:23:33.969669 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e06f4f73776f3a11433e4c96f835a32de9d9a6ab9aacb65835defeaf235e3bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.969767 kubelet[2852]: E1216 12:23:33.969712 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e06f4f73776f3a11433e4c96f835a32de9d9a6ab9aacb65835defeaf235e3bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-vhc9k" Dec 16 12:23:33.969767 kubelet[2852]: E1216 12:23:33.969730 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e06f4f73776f3a11433e4c96f835a32de9d9a6ab9aacb65835defeaf235e3bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-vhc9k" Dec 16 12:23:33.969843 kubelet[2852]: E1216 12:23:33.969785 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e06f4f73776f3a11433e4c96f835a32de9d9a6ab9aacb65835defeaf235e3bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:23:33.980335 containerd[1660]: 
time="2025-12-16T12:23:33.980286962Z" level=error msg="Failed to destroy network for sandbox \"c382f13ced2585e63b449be2cba800adda9e6391d075b446a4a062989e84e04b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.981865 containerd[1660]: time="2025-12-16T12:23:33.981821370Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f94b8b664-bcw4w,Uid:5e6ae6fc-592f-4274-8d1a-ed030be2d900,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c382f13ced2585e63b449be2cba800adda9e6391d075b446a4a062989e84e04b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.982085 kubelet[2852]: E1216 12:23:33.982033 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c382f13ced2585e63b449be2cba800adda9e6391d075b446a4a062989e84e04b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:23:33.982134 kubelet[2852]: E1216 12:23:33.982104 2852 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c382f13ced2585e63b449be2cba800adda9e6391d075b446a4a062989e84e04b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f94b8b664-bcw4w" Dec 16 12:23:33.982134 kubelet[2852]: E1216 12:23:33.982124 2852 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c382f13ced2585e63b449be2cba800adda9e6391d075b446a4a062989e84e04b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f94b8b664-bcw4w" Dec 16 12:23:33.982238 kubelet[2852]: E1216 12:23:33.982169 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7f94b8b664-bcw4w_calico-system(5e6ae6fc-592f-4274-8d1a-ed030be2d900)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7f94b8b664-bcw4w_calico-system(5e6ae6fc-592f-4274-8d1a-ed030be2d900)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c382f13ced2585e63b449be2cba800adda9e6391d075b446a4a062989e84e04b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7f94b8b664-bcw4w" podUID="5e6ae6fc-592f-4274-8d1a-ed030be2d900" Dec 16 12:23:34.241139 containerd[1660]: time="2025-12-16T12:23:34.240843770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:23:39.067185 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4244031178.mount: Deactivated successfully. 
Dec 16 12:23:39.087373 containerd[1660]: time="2025-12-16T12:23:39.087156324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:39.088472 containerd[1660]: time="2025-12-16T12:23:39.088444371Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 16 12:23:39.091688 containerd[1660]: time="2025-12-16T12:23:39.091442866Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:39.094328 containerd[1660]: time="2025-12-16T12:23:39.094294121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:23:39.094857 containerd[1660]: time="2025-12-16T12:23:39.094832363Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.853948712s" Dec 16 12:23:39.094977 containerd[1660]: time="2025-12-16T12:23:39.094962004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 12:23:39.118572 containerd[1660]: time="2025-12-16T12:23:39.118518084Z" level=info msg="CreateContainer within sandbox \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:23:39.131228 containerd[1660]: time="2025-12-16T12:23:39.130669506Z" level=info msg="Container 07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:39.132982 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2304159781.mount: Deactivated successfully. Dec 16 12:23:39.141984 containerd[1660]: time="2025-12-16T12:23:39.141927844Z" level=info msg="CreateContainer within sandbox \"b46c00571b7587e676f4511145451bede3d2a240a1a6d0f68b7b8f271d48ca18\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1\"" Dec 16 12:23:39.142452 containerd[1660]: time="2025-12-16T12:23:39.142368046Z" level=info msg="StartContainer for \"07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1\"" Dec 16 12:23:39.144098 containerd[1660]: time="2025-12-16T12:23:39.144050894Z" level=info msg="connecting to shim 07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1" address="unix:///run/containerd/s/fd846a2deafb2219871e56fd5afe0f157b1d30a926d5e63b51f6fae54cb89ff7" protocol=ttrpc version=3 Dec 16 12:23:39.171638 systemd[1]: Started cri-containerd-07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1.scope - libcontainer container 07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1. Dec 16 12:23:39.267580 containerd[1660]: time="2025-12-16T12:23:39.267528604Z" level=info msg="StartContainer for \"07c51550fb7bb843793f2be88b27c2b1396c22cae6b852ba0a66d3272071e9a1\" returns successfully" Dec 16 12:23:39.404048 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Dec 16 12:23:39.404289 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Dec 16 12:23:39.590922 kubelet[2852]: I1216 12:23:39.590847 2852 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pzgh\" (UniqueName: \"kubernetes.io/projected/5e6ae6fc-592f-4274-8d1a-ed030be2d900-kube-api-access-6pzgh\") pod \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\" (UID: \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\") " Dec 16 12:23:39.590922 kubelet[2852]: I1216 12:23:39.590918 2852 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-backend-key-pair\") pod \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\" (UID: \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\") " Dec 16 12:23:39.591973 kubelet[2852]: I1216 12:23:39.590956 2852 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-ca-bundle\") pod \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\" (UID: \"5e6ae6fc-592f-4274-8d1a-ed030be2d900\") " Dec 16 12:23:39.591973 kubelet[2852]: I1216 12:23:39.591845 2852 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5e6ae6fc-592f-4274-8d1a-ed030be2d900" (UID: "5e6ae6fc-592f-4274-8d1a-ed030be2d900"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:23:39.594298 kubelet[2852]: I1216 12:23:39.594255 2852 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5e6ae6fc-592f-4274-8d1a-ed030be2d900" (UID: "5e6ae6fc-592f-4274-8d1a-ed030be2d900"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:23:39.594673 kubelet[2852]: I1216 12:23:39.594639 2852 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6ae6fc-592f-4274-8d1a-ed030be2d900-kube-api-access-6pzgh" (OuterVolumeSpecName: "kube-api-access-6pzgh") pod "5e6ae6fc-592f-4274-8d1a-ed030be2d900" (UID: "5e6ae6fc-592f-4274-8d1a-ed030be2d900"). InnerVolumeSpecName "kube-api-access-6pzgh".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:23:39.691962 kubelet[2852]: I1216 12:23:39.691914 2852 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6pzgh\" (UniqueName: \"kubernetes.io/projected/5e6ae6fc-592f-4274-8d1a-ed030be2d900-kube-api-access-6pzgh\") on node \"ci-4459-2-2-6-119dd6897d\" DevicePath \"\"" Dec 16 12:23:39.691962 kubelet[2852]: I1216 12:23:39.691950 2852 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-backend-key-pair\") on node \"ci-4459-2-2-6-119dd6897d\" DevicePath \"\"" Dec 16 12:23:39.691962 kubelet[2852]: I1216 12:23:39.691968 2852 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e6ae6fc-592f-4274-8d1a-ed030be2d900-whisker-ca-bundle\") on node \"ci-4459-2-2-6-119dd6897d\" DevicePath \"\"" Dec 16 12:23:40.067991 systemd[1]: var-lib-kubelet-pods-5e6ae6fc\x2d592f\x2d4274\x2d8d1a\x2ded030be2d900-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6pzgh.mount: Deactivated successfully. Dec 16 12:23:40.068408 systemd[1]: var-lib-kubelet-pods-5e6ae6fc\x2d592f\x2d4274\x2d8d1a\x2ded030be2d900-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 12:23:40.268327 systemd[1]: Removed slice kubepods-besteffort-pod5e6ae6fc_592f_4274_8d1a_ed030be2d900.slice - libcontainer container kubepods-besteffort-pod5e6ae6fc_592f_4274_8d1a_ed030be2d900.slice. Dec 16 12:23:40.285899 kubelet[2852]: I1216 12:23:40.285813 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-29tl8" podStartSLOduration=2.338457172 podStartE2EDuration="21.285798557s" podCreationTimestamp="2025-12-16 12:23:19 +0000 UTC" firstStartedPulling="2025-12-16 12:23:20.148382183 +0000 UTC m=+23.463986910" lastFinishedPulling="2025-12-16 12:23:39.095723568 +0000 UTC m=+42.411328295" observedRunningTime="2025-12-16 12:23:40.285032873 +0000 UTC m=+43.600637600" watchObservedRunningTime="2025-12-16 12:23:40.285798557 +0000 UTC m=+43.601403284" Dec 16 12:23:40.344746 systemd[1]: Created slice kubepods-besteffort-pod3e5d8b5a_70d9_4d0b_a6fa_4d88b20890b8.slice - libcontainer container kubepods-besteffort-pod3e5d8b5a_70d9_4d0b_a6fa_4d88b20890b8.slice. 
Dec 16 12:23:40.497854 kubelet[2852]: I1216 12:23:40.497790 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftv5\" (UniqueName: \"kubernetes.io/projected/3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8-kube-api-access-4ftv5\") pod \"whisker-68755c694b-r2nr2\" (UID: \"3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8\") " pod="calico-system/whisker-68755c694b-r2nr2" Dec 16 12:23:40.497854 kubelet[2852]: I1216 12:23:40.497844 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8-whisker-backend-key-pair\") pod \"whisker-68755c694b-r2nr2\" (UID: \"3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8\") " pod="calico-system/whisker-68755c694b-r2nr2" Dec 16 12:23:40.498036 kubelet[2852]: I1216 12:23:40.497908 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8-whisker-ca-bundle\") pod \"whisker-68755c694b-r2nr2\" (UID: \"3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8\") " pod="calico-system/whisker-68755c694b-r2nr2" Dec 16 12:23:40.649906 containerd[1660]: time="2025-12-16T12:23:40.649782693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68755c694b-r2nr2,Uid:3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:40.851688 systemd-networkd[1523]: cali439f97a6f3b: Link UP Dec 16 12:23:40.851854 systemd-networkd[1523]: cali439f97a6f3b: Gained carrier Dec 16 12:23:40.872433 containerd[1660]: 2025-12-16 12:23:40.673 [INFO][3960] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:23:40.872433 containerd[1660]: 2025-12-16 12:23:40.700 [INFO][3960] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0 whisker-68755c694b- calico-system 3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8 937 0 2025-12-16 12:23:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:68755c694b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d whisker-68755c694b-r2nr2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali439f97a6f3b [] [] }} ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-" Dec 16 12:23:40.872433 containerd[1660]: 2025-12-16 12:23:40.700 [INFO][3960] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.872433 containerd[1660]: 2025-12-16 12:23:40.767 [INFO][4007] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" HandleID="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Workload="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.767 [INFO][4007] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" HandleID="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Workload="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004daa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-6-119dd6897d", "pod":"whisker-68755c694b-r2nr2", "timestamp":"2025-12-16 12:23:40.767565573 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.768 [INFO][4007] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.768 [INFO][4007] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.768 [INFO][4007] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.779 [INFO][4007] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.791 [INFO][4007] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.797 [INFO][4007] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.800 [INFO][4007] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872660 containerd[1660]: 2025-12-16 12:23:40.804 [INFO][4007] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872857 containerd[1660]: 2025-12-16 12:23:40.804 [INFO][4007] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872857 containerd[1660]: 2025-12-16 12:23:40.807 [INFO][4007] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f Dec 16 12:23:40.872857 containerd[1660]: 2025-12-16 12:23:40.813 [INFO][4007] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872857 containerd[1660]: 2025-12-16 12:23:40.823 [INFO][4007] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.129/26] block=192.168.89.128/26 handle="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872857 containerd[1660]: 2025-12-16 12:23:40.823 [INFO][4007] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.129/26] handle="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:40.872857 
containerd[1660]: 2025-12-16 12:23:40.824 [INFO][4007] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:23:40.872857 containerd[1660]: 2025-12-16 12:23:40.824 [INFO][4007] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.129/26] IPv6=[] ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" HandleID="k8s-pod-network.5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Workload="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.873601 containerd[1660]: 2025-12-16 12:23:40.840 [INFO][3960] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0", GenerateName:"whisker-68755c694b-", Namespace:"calico-system", SelfLink:"", UID:"3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68755c694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"whisker-68755c694b-r2nr2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.89.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali439f97a6f3b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:40.873601 containerd[1660]: 2025-12-16 12:23:40.841 [INFO][3960] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.129/32] ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.873699 containerd[1660]: 2025-12-16 12:23:40.841 [INFO][3960] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali439f97a6f3b ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.873699 containerd[1660]: 2025-12-16 12:23:40.851 [INFO][3960] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.873740 containerd[1660]: 2025-12-16 12:23:40.854 [INFO][3960] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0", GenerateName:"whisker-68755c694b-", Namespace:"calico-system", SelfLink:"", UID:"3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68755c694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f", Pod:"whisker-68755c694b-r2nr2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.89.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali439f97a6f3b", MAC:"ae:62:37:75:b9:dd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:40.873789 containerd[1660]: 2025-12-16 12:23:40.867 [INFO][3960] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" Namespace="calico-system" Pod="whisker-68755c694b-r2nr2" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-whisker--68755c694b--r2nr2-eth0" Dec 16 12:23:40.907660 containerd[1660]: time="2025-12-16T12:23:40.907258886Z" level=info msg="connecting to shim 5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f" address="unix:///run/containerd/s/c39801a4671ec0adc0230836f9413fb7a160cf411526f709c95de5d9dcfed942" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:40.947442 systemd[1]: Started cri-containerd-5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f.scope - libcontainer container 5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f. 
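The "connecting to shim" entry above carries a plain unix-domain socket address under /run/containerd/s/ with ttrpc framing on top (protocol=ttrpc version=3). A sketch of the transport step only, using the socket path copied from the log (it exists only on that host); the ttrpc layer itself is omitted, so this is illustrative rather than actual containerd client code:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Socket path taken from the "connecting to shim" log entry above.
	const addr = "/run/containerd/s/c39801a4671ec0adc0230836f9413fb7a160cf411526f709c95de5d9dcfed942"
	conn, err := net.DialTimeout("unix", addr, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to shim socket:", conn.RemoteAddr())
}

Once the dial succeeds, systemd tracks the shim's container as the cri-containerd-<id>.scope unit started in the next entry.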
Dec 16 12:23:40.999803 containerd[1660]: time="2025-12-16T12:23:40.999719037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68755c694b-r2nr2,Uid:3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"5d6056a7b5aaf0a4308f459a5b6e49cb60bb755e7ad2ef6ab8da58a00fefe08f\"" Dec 16 12:23:41.006951 containerd[1660]: time="2025-12-16T12:23:41.006910794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:23:41.133653 kubelet[2852]: I1216 12:23:41.133614 2852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e6ae6fc-592f-4274-8d1a-ed030be2d900" path="/var/lib/kubelet/pods/5e6ae6fc-592f-4274-8d1a-ed030be2d900/volumes" Dec 16 12:23:41.230682 systemd-networkd[1523]: vxlan.calico: Link UP Dec 16 12:23:41.230689 systemd-networkd[1523]: vxlan.calico: Gained carrier Dec 16 12:23:41.382493 containerd[1660]: time="2025-12-16T12:23:41.382413429Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:41.385320 containerd[1660]: time="2025-12-16T12:23:41.385216043Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:23:41.385320 containerd[1660]: time="2025-12-16T12:23:41.385253563Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 12:23:41.385543 kubelet[2852]: E1216 12:23:41.385484 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:23:41.385595 kubelet[2852]: E1216 12:23:41.385557 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:23:41.385778 kubelet[2852]: E1216 12:23:41.385695 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c718beb57b24e69b5023047bed65bf6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:41.388221 containerd[1660]: time="2025-12-16T12:23:41.388156258Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:23:41.748163 containerd[1660]: time="2025-12-16T12:23:41.748053333Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:41.749944 containerd[1660]: time="2025-12-16T12:23:41.749890183Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:23:41.750019 containerd[1660]: time="2025-12-16T12:23:41.749940463Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 12:23:41.750205 kubelet[2852]: E1216 12:23:41.750152 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:23:41.750290 kubelet[2852]: E1216 12:23:41.750232 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:23:41.750408 kubelet[2852]: E1216 12:23:41.750357 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:41.751497 kubelet[2852]: E1216 12:23:41.751463 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:23:42.268501 kubelet[2852]: E1216 12:23:42.268433 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:23:42.719346 systemd-networkd[1523]: cali439f97a6f3b: Gained IPv6LL Dec 16 12:23:43.039362 systemd-networkd[1523]: vxlan.calico: Gained IPv6LL Dec 16 12:23:45.131484 containerd[1660]: time="2025-12-16T12:23:45.131431467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rnf4t,Uid:0507b616-0c85-4998-9127-f71b3f5478c8,Namespace:kube-system,Attempt:0,}" Dec 16 12:23:45.132013 containerd[1660]: time="2025-12-16T12:23:45.131899589Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-gngqq,Uid:93e150fa-0b6c-4bf6-aa66-e7e32f1f3161,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:23:45.257559 systemd-networkd[1523]: calif9bc48423f2: Link UP Dec 16 12:23:45.258437 systemd-networkd[1523]: calif9bc48423f2: Gained carrier Dec 16 12:23:45.279329 containerd[1660]: 2025-12-16 12:23:45.191 [INFO][4322] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0 calico-apiserver-84f977f475- calico-apiserver 93e150fa-0b6c-4bf6-aa66-e7e32f1f3161 875 0 2025-12-16 12:23:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84f977f475 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d calico-apiserver-84f977f475-gngqq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif9bc48423f2 [] [] }} ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-" Dec 16 12:23:45.279329 containerd[1660]: 2025-12-16 12:23:45.191 [INFO][4322] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.279329 containerd[1660]: 2025-12-16 12:23:45.216 [INFO][4347] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" HandleID="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.216 [INFO][4347] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" HandleID="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c28c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-6-119dd6897d", "pod":"calico-apiserver-84f977f475-gngqq", "timestamp":"2025-12-16 12:23:45.216654901 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.217 [INFO][4347] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.217 [INFO][4347] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.217 [INFO][4347] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.227 [INFO][4347] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.232 [INFO][4347] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.237 [INFO][4347] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.239 [INFO][4347] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279529 containerd[1660]: 2025-12-16 12:23:45.241 [INFO][4347] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 12:23:45.241 [INFO][4347] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 12:23:45.243 [INFO][4347] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935 Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 12:23:45.247 [INFO][4347] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 12:23:45.254 [INFO][4347] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.130/26] block=192.168.89.128/26 handle="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 12:23:45.254 [INFO][4347] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.130/26] handle="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 
12:23:45.254 [INFO][4347] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:23:45.279726 containerd[1660]: 2025-12-16 12:23:45.254 [INFO][4347] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.130/26] IPv6=[] ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" HandleID="k8s-pod-network.30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.279861 containerd[1660]: 2025-12-16 12:23:45.255 [INFO][4322] cni-plugin/k8s.go 418: Populated endpoint ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0", GenerateName:"calico-apiserver-84f977f475-", Namespace:"calico-apiserver", SelfLink:"", UID:"93e150fa-0b6c-4bf6-aa66-e7e32f1f3161", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f977f475", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"calico-apiserver-84f977f475-gngqq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9bc48423f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:45.279908 containerd[1660]: 2025-12-16 12:23:45.255 [INFO][4322] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.130/32] ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.279908 containerd[1660]: 2025-12-16 12:23:45.255 [INFO][4322] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9bc48423f2 ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.279908 containerd[1660]: 2025-12-16 12:23:45.257 [INFO][4322] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" 
WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.279983 containerd[1660]: 2025-12-16 12:23:45.260 [INFO][4322] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0", GenerateName:"calico-apiserver-84f977f475-", Namespace:"calico-apiserver", SelfLink:"", UID:"93e150fa-0b6c-4bf6-aa66-e7e32f1f3161", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f977f475", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935", Pod:"calico-apiserver-84f977f475-gngqq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9bc48423f2", MAC:"ae:bd:6a:ea:3e:50", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:45.280032 containerd[1660]: 2025-12-16 12:23:45.277 [INFO][4322] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-gngqq" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--gngqq-eth0" Dec 16 12:23:45.298728 containerd[1660]: time="2025-12-16T12:23:45.298671360Z" level=info msg="connecting to shim 30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935" address="unix:///run/containerd/s/6cbb1b1b12a5e5582560fca05bc80ef5dd11dc41e45f9ad4c0afc32e03dd70b4" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:45.320381 systemd[1]: Started cri-containerd-30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935.scope - libcontainer container 30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935. 
Dec 16 12:23:45.359578 containerd[1660]: time="2025-12-16T12:23:45.359523310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-gngqq,Uid:93e150fa-0b6c-4bf6-aa66-e7e32f1f3161,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"30d9a63feaf5688722b7c60dc75b22d5b309c2fd7ec569e7f70a595f1ac84935\"" Dec 16 12:23:45.362786 containerd[1660]: time="2025-12-16T12:23:45.362749966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:23:45.371307 systemd-networkd[1523]: cali2b3efd3014b: Link UP Dec 16 12:23:45.372443 systemd-networkd[1523]: cali2b3efd3014b: Gained carrier Dec 16 12:23:45.387803 containerd[1660]: 2025-12-16 12:23:45.191 [INFO][4316] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0 coredns-674b8bbfcf- kube-system 0507b616-0c85-4998-9127-f71b3f5478c8 868 0 2025-12-16 12:23:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d coredns-674b8bbfcf-rnf4t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2b3efd3014b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-" Dec 16 12:23:45.387803 containerd[1660]: 2025-12-16 12:23:45.191 [INFO][4316] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.387803 containerd[1660]: 2025-12-16 12:23:45.217 [INFO][4345] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" HandleID="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Workload="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.217 [INFO][4345] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" HandleID="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Workload="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000528b30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-6-119dd6897d", "pod":"coredns-674b8bbfcf-rnf4t", "timestamp":"2025-12-16 12:23:45.217786267 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.217 [INFO][4345] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.254 [INFO][4345] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.255 [INFO][4345] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.332 [INFO][4345] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.337 [INFO][4345] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.342 [INFO][4345] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.345 [INFO][4345] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.387998 containerd[1660]: 2025-12-16 12:23:45.350 [INFO][4345] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.350 [INFO][4345] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.352 [INFO][4345] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33 Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.359 [INFO][4345] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.366 [INFO][4345] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.131/26] block=192.168.89.128/26 handle="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.366 [INFO][4345] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.131/26] handle="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.366 [INFO][4345] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:23:45.388807 containerd[1660]: 2025-12-16 12:23:45.366 [INFO][4345] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.131/26] IPv6=[] ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" HandleID="k8s-pod-network.6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Workload="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.389299 containerd[1660]: 2025-12-16 12:23:45.369 [INFO][4316] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0507b616-0c85-4998-9127-f71b3f5478c8", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"coredns-674b8bbfcf-rnf4t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b3efd3014b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:45.389299 containerd[1660]: 2025-12-16 12:23:45.369 [INFO][4316] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.131/32] ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.389299 containerd[1660]: 2025-12-16 12:23:45.369 [INFO][4316] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b3efd3014b ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.389299 containerd[1660]: 2025-12-16 12:23:45.372 [INFO][4316] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.389299 containerd[1660]: 2025-12-16 12:23:45.373 [INFO][4316] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0507b616-0c85-4998-9127-f71b3f5478c8", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33", Pod:"coredns-674b8bbfcf-rnf4t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b3efd3014b", MAC:"32:2f:e9:2f:43:00", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:45.389299 containerd[1660]: 2025-12-16 12:23:45.384 [INFO][4316] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" Namespace="kube-system" Pod="coredns-674b8bbfcf-rnf4t" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--rnf4t-eth0" Dec 16 12:23:45.420282 containerd[1660]: time="2025-12-16T12:23:45.420220220Z" level=info msg="connecting to shim 6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33" address="unix:///run/containerd/s/3d7491b73aed4daca9ba8c2c7b6d830739a0565a1207f89e500d04ab61b53f8c" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:45.449369 systemd[1]: Started cri-containerd-6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33.scope - libcontainer container 6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33. 
Dec 16 12:23:45.491023 containerd[1660]: time="2025-12-16T12:23:45.490966100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-rnf4t,Uid:0507b616-0c85-4998-9127-f71b3f5478c8,Namespace:kube-system,Attempt:0,} returns sandbox id \"6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33\"" Dec 16 12:23:45.496829 containerd[1660]: time="2025-12-16T12:23:45.496794850Z" level=info msg="CreateContainer within sandbox \"6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:23:45.505985 containerd[1660]: time="2025-12-16T12:23:45.505858856Z" level=info msg="Container 5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:45.514459 containerd[1660]: time="2025-12-16T12:23:45.514394340Z" level=info msg="CreateContainer within sandbox \"6725b261a1fda74adbc174193f20eaf16164e6a96f8431713d5595ad20eb0d33\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98\"" Dec 16 12:23:45.514903 containerd[1660]: time="2025-12-16T12:23:45.514876822Z" level=info msg="StartContainer for \"5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98\"" Dec 16 12:23:45.516173 containerd[1660]: time="2025-12-16T12:23:45.516146869Z" level=info msg="connecting to shim 5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98" address="unix:///run/containerd/s/3d7491b73aed4daca9ba8c2c7b6d830739a0565a1207f89e500d04ab61b53f8c" protocol=ttrpc version=3 Dec 16 12:23:45.534410 systemd[1]: Started cri-containerd-5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98.scope - libcontainer container 5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98. 
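
[These CreateContainer / StartContainer / "connecting to shim ... protocol=ttrpc" entries are containerd's CRI service at work: the container is created inside the already-running sandbox, and starting it goes through the per-sandbox shim over ttrpc. The CRI code path itself is internal to containerd, but the same create-then-start lifecycle is visible through containerd's public Go client; a sketch, assuming containerd 1.x import paths, an arbitrary busybox image, and root privileges:]

    package main

    import (
        "context"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/cio"
        "github.com/containerd/containerd/namespaces"
        "github.com/containerd/containerd/oci"
    )

    func main() {
        // Connect to containerd's socket, as kubelet's CRI calls do.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // The entries above all carry namespace=k8s.io.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        image, err := client.Pull(ctx, "docker.io/library/busybox:latest", containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }

        // CreateContainer: metadata plus a runtime spec derived from the image.
        container, err := client.NewContainer(ctx, "example",
            containerd.WithImage(image),
            containerd.WithNewSnapshot("example-snapshot", image),
            containerd.WithNewSpec(oci.WithImageConfig(image)),
        )
        if err != nil {
            log.Fatal(err)
        }
        defer container.Delete(ctx, containerd.WithSnapshotCleanup)

        // StartContainer: creating the task is the step that talks to the
        // shim over ttrpc, as the "connecting to shim" messages record.
        task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
        if err != nil {
            log.Fatal(err)
        }
        defer task.Delete(ctx)

        if err := task.Start(ctx); err != nil {
            log.Fatal(err)
        }
    }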
Dec 16 12:23:45.561522 containerd[1660]: time="2025-12-16T12:23:45.561479940Z" level=info msg="StartContainer for \"5c0a59d76a2e6e9568d81fa2391d2df1e461a48edc2a7f6498459226afc14f98\" returns successfully" Dec 16 12:23:45.701741 containerd[1660]: time="2025-12-16T12:23:45.701608054Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:45.703730 containerd[1660]: time="2025-12-16T12:23:45.703663465Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:23:45.703807 containerd[1660]: time="2025-12-16T12:23:45.703737585Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:23:45.704030 kubelet[2852]: E1216 12:23:45.703983 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:23:45.704470 kubelet[2852]: E1216 12:23:45.704037 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:23:45.704470 kubelet[2852]: E1216 12:23:45.704162 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwpv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:45.705393 kubelet[2852]: E1216 12:23:45.705341 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:23:46.278158 kubelet[2852]: E1216 12:23:46.278076 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:23:46.305868 kubelet[2852]: I1216 12:23:46.305790 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-rnf4t" podStartSLOduration=43.305771575 podStartE2EDuration="43.305771575s" podCreationTimestamp="2025-12-16 12:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:23:46.290729099 +0000 UTC m=+49.606333826" watchObservedRunningTime="2025-12-16 12:23:46.305771575 +0000 UTC m=+49.621376262" Dec 16 12:23:46.367472 systemd-networkd[1523]: calif9bc48423f2: Gained IPv6LL Dec 16 12:23:46.688495 systemd-networkd[1523]: cali2b3efd3014b: Gained IPv6LL Dec 16 12:23:47.279866 kubelet[2852]: E1216 12:23:47.279755 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed 
to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:23:48.131820 containerd[1660]: time="2025-12-16T12:23:48.131657686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-ns4mb,Uid:533a88a9-9b7b-4454-afaf-2a43ad5efc0e,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:23:48.131820 containerd[1660]: time="2025-12-16T12:23:48.131782287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w78fj,Uid:426e2a8a-5d6e-4966-b145-015ce9ecbfa0,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:48.132579 containerd[1660]: time="2025-12-16T12:23:48.131797807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6957f646c-stmvl,Uid:8d05b967-cc40-4074-a4a8-7c06f72a502c,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:48.132579 containerd[1660]: time="2025-12-16T12:23:48.132379810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-vhc9k,Uid:9ac6813a-66c7-4844-b378-32ad1d5e7849,Namespace:calico-system,Attempt:0,}" Dec 16 12:23:48.284833 systemd-networkd[1523]: califab2c9ece76: Link UP Dec 16 12:23:48.285679 systemd-networkd[1523]: califab2c9ece76: Gained carrier Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.202 [INFO][4527] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0 csi-node-driver- calico-system 426e2a8a-5d6e-4966-b145-015ce9ecbfa0 744 0 2025-12-16 12:23:19 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d csi-node-driver-w78fj eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] califab2c9ece76 [] [] }} ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.202 [INFO][4527] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4579] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" HandleID="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Workload="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4579] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" HandleID="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Workload="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x4000136dd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-6-119dd6897d", "pod":"csi-node-driver-w78fj", "timestamp":"2025-12-16 12:23:48.23426869 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4579] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4579] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4579] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.246 [INFO][4579] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.252 [INFO][4579] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.258 [INFO][4579] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.261 [INFO][4579] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.263 [INFO][4579] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.263 [INFO][4579] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.265 [INFO][4579] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06 Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.269 [INFO][4579] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.276 [INFO][4579] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.132/26] block=192.168.89.128/26 handle="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.276 [INFO][4579] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.132/26] handle="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.276 [INFO][4579] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:23:48.300232 containerd[1660]: 2025-12-16 12:23:48.277 [INFO][4579] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.132/26] IPv6=[] ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" HandleID="k8s-pod-network.3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Workload="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.300748 containerd[1660]: 2025-12-16 12:23:48.280 [INFO][4527] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"426e2a8a-5d6e-4966-b145-015ce9ecbfa0", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"csi-node-driver-w78fj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.89.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califab2c9ece76", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.300748 containerd[1660]: 2025-12-16 12:23:48.280 [INFO][4527] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.132/32] ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.300748 containerd[1660]: 2025-12-16 12:23:48.281 [INFO][4527] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califab2c9ece76 ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.300748 containerd[1660]: 2025-12-16 12:23:48.287 [INFO][4527] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.300748 containerd[1660]: 2025-12-16 12:23:48.287 [INFO][4527] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"426e2a8a-5d6e-4966-b145-015ce9ecbfa0", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06", Pod:"csi-node-driver-w78fj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.89.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califab2c9ece76", MAC:"f6:41:08:4c:32:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.300748 containerd[1660]: 2025-12-16 12:23:48.297 [INFO][4527] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" Namespace="calico-system" Pod="csi-node-driver-w78fj" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-csi--node--driver--w78fj-eth0" Dec 16 12:23:48.322298 containerd[1660]: time="2025-12-16T12:23:48.322247898Z" level=info msg="connecting to shim 3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06" address="unix:///run/containerd/s/f455fd3a5f0d3cd773e11d7942220a308f083ab929321848efb5e9ec74c2f4e6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:48.342491 systemd[1]: Started cri-containerd-3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06.scope - libcontainer container 3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06. 
Dec 16 12:23:48.371912 containerd[1660]: time="2025-12-16T12:23:48.371872071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w78fj,Uid:426e2a8a-5d6e-4966-b145-015ce9ecbfa0,Namespace:calico-system,Attempt:0,} returns sandbox id \"3ce63ef459c8c1ba8854d2f9820a0ac7485b93f2ed12761aa34af6cf65d51c06\"" Dec 16 12:23:48.374642 containerd[1660]: time="2025-12-16T12:23:48.374598405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:23:48.391686 systemd-networkd[1523]: cali015e49f5035: Link UP Dec 16 12:23:48.392930 systemd-networkd[1523]: cali015e49f5035: Gained carrier Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.205 [INFO][4539] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0 calico-kube-controllers-6957f646c- calico-system 8d05b967-cc40-4074-a4a8-7c06f72a502c 873 0 2025-12-16 12:23:20 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6957f646c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d calico-kube-controllers-6957f646c-stmvl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali015e49f5035 [] [] }} ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.205 [INFO][4539] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4581] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" HandleID="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4581] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" HandleID="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2c90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-6-119dd6897d", "pod":"calico-kube-controllers-6957f646c-stmvl", "timestamp":"2025-12-16 12:23:48.23441425 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.234 [INFO][4581] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.276 [INFO][4581] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.277 [INFO][4581] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.347 [INFO][4581] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.354 [INFO][4581] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.360 [INFO][4581] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.364 [INFO][4581] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.366 [INFO][4581] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.367 [INFO][4581] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.369 [INFO][4581] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.376 [INFO][4581] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.382 [INFO][4581] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.133/26] block=192.168.89.128/26 handle="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.383 [INFO][4581] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.133/26] handle="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.383 [INFO][4581] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:23:48.406958 containerd[1660]: 2025-12-16 12:23:48.383 [INFO][4581] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.133/26] IPv6=[] ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" HandleID="k8s-pod-network.24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.408179 containerd[1660]: 2025-12-16 12:23:48.388 [INFO][4539] cni-plugin/k8s.go 418: Populated endpoint ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0", GenerateName:"calico-kube-controllers-6957f646c-", Namespace:"calico-system", SelfLink:"", UID:"8d05b967-cc40-4074-a4a8-7c06f72a502c", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6957f646c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"calico-kube-controllers-6957f646c-stmvl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.89.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali015e49f5035", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.408179 containerd[1660]: 2025-12-16 12:23:48.389 [INFO][4539] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.133/32] ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.408179 containerd[1660]: 2025-12-16 12:23:48.389 [INFO][4539] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali015e49f5035 ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.408179 containerd[1660]: 2025-12-16 12:23:48.393 [INFO][4539] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" 
WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.408179 containerd[1660]: 2025-12-16 12:23:48.393 [INFO][4539] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0", GenerateName:"calico-kube-controllers-6957f646c-", Namespace:"calico-system", SelfLink:"", UID:"8d05b967-cc40-4074-a4a8-7c06f72a502c", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6957f646c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed", Pod:"calico-kube-controllers-6957f646c-stmvl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.89.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali015e49f5035", MAC:"12:48:ac:54:94:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.408179 containerd[1660]: 2025-12-16 12:23:48.404 [INFO][4539] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" Namespace="calico-system" Pod="calico-kube-controllers-6957f646c-stmvl" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--kube--controllers--6957f646c--stmvl-eth0" Dec 16 12:23:48.427504 containerd[1660]: time="2025-12-16T12:23:48.427453795Z" level=info msg="connecting to shim 24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed" address="unix:///run/containerd/s/62db2c2942a5b295af73b7dcc12d370b2f490a458866445550ce2d51e60b789d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:48.453402 systemd[1]: Started cri-containerd-24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed.scope - libcontainer container 24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed. 
Dec 16 12:23:48.487280 systemd-networkd[1523]: cali82f264fd113: Link UP Dec 16 12:23:48.487972 systemd-networkd[1523]: cali82f264fd113: Gained carrier Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.209 [INFO][4521] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0 calico-apiserver-84f977f475- calico-apiserver 533a88a9-9b7b-4454-afaf-2a43ad5efc0e 870 0 2025-12-16 12:23:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84f977f475 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d calico-apiserver-84f977f475-ns4mb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali82f264fd113 [] [] }} ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.209 [INFO][4521] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.249 [INFO][4593] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" HandleID="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.249 [INFO][4593] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" HandleID="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001375b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-6-119dd6897d", "pod":"calico-apiserver-84f977f475-ns4mb", "timestamp":"2025-12-16 12:23:48.249267206 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.249 [INFO][4593] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.383 [INFO][4593] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.383 [INFO][4593] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.447 [INFO][4593] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.454 [INFO][4593] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.460 [INFO][4593] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.462 [INFO][4593] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.465 [INFO][4593] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.465 [INFO][4593] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.467 [INFO][4593] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906 Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.472 [INFO][4593] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.480 [INFO][4593] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.134/26] block=192.168.89.128/26 handle="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.480 [INFO][4593] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.134/26] handle="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.480 [INFO][4593] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:23:48.509463 containerd[1660]: 2025-12-16 12:23:48.481 [INFO][4593] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.134/26] IPv6=[] ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" HandleID="k8s-pod-network.c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Workload="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.510586 containerd[1660]: 2025-12-16 12:23:48.483 [INFO][4521] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0", GenerateName:"calico-apiserver-84f977f475-", Namespace:"calico-apiserver", SelfLink:"", UID:"533a88a9-9b7b-4454-afaf-2a43ad5efc0e", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f977f475", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"calico-apiserver-84f977f475-ns4mb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali82f264fd113", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.510586 containerd[1660]: 2025-12-16 12:23:48.483 [INFO][4521] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.134/32] ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.510586 containerd[1660]: 2025-12-16 12:23:48.483 [INFO][4521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali82f264fd113 ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.510586 containerd[1660]: 2025-12-16 12:23:48.487 [INFO][4521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.510586 containerd[1660]: 2025-12-16 
12:23:48.488 [INFO][4521] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0", GenerateName:"calico-apiserver-84f977f475-", Namespace:"calico-apiserver", SelfLink:"", UID:"533a88a9-9b7b-4454-afaf-2a43ad5efc0e", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f977f475", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906", Pod:"calico-apiserver-84f977f475-ns4mb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.89.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali82f264fd113", MAC:"12:00:98:19:18:5e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.510586 containerd[1660]: 2025-12-16 12:23:48.507 [INFO][4521] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" Namespace="calico-apiserver" Pod="calico-apiserver-84f977f475-ns4mb" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0" Dec 16 12:23:48.513137 containerd[1660]: time="2025-12-16T12:23:48.513079831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6957f646c-stmvl,Uid:8d05b967-cc40-4074-a4a8-7c06f72a502c,Namespace:calico-system,Attempt:0,} returns sandbox id \"24d75f2db00a2c63d21ce890e227078bf710b368040eb22281fe6c07235f12ed\"" Dec 16 12:23:48.540811 containerd[1660]: time="2025-12-16T12:23:48.540767293Z" level=info msg="connecting to shim c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906" address="unix:///run/containerd/s/cffdd79272692344d09e3ea8d5c18f3aae8ad2249eeb487b465160fab595f3d7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:48.568477 systemd[1]: Started cri-containerd-c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906.scope - libcontainer container c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906. 
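
[The endpoint dumps in these entries are literal v3.WorkloadEndpoint values, so the struct shape is visible directly in the log. Reconstructing the ns4mb endpoint in Go with only the fields the log shows populated — the import paths are an assumption (the Calico API types currently live under github.com/projectcalico/api), while the field names and values are taken verbatim from the dump:]

    package main

    import (
        "fmt"

        v3 "github.com/projectcalico/api/pkg/apis/projectcalico/v3"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    func main() {
        // Field values copied from the endpoint dump in the log above.
        wep := v3.WorkloadEndpoint{
            TypeMeta: metav1.TypeMeta{Kind: "WorkloadEndpoint", APIVersion: "projectcalico.org/v3"},
            ObjectMeta: metav1.ObjectMeta{
                Name:      "ci--4459--2--2--6--119dd6897d-k8s-calico--apiserver--84f977f475--ns4mb-eth0",
                Namespace: "calico-apiserver",
            },
            Spec: v3.WorkloadEndpointSpec{
                Orchestrator:  "k8s",
                Node:          "ci-4459-2-2-6-119dd6897d",
                ContainerID:   "c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906",
                Pod:           "calico-apiserver-84f977f475-ns4mb",
                Endpoint:      "eth0",
                InterfaceName: "cali82f264fd113",
                MAC:           "12:00:98:19:18:5e",
                IPNetworks:    []string{"192.168.89.134/32"},
                Profiles:      []string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"},
            },
        }
        fmt.Printf("%+v\n", wep)
    }

[Persisting this object is the "Wrote updated endpoint to datastore" step that closes each CNI ADD in the log.]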
Dec 16 12:23:48.592805 systemd-networkd[1523]: cali0e8d078d509: Link UP Dec 16 12:23:48.593988 systemd-networkd[1523]: cali0e8d078d509: Gained carrier Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.232 [INFO][4562] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0 goldmane-666569f655- calico-system 9ac6813a-66c7-4844-b378-32ad1d5e7849 871 0 2025-12-16 12:23:16 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d goldmane-666569f655-vhc9k eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0e8d078d509 [] [] }} ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.232 [INFO][4562] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.267 [INFO][4605] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" HandleID="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Workload="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.269 [INFO][4605] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" HandleID="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Workload="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003220f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-6-119dd6897d", "pod":"goldmane-666569f655-vhc9k", "timestamp":"2025-12-16 12:23:48.26773062 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.269 [INFO][4605] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.480 [INFO][4605] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.481 [INFO][4605] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.548 [INFO][4605] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.555 [INFO][4605] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.564 [INFO][4605] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.566 [INFO][4605] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.568 [INFO][4605] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.569 [INFO][4605] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.570 [INFO][4605] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.577 [INFO][4605] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.584 [INFO][4605] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.135/26] block=192.168.89.128/26 handle="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.584 [INFO][4605] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.135/26] handle="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.584 [INFO][4605] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
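
The `ipam/ipam.go` lines above trace Calico's host-affine block allocation in order: take the host-wide IPAM lock, look up blocks already affine to this host, load 192.168.89.128/26, claim the next free address under a per-allocation handle, write the block back to the datastore, release the lock. A toy in-memory version of that sequence, as a reading aid only; real Calico does this against its datastore with compare-and-swap writes, and the names here are illustrative:

```go
// Sketch: the block-affinity assignment flow the IPAM log walks through,
// reduced to an in-memory toy.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type block struct {
	cidr      netip.Prefix
	allocated map[netip.Addr]string // addr -> handle
}

var (
	hostLock sync.Mutex // stands in for the "host-wide IPAM lock"
	affine   = map[string]*block{}
)

func autoAssign(host, handle string) (netip.Addr, error) {
	hostLock.Lock()         // "Acquired host-wide IPAM lock"
	defer hostLock.Unlock() // "Released host-wide IPAM lock"

	b, ok := affine[host] // "Trying affinity for 192.168.89.128/26"
	if !ok {
		return netip.Addr{}, fmt.Errorf("no affine block for %s", host)
	}
	// "Attempting to assign 1 addresses from block": scan for a free address.
	for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
		if _, taken := b.allocated[a]; !taken {
			b.allocated[a] = handle // "Writing block in order to claim IPs"
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", b.cidr)
}

func main() {
	affine["ci-4459-2-2-6-119dd6897d"] = &block{
		cidr:      netip.MustParsePrefix("192.168.89.128/26"),
		allocated: map[netip.Addr]string{},
	}
	for i := 0; i < 3; i++ {
		ip, _ := autoAssign("ci-4459-2-2-6-119dd6897d", fmt.Sprintf("handle-%d", i))
		fmt.Println("claimed", ip)
	}
}
```
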
Dec 16 12:23:48.616537 containerd[1660]: 2025-12-16 12:23:48.584 [INFO][4605] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.135/26] IPv6=[] ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" HandleID="k8s-pod-network.7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Workload="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.617704 containerd[1660]: 2025-12-16 12:23:48.589 [INFO][4562] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"9ac6813a-66c7-4844-b378-32ad1d5e7849", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"goldmane-666569f655-vhc9k", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.89.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0e8d078d509", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.617704 containerd[1660]: 2025-12-16 12:23:48.589 [INFO][4562] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.135/32] ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.617704 containerd[1660]: 2025-12-16 12:23:48.589 [INFO][4562] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e8d078d509 ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.617704 containerd[1660]: 2025-12-16 12:23:48.593 [INFO][4562] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.617704 containerd[1660]: 2025-12-16 12:23:48.597 [INFO][4562] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" 
Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"9ac6813a-66c7-4844-b378-32ad1d5e7849", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d", Pod:"goldmane-666569f655-vhc9k", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.89.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0e8d078d509", MAC:"42:42:d4:71:dc:37", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:48.617704 containerd[1660]: 2025-12-16 12:23:48.612 [INFO][4562] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" Namespace="calico-system" Pod="goldmane-666569f655-vhc9k" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-goldmane--666569f655--vhc9k-eth0" Dec 16 12:23:48.633759 containerd[1660]: time="2025-12-16T12:23:48.633716327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f977f475-ns4mb,Uid:533a88a9-9b7b-4454-afaf-2a43ad5efc0e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c4c3ab6c389ed2d7e3515340435748e09d68e086269a7b3be1d19ffb4b93e906\"" Dec 16 12:23:48.647443 containerd[1660]: time="2025-12-16T12:23:48.647320396Z" level=info msg="connecting to shim 7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d" address="unix:///run/containerd/s/f3560b942c6b7a62b95c64a839c87d1654da8e5acbf58c706e8291683a0d745a" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:48.677415 systemd[1]: Started cri-containerd-7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d.scope - libcontainer container 7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d. 
Dec 16 12:23:48.710864 containerd[1660]: time="2025-12-16T12:23:48.710801840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-vhc9k,Uid:9ac6813a-66c7-4844-b378-32ad1d5e7849,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c356357429c0b022e400e1eb715ef464d76ce17e951a2b0513d4e28aa10eb0d\"" Dec 16 12:23:48.713893 containerd[1660]: time="2025-12-16T12:23:48.713856135Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:48.715061 containerd[1660]: time="2025-12-16T12:23:48.715028701Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:23:48.715125 containerd[1660]: time="2025-12-16T12:23:48.715101142Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 12:23:48.715279 kubelet[2852]: E1216 12:23:48.715249 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:23:48.715704 kubelet[2852]: E1216 12:23:48.715479 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:23:48.715731 kubelet[2852]: E1216 12:23:48.715688 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:48.716529 containerd[1660]: time="2025-12-16T12:23:48.716497949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:23:49.034607 containerd[1660]: time="2025-12-16T12:23:49.034395530Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:49.036221 containerd[1660]: time="2025-12-16T12:23:49.036152179Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:23:49.036311 containerd[1660]: time="2025-12-16T12:23:49.036232219Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 12:23:49.036496 kubelet[2852]: E1216 12:23:49.036454 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:23:49.036550 kubelet[2852]: E1216 12:23:49.036522 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:23:49.037146 kubelet[2852]: E1216 12:23:49.036858 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6czrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:49.038219 containerd[1660]: time="2025-12-16T12:23:49.037083344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:23:49.038258 kubelet[2852]: E1216 12:23:49.038061 2852 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:23:49.131101 containerd[1660]: time="2025-12-16T12:23:49.131031263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-kk9ss,Uid:13192563-a3ab-4770-a7c2-f05fdc031eaf,Namespace:kube-system,Attempt:0,}" Dec 16 12:23:49.238835 systemd-networkd[1523]: cali6cb40185b23: Link UP Dec 16 12:23:49.239598 systemd-networkd[1523]: cali6cb40185b23: Gained carrier Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.170 [INFO][4840] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0 coredns-674b8bbfcf- kube-system 13192563-a3ab-4770-a7c2-f05fdc031eaf 869 0 2025-12-16 12:23:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-6-119dd6897d coredns-674b8bbfcf-kk9ss eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6cb40185b23 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.170 [INFO][4840] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.194 [INFO][4849] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" HandleID="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Workload="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.194 [INFO][4849] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" HandleID="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Workload="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004cca0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-6-119dd6897d", "pod":"coredns-674b8bbfcf-kk9ss", "timestamp":"2025-12-16 12:23:49.194302905 +0000 UTC"}, Hostname:"ci-4459-2-2-6-119dd6897d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.194 [INFO][4849] ipam/ipam_plugin.go 377: About to acquire 
host-wide IPAM lock. Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.194 [INFO][4849] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.194 [INFO][4849] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-6-119dd6897d' Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.204 [INFO][4849] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.209 [INFO][4849] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.214 [INFO][4849] ipam/ipam.go 511: Trying affinity for 192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.216 [INFO][4849] ipam/ipam.go 158: Attempting to load block cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.218 [INFO][4849] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.89.128/26 host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.218 [INFO][4849] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.89.128/26 handle="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.219 [INFO][4849] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4 Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.226 [INFO][4849] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.89.128/26 handle="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.233 [INFO][4849] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.89.136/26] block=192.168.89.128/26 handle="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.233 [INFO][4849] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.89.136/26] handle="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" host="ci-4459-2-2-6-119dd6897d" Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.233 [INFO][4849] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
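
All of these pods land in the same affine block, and the arithmetic checks out: a /26 carries 2^(32-26) = 64 addresses, so 192.168.89.128/26 spans .128 through .191, and the sequentially assigned .134, .135, and .136 above all fall inside it. A quick verification with the standard library:

```go
// Quick check of the block arithmetic in the IPAM log above.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.89.128/26")
	fmt.Println("addresses per block:", 1<<(32-block.Bits())) // 64

	for _, s := range []string{"192.168.89.134", "192.168.89.135", "192.168.89.136"} {
		ip := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", ip, block, block.Contains(ip)) // all true
	}
}
```
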
Dec 16 12:23:49.253872 containerd[1660]: 2025-12-16 12:23:49.233 [INFO][4849] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.89.136/26] IPv6=[] ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" HandleID="k8s-pod-network.5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Workload="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.254621 containerd[1660]: 2025-12-16 12:23:49.235 [INFO][4840] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"13192563-a3ab-4770-a7c2-f05fdc031eaf", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"", Pod:"coredns-674b8bbfcf-kk9ss", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6cb40185b23", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:49.254621 containerd[1660]: 2025-12-16 12:23:49.235 [INFO][4840] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.89.136/32] ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.254621 containerd[1660]: 2025-12-16 12:23:49.235 [INFO][4840] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6cb40185b23 ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.254621 containerd[1660]: 2025-12-16 12:23:49.240 [INFO][4840] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.254621 containerd[1660]: 2025-12-16 12:23:49.241 [INFO][4840] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"13192563-a3ab-4770-a7c2-f05fdc031eaf", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 23, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-6-119dd6897d", ContainerID:"5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4", Pod:"coredns-674b8bbfcf-kk9ss", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.89.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6cb40185b23", MAC:"d2:6e:3d:3e:49:21", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:23:49.254621 containerd[1660]: 2025-12-16 12:23:49.251 [INFO][4840] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" Namespace="kube-system" Pod="coredns-674b8bbfcf-kk9ss" WorkloadEndpoint="ci--4459--2--2--6--119dd6897d-k8s-coredns--674b8bbfcf--kk9ss-eth0" Dec 16 12:23:49.276448 containerd[1660]: time="2025-12-16T12:23:49.276393684Z" level=info msg="connecting to shim 5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4" address="unix:///run/containerd/s/9f08c7e19d49927bdaa2dfedf3d5a1f69c6e9e960889b9843ecf5ec9d3f85a5e" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:23:49.288353 kubelet[2852]: E1216 12:23:49.288189 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:23:49.305439 systemd[1]: Started cri-containerd-5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4.scope - libcontainer container 5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4. Dec 16 12:23:49.336672 containerd[1660]: time="2025-12-16T12:23:49.336618431Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-kk9ss,Uid:13192563-a3ab-4770-a7c2-f05fdc031eaf,Namespace:kube-system,Attempt:0,} returns sandbox id \"5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4\"" Dec 16 12:23:49.342441 containerd[1660]: time="2025-12-16T12:23:49.342403421Z" level=info msg="CreateContainer within sandbox \"5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:23:49.353241 containerd[1660]: time="2025-12-16T12:23:49.352699633Z" level=info msg="Container ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:23:49.361579 containerd[1660]: time="2025-12-16T12:23:49.361536918Z" level=info msg="CreateContainer within sandbox \"5bba8e0295f33300c4a4295239704eba4a9e44231456546995eff8acacc074b4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412\"" Dec 16 12:23:49.363669 containerd[1660]: time="2025-12-16T12:23:49.362402763Z" level=info msg="StartContainer for \"ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412\"" Dec 16 12:23:49.363979 containerd[1660]: time="2025-12-16T12:23:49.363947530Z" level=info msg="connecting to shim ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412" address="unix:///run/containerd/s/9f08c7e19d49927bdaa2dfedf3d5a1f69c6e9e960889b9843ecf5ec9d3f85a5e" protocol=ttrpc version=3 Dec 16 12:23:49.381358 systemd[1]: Started cri-containerd-ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412.scope - libcontainer container ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412. 
Dec 16 12:23:49.391352 containerd[1660]: time="2025-12-16T12:23:49.391312510Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:49.393461 containerd[1660]: time="2025-12-16T12:23:49.393415641Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:23:49.393548 containerd[1660]: time="2025-12-16T12:23:49.393509361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:23:49.393714 kubelet[2852]: E1216 12:23:49.393668 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:23:49.393764 kubelet[2852]: E1216 12:23:49.393727 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:23:49.394244 kubelet[2852]: E1216 12:23:49.393986 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:49.394375 containerd[1660]: time="2025-12-16T12:23:49.394242605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:23:49.396053 kubelet[2852]: E1216 12:23:49.395638 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:23:49.411072 containerd[1660]: time="2025-12-16T12:23:49.410965050Z" level=info msg="StartContainer for \"ed9b7399aec5e1c85b46c75cb64ab32610639783a73c6da7a73d2211225de412\" returns successfully" Dec 16 12:23:49.730029 containerd[1660]: time="2025-12-16T12:23:49.729975757Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:49.731845 containerd[1660]: time="2025-12-16T12:23:49.731798126Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:23:49.731919 containerd[1660]: time="2025-12-16T12:23:49.731830486Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 12:23:49.732186 kubelet[2852]: E1216 12:23:49.732027 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:23:49.732186 kubelet[2852]: E1216 12:23:49.732076 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:23:49.732487 kubelet[2852]: E1216 12:23:49.732411 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jv8k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:49.733008 containerd[1660]: time="2025-12-16T12:23:49.732774091Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:23:49.734017 kubelet[2852]: E1216 12:23:49.733964 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:23:49.887481 systemd-networkd[1523]: cali015e49f5035: Gained IPv6LL Dec 16 12:23:49.951565 systemd-networkd[1523]: cali0e8d078d509: Gained IPv6LL Dec 16 12:23:50.143654 systemd-networkd[1523]: califab2c9ece76: Gained IPv6LL Dec 16 12:23:50.260105 containerd[1660]: time="2025-12-16T12:23:50.260026260Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:50.261874 containerd[1660]: time="2025-12-16T12:23:50.261827549Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:23:50.261929 containerd[1660]: time="2025-12-16T12:23:50.261910630Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 12:23:50.262103 kubelet[2852]: E1216 12:23:50.262050 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:23:50.262175 kubelet[2852]: E1216 12:23:50.262108 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:23:50.262591 kubelet[2852]: E1216 12:23:50.262282 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:50.263843 kubelet[2852]: E1216 12:23:50.263782 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:50.293288 kubelet[2852]: E1216 12:23:50.293241 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not 
found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:23:50.293773 kubelet[2852]: E1216 12:23:50.293713 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:23:50.294695 kubelet[2852]: E1216 12:23:50.294642 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:23:50.295359 kubelet[2852]: E1216 12:23:50.295317 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:23:50.362955 kubelet[2852]: I1216 12:23:50.362885 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-kk9ss" podStartSLOduration=47.362867344 podStartE2EDuration="47.362867344s" podCreationTimestamp="2025-12-16 12:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:23:50.362270261 +0000 UTC m=+53.677874988" watchObservedRunningTime="2025-12-16 12:23:50.362867344 +0000 UTC m=+53.678472031" Dec 16 12:23:50.527392 systemd-networkd[1523]: cali82f264fd113: Gained IPv6LL Dec 16 12:23:51.167525 systemd-networkd[1523]: cali6cb40185b23: Gained IPv6LL Dec 16 12:23:53.133162 containerd[1660]: time="2025-12-16T12:23:53.133063591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:23:53.473129 containerd[1660]: time="2025-12-16T12:23:53.472862324Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:23:53.474883 containerd[1660]: time="2025-12-16T12:23:53.474804053Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:23:53.474883 containerd[1660]: time="2025-12-16T12:23:53.474853614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 12:23:53.475461 kubelet[2852]: E1216 12:23:53.475093 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:23:53.475461 kubelet[2852]: E1216 12:23:53.475168 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:23:53.475461 kubelet[2852]: E1216 12:23:53.475337 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c718beb57b24e69b5023047bed65bf6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:53.477621 containerd[1660]: time="2025-12-16T12:23:53.477403707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:23:53.807572 containerd[1660]: time="2025-12-16T12:23:53.807377589Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:23:53.809473 containerd[1660]: time="2025-12-16T12:23:53.809371080Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:23:53.809578 containerd[1660]: time="2025-12-16T12:23:53.809443120Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 12:23:53.809807 kubelet[2852]: E1216 12:23:53.809752 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:23:53.809868 kubelet[2852]: E1216 12:23:53.809813 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:23:53.809972 kubelet[2852]: E1216 12:23:53.809934 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:23:53.811153 kubelet[2852]: E1216 12:23:53.811101 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:24:02.132362 containerd[1660]: time="2025-12-16T12:24:02.132305922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:24:02.460288 containerd[1660]: time="2025-12-16T12:24:02.460144514Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:02.461416 containerd[1660]: time="2025-12-16T12:24:02.461358320Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:24:02.461484 containerd[1660]: time="2025-12-16T12:24:02.461447680Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 12:24:02.461645 kubelet[2852]: E1216 12:24:02.461576 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:24:02.461645 kubelet[2852]: E1216 12:24:02.461636 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:24:02.461956 kubelet[2852]: E1216 12:24:02.461872 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6czrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:02.463160 containerd[1660]: time="2025-12-16T12:24:02.462151644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:24:02.463299 kubelet[2852]: E1216 12:24:02.463248 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" 
podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:24:02.834486 containerd[1660]: time="2025-12-16T12:24:02.834161301Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:02.840022 containerd[1660]: time="2025-12-16T12:24:02.839746609Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:24:02.840022 containerd[1660]: time="2025-12-16T12:24:02.839819170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:24:02.840414 kubelet[2852]: E1216 12:24:02.840041 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:02.840414 kubelet[2852]: E1216 12:24:02.840120 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:02.840414 kubelet[2852]: E1216 12:24:02.840294 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwpv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:02.841929 kubelet[2852]: E1216 12:24:02.841885 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:24:03.132106 containerd[1660]: time="2025-12-16T12:24:03.131873339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:24:03.480439 containerd[1660]: time="2025-12-16T12:24:03.480309836Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:03.481668 containerd[1660]: time="2025-12-16T12:24:03.481614282Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:24:03.481739 containerd[1660]: time="2025-12-16T12:24:03.481622243Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 12:24:03.481923 kubelet[2852]: E1216 12:24:03.481882 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:24:03.482499 kubelet[2852]: E1216 12:24:03.482182 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:24:03.482499 kubelet[2852]: E1216 12:24:03.482366 2852 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jv8k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:03.483862 kubelet[2852]: E1216 12:24:03.483815 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" 
podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:24:04.132991 containerd[1660]: time="2025-12-16T12:24:04.132936004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:24:04.487180 containerd[1660]: time="2025-12-16T12:24:04.486940409Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:04.488309 containerd[1660]: time="2025-12-16T12:24:04.488190015Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:24:04.488309 containerd[1660]: time="2025-12-16T12:24:04.488290696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 12:24:04.488469 kubelet[2852]: E1216 12:24:04.488425 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:24:04.488764 kubelet[2852]: E1216 12:24:04.488479 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:24:04.488764 kubelet[2852]: E1216 12:24:04.488592 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:04.490890 containerd[1660]: time="2025-12-16T12:24:04.490603748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:24:04.874174 containerd[1660]: time="2025-12-16T12:24:04.874113703Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:04.875705 containerd[1660]: time="2025-12-16T12:24:04.875631671Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:24:04.875705 containerd[1660]: time="2025-12-16T12:24:04.875671511Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 12:24:04.875927 kubelet[2852]: E1216 12:24:04.875850 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:24:04.875927 kubelet[2852]: E1216 12:24:04.875919 2852 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:24:04.876130 kubelet[2852]: E1216 12:24:04.876066 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:04.877485 kubelet[2852]: E1216 12:24:04.877421 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:24:05.132749 containerd[1660]: time="2025-12-16T12:24:05.132632022Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:24:05.487027 containerd[1660]: time="2025-12-16T12:24:05.486882868Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:05.488238 containerd[1660]: time="2025-12-16T12:24:05.488171115Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:24:05.488596 containerd[1660]: time="2025-12-16T12:24:05.488249555Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:24:05.488657 kubelet[2852]: E1216 12:24:05.488428 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:05.488657 kubelet[2852]: E1216 12:24:05.488474 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:05.488850 kubelet[2852]: E1216 12:24:05.488616 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:05.489873 kubelet[2852]: E1216 12:24:05.489815 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:24:06.132627 kubelet[2852]: E1216 12:24:06.132535 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:24:16.132051 kubelet[2852]: E1216 12:24:16.131958 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:24:16.132051 kubelet[2852]: E1216 12:24:16.132023 2852 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:24:17.132628 kubelet[2852]: E1216 12:24:17.132549 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:24:18.132245 kubelet[2852]: E1216 12:24:18.132147 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:24:20.132134 kubelet[2852]: E1216 12:24:20.132027 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:24:20.133695 containerd[1660]: time="2025-12-16T12:24:20.133533478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:24:20.467965 containerd[1660]: time="2025-12-16T12:24:20.467751182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:20.470092 containerd[1660]: time="2025-12-16T12:24:20.470023193Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:24:20.470092 containerd[1660]: time="2025-12-16T12:24:20.470070434Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 12:24:20.470417 kubelet[2852]: E1216 12:24:20.470322 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:24:20.470417 kubelet[2852]: E1216 12:24:20.470371 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:24:20.470779 kubelet[2852]: E1216 12:24:20.470483 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c718beb57b24e69b5023047bed65bf6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:20.473352 containerd[1660]: time="2025-12-16T12:24:20.473316250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:24:20.801091 containerd[1660]: time="2025-12-16T12:24:20.800851561Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:20.802498 containerd[1660]: time="2025-12-16T12:24:20.802454249Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:24:20.802622 containerd[1660]: time="2025-12-16T12:24:20.802492249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 12:24:20.802712 kubelet[2852]: E1216 12:24:20.802675 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:24:20.802755 kubelet[2852]: E1216 12:24:20.802723 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:24:20.802873 kubelet[2852]: E1216 12:24:20.802836 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:20.804478 kubelet[2852]: E1216 12:24:20.804402 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:24:29.133248 containerd[1660]: time="2025-12-16T12:24:29.133161570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:24:29.479080 containerd[1660]: time="2025-12-16T12:24:29.478922573Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:29.480500 containerd[1660]: time="2025-12-16T12:24:29.480446501Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:24:29.480655 containerd[1660]: time="2025-12-16T12:24:29.480540182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 12:24:29.480807 kubelet[2852]: E1216 12:24:29.480706 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:24:29.480807 kubelet[2852]: E1216 12:24:29.480794 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:24:29.481305 kubelet[2852]: E1216 12:24:29.481017 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:29.481457 containerd[1660]: time="2025-12-16T12:24:29.481243865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:24:29.813495 containerd[1660]: time="2025-12-16T12:24:29.813072957Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:29.814612 containerd[1660]: time="2025-12-16T12:24:29.814546845Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:24:29.814674 containerd[1660]: time="2025-12-16T12:24:29.814631205Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 12:24:29.816081 kubelet[2852]: E1216 12:24:29.814822 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:24:29.816179 kubelet[2852]: E1216 12:24:29.816092 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:24:29.816644 kubelet[2852]: E1216 12:24:29.816488 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jv8k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:29.816800 containerd[1660]: time="2025-12-16T12:24:29.816730096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:24:29.817824 kubelet[2852]: E1216 12:24:29.817791 2852 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:24:30.140306 containerd[1660]: time="2025-12-16T12:24:30.140212586Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:30.145806 containerd[1660]: time="2025-12-16T12:24:30.145694373Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:24:30.145971 containerd[1660]: time="2025-12-16T12:24:30.145735894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 12:24:30.146328 kubelet[2852]: E1216 12:24:30.146255 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:24:30.146328 kubelet[2852]: E1216 12:24:30.146324 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:24:30.147098 containerd[1660]: time="2025-12-16T12:24:30.147048260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:24:30.147223 kubelet[2852]: E1216 12:24:30.146957 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:30.148566 kubelet[2852]: E1216 12:24:30.148332 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:24:30.485448 containerd[1660]: time="2025-12-16T12:24:30.485177825Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:30.487013 containerd[1660]: time="2025-12-16T12:24:30.486978674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:24:30.487150 containerd[1660]: time="2025-12-16T12:24:30.487021554Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:24:30.487326 kubelet[2852]: E1216 12:24:30.487262 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:30.487939 kubelet[2852]: E1216 12:24:30.487339 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:30.487939 kubelet[2852]: E1216 12:24:30.487638 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwpv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:30.488126 
containerd[1660]: time="2025-12-16T12:24:30.487621677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:24:30.489260 kubelet[2852]: E1216 12:24:30.489222 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:24:30.837427 containerd[1660]: time="2025-12-16T12:24:30.837360381Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:30.840041 containerd[1660]: time="2025-12-16T12:24:30.839976274Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:24:30.840131 containerd[1660]: time="2025-12-16T12:24:30.840071954Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 12:24:30.840331 kubelet[2852]: E1216 12:24:30.840263 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:24:30.840331 kubelet[2852]: E1216 12:24:30.840321 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:24:30.840554 kubelet[2852]: E1216 12:24:30.840453 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6czrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:30.841892 kubelet[2852]: E1216 12:24:30.841830 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:24:31.131997 containerd[1660]: time="2025-12-16T12:24:31.131623521Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:24:31.580889 containerd[1660]: time="2025-12-16T12:24:31.580804172Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:24:31.582284 containerd[1660]: time="2025-12-16T12:24:31.582234579Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:24:31.582365 containerd[1660]: time="2025-12-16T12:24:31.582273979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:24:31.582542 kubelet[2852]: E1216 12:24:31.582497 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:31.582809 kubelet[2852]: E1216 12:24:31.582553 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:24:31.582809 kubelet[2852]: E1216 12:24:31.582679 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:24:31.583922 kubelet[2852]: E1216 12:24:31.583876 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:24:36.131833 kubelet[2852]: E1216 12:24:36.131713 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:24:43.132608 kubelet[2852]: E1216 12:24:43.132268 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:24:43.133812 kubelet[2852]: 
E1216 12:24:43.133752 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:24:43.735367 update_engine[1639]: I20251216 12:24:43.735293 1639 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 16 12:24:43.735367 update_engine[1639]: I20251216 12:24:43.735353 1639 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 16 12:24:43.735755 update_engine[1639]: I20251216 12:24:43.735586 1639 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 16 12:24:43.735949 update_engine[1639]: I20251216 12:24:43.735915 1639 omaha_request_params.cc:62] Current group set to stable Dec 16 12:24:43.736028 update_engine[1639]: I20251216 12:24:43.736008 1639 update_attempter.cc:499] Already updated boot flags. Skipping. Dec 16 12:24:43.736028 update_engine[1639]: I20251216 12:24:43.736021 1639 update_attempter.cc:643] Scheduling an action processor start. Dec 16 12:24:43.736086 update_engine[1639]: I20251216 12:24:43.736037 1639 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 12:24:43.736086 update_engine[1639]: I20251216 12:24:43.736061 1639 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 16 12:24:43.736126 update_engine[1639]: I20251216 12:24:43.736104 1639 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 12:24:43.736126 update_engine[1639]: I20251216 12:24:43.736110 1639 omaha_request_action.cc:272] Request: Dec 16 12:24:43.736126 update_engine[1639]: [Omaha request XML body not captured in this extract] Dec 16 12:24:43.736126 update_engine[1639]: I20251216 12:24:43.736115 1639 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:24:43.738241 locksmithd[1676]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 16 12:24:43.741344 update_engine[1639]: I20251216 12:24:43.741289 1639 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:24:43.742054 update_engine[1639]: I20251216 12:24:43.742004 1639 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 12:24:43.752424 update_engine[1639]: E20251216 12:24:43.752357 1639 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 16 12:24:43.752561 update_engine[1639]: I20251216 12:24:43.752457 1639 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 16 12:24:44.131924 kubelet[2852]: E1216 12:24:44.131873 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:24:44.131924 kubelet[2852]: E1216 12:24:44.131840 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:24:46.132646 kubelet[2852]: E1216 12:24:46.132561 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:24:47.133614 kubelet[2852]: E1216 12:24:47.133537 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:24:53.688699 update_engine[1639]: I20251216 12:24:53.688620 1639 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:24:53.689113 update_engine[1639]: I20251216 12:24:53.688712 1639 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:24:53.689113 
update_engine[1639]: I20251216 12:24:53.689042 1639 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:24:53.695223 update_engine[1639]: E20251216 12:24:53.694687 1639 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 16 12:24:53.695223 update_engine[1639]: I20251216 12:24:53.694778 1639 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 16 12:24:54.132001 kubelet[2852]: E1216 12:24:54.131927 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:24:54.134233 kubelet[2852]: E1216 12:24:54.134046 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:24:55.134395 kubelet[2852]: E1216 12:24:55.134304 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:24:58.132760 kubelet[2852]: E1216 12:24:58.132682 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:24:59.131983 kubelet[2852]: E1216 12:24:59.131912 2852 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:25:00.132963 kubelet[2852]: E1216 12:25:00.132788 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:25:03.690291 update_engine[1639]: I20251216 12:25:03.690187 1639 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:25:03.690291 update_engine[1639]: I20251216 12:25:03.690294 1639 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:25:03.690828 update_engine[1639]: I20251216 12:25:03.690793 1639 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 12:25:03.697843 update_engine[1639]: E20251216 12:25:03.697772 1639 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 16 12:25:03.697962 update_engine[1639]: I20251216 12:25:03.697877 1639 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 16 12:25:07.132833 kubelet[2852]: E1216 12:25:07.132748 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:25:09.132586 kubelet[2852]: E1216 12:25:09.132493 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:25:09.133708 kubelet[2852]: E1216 12:25:09.133546 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:25:11.131855 containerd[1660]: time="2025-12-16T12:25:11.131616737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:25:11.469277 containerd[1660]: time="2025-12-16T12:25:11.469144098Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:11.470514 containerd[1660]: time="2025-12-16T12:25:11.470456145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:25:11.470634 containerd[1660]: time="2025-12-16T12:25:11.470535065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active 
requests=0, bytes read=77" Dec 16 12:25:11.470732 kubelet[2852]: E1216 12:25:11.470693 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:25:11.471025 kubelet[2852]: E1216 12:25:11.470742 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:25:11.471025 kubelet[2852]: E1216 12:25:11.470891 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jv8k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:11.472156 kubelet[2852]: E1216 12:25:11.472096 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:25:12.132288 containerd[1660]: time="2025-12-16T12:25:12.132003918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:25:12.469374 containerd[1660]: time="2025-12-16T12:25:12.469220078Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:12.470570 containerd[1660]: time="2025-12-16T12:25:12.470515925Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:25:12.470647 containerd[1660]: time="2025-12-16T12:25:12.470603605Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 12:25:12.470785 kubelet[2852]: E1216 12:25:12.470731 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:25:12.471086 kubelet[2852]: E1216 12:25:12.470784 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:25:12.471086 kubelet[2852]: E1216 12:25:12.470890 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c718beb57b24e69b5023047bed65bf6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:12.472994 containerd[1660]: time="2025-12-16T12:25:12.472942537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:25:12.827652 containerd[1660]: time="2025-12-16T12:25:12.827590226Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:12.829888 containerd[1660]: time="2025-12-16T12:25:12.829802917Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:25:12.829986 containerd[1660]: time="2025-12-16T12:25:12.829895237Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 12:25:12.830233 kubelet[2852]: E1216 12:25:12.830038 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:25:12.830233 kubelet[2852]: E1216 12:25:12.830097 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:25:12.830769 kubelet[2852]: E1216 12:25:12.830257 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:12.831948 kubelet[2852]: E1216 12:25:12.831885 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:25:13.137126 containerd[1660]: time="2025-12-16T12:25:13.136723842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:25:13.515288 containerd[1660]: time="2025-12-16T12:25:13.514369528Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 
16 12:25:13.517225 containerd[1660]: time="2025-12-16T12:25:13.516483778Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:25:13.517225 containerd[1660]: time="2025-12-16T12:25:13.516573779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:25:13.517362 kubelet[2852]: E1216 12:25:13.516719 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:13.517362 kubelet[2852]: E1216 12:25:13.516765 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:13.517362 kubelet[2852]: E1216 12:25:13.516883 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwpv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:13.518400 kubelet[2852]: E1216 12:25:13.518349 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:25:13.690358 update_engine[1639]: I20251216 12:25:13.690284 1639 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:25:13.690358 update_engine[1639]: I20251216 12:25:13.690366 1639 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:25:13.690723 update_engine[1639]: I20251216 12:25:13.690681 1639 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:25:13.697768 update_engine[1639]: E20251216 12:25:13.697695 1639 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 16 12:25:13.697959 update_engine[1639]: I20251216 12:25:13.697813 1639 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 16 12:25:13.697959 update_engine[1639]: I20251216 12:25:13.697824 1639 omaha_request_action.cc:617] Omaha request response: Dec 16 12:25:13.697959 update_engine[1639]: E20251216 12:25:13.697925 1639 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 16 12:25:13.698036 update_engine[1639]: I20251216 12:25:13.697945 1639 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 16 12:25:13.698036 update_engine[1639]: I20251216 12:25:13.697997 1639 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 12:25:13.698036 update_engine[1639]: I20251216 12:25:13.698006 1639 update_attempter.cc:306] Processing Done. Dec 16 12:25:13.698036 update_engine[1639]: E20251216 12:25:13.698020 1639 update_attempter.cc:619] Update failed. Dec 16 12:25:13.698036 update_engine[1639]: I20251216 12:25:13.698024 1639 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 16 12:25:13.698036 update_engine[1639]: I20251216 12:25:13.698028 1639 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 16 12:25:13.698036 update_engine[1639]: I20251216 12:25:13.698035 1639 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Dec 16 12:25:13.698331 update_engine[1639]: I20251216 12:25:13.698305 1639 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 12:25:13.698604 update_engine[1639]: I20251216 12:25:13.698343 1639 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 12:25:13.698604 update_engine[1639]: I20251216 12:25:13.698598 1639 omaha_request_action.cc:272] Request: Dec 16 12:25:13.698604 update_engine[1639]: [Omaha request XML body not captured in this extract] Dec 16 12:25:13.698790 update_engine[1639]: I20251216 12:25:13.698608 1639 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:25:13.698790 update_engine[1639]: I20251216 12:25:13.698633 1639 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:25:13.699601 update_engine[1639]: I20251216 12:25:13.698920 1639 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:25:13.699690 locksmithd[1676]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 16 12:25:13.704246 update_engine[1639]: E20251216 12:25:13.704208 1639 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704272 1639 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704280 1639 omaha_request_action.cc:617] Omaha request response: Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704286 1639 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704290 1639 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704294 1639 update_attempter.cc:306] Processing Done. Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704299 1639 update_attempter.cc:310] Error event sent. 
Dec 16 12:25:13.704315 update_engine[1639]: I20251216 12:25:13.704305 1639 update_check_scheduler.cc:74] Next update check in 47m6s Dec 16 12:25:13.704804 locksmithd[1676]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 16 12:25:21.133884 containerd[1660]: time="2025-12-16T12:25:21.132429175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:25:21.453067 containerd[1660]: time="2025-12-16T12:25:21.452667768Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:21.454262 containerd[1660]: time="2025-12-16T12:25:21.454135016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:25:21.454262 containerd[1660]: time="2025-12-16T12:25:21.454211656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:25:21.454408 kubelet[2852]: E1216 12:25:21.454360 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:21.454688 kubelet[2852]: E1216 12:25:21.454409 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:21.455322 kubelet[2852]: E1216 12:25:21.455263 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:21.456483 kubelet[2852]: E1216 12:25:21.456442 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:25:22.133245 containerd[1660]: time="2025-12-16T12:25:22.132453675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:25:22.478670 containerd[1660]: time="2025-12-16T12:25:22.478547880Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:22.481793 containerd[1660]: time="2025-12-16T12:25:22.481685336Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:25:22.481793 containerd[1660]: time="2025-12-16T12:25:22.481733016Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 12:25:22.481969 kubelet[2852]: E1216 12:25:22.481908 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:25:22.481969 kubelet[2852]: E1216 12:25:22.481964 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:25:22.482276 kubelet[2852]: E1216 12:25:22.482094 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:22.484932 containerd[1660]: time="2025-12-16T12:25:22.484598111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:25:22.809778 containerd[1660]: time="2025-12-16T12:25:22.809639528Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:22.811324 containerd[1660]: time="2025-12-16T12:25:22.811275296Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:25:22.811399 containerd[1660]: time="2025-12-16T12:25:22.811356097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 12:25:22.812220 kubelet[2852]: E1216 12:25:22.811478 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:25:22.812220 kubelet[2852]: E1216 12:25:22.811529 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:25:22.812220 kubelet[2852]: E1216 12:25:22.811641 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:22.813211 kubelet[2852]: E1216 12:25:22.813132 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:25:24.132561 containerd[1660]: time="2025-12-16T12:25:24.132502634Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:25:24.312514 systemd[1]: Started sshd@7-10.0.23.32:22-139.178.68.195:50352.service - OpenSSH per-connection server daemon (139.178.68.195:50352). Dec 16 12:25:24.458614 containerd[1660]: time="2025-12-16T12:25:24.458437736Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:24.460185 containerd[1660]: time="2025-12-16T12:25:24.460049144Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:25:24.460185 containerd[1660]: time="2025-12-16T12:25:24.460110025Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 12:25:24.460391 kubelet[2852]: E1216 12:25:24.460271 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:25:24.460391 kubelet[2852]: E1216 12:25:24.460324 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:25:24.461219 kubelet[2852]: E1216 12:25:24.460444 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6czrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:24.461682 kubelet[2852]: E1216 12:25:24.461609 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:25:25.406398 sshd[5111]: Accepted publickey for core from 139.178.68.195 port 50352 ssh2: RSA 
SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:25.407822 sshd-session[5111]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:25.412929 systemd-logind[1637]: New session 8 of user core. Dec 16 12:25:25.422419 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 12:25:26.132655 kubelet[2852]: E1216 12:25:26.132594 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:25:26.230002 sshd[5114]: Connection closed by 139.178.68.195 port 50352 Dec 16 12:25:26.230745 sshd-session[5111]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:26.234491 systemd[1]: sshd@7-10.0.23.32:22-139.178.68.195:50352.service: Deactivated successfully. Dec 16 12:25:26.237862 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 12:25:26.238669 systemd-logind[1637]: Session 8 logged out. Waiting for processes to exit. Dec 16 12:25:26.240098 systemd-logind[1637]: Removed session 8. Dec 16 12:25:28.131873 kubelet[2852]: E1216 12:25:28.131822 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:25:28.132441 kubelet[2852]: E1216 12:25:28.132360 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:25:31.398712 systemd[1]: Started sshd@8-10.0.23.32:22-139.178.68.195:57622.service - OpenSSH per-connection server daemon (139.178.68.195:57622). 
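Every pull in this stretch dies the same way: containerd's resolver gets 404 Not Found from ghcr.io for the v3.30.4 tags, and kubelet wraps that into ErrImagePull. The 404 can be reproduced off the node against the registry's manifest endpoint. A sketch, assuming GHCR's standard anonymous token flow for public images; the repository and tag come from the log, everything else is the generic OCI distribution protocol:

    # Sketch: reproduce containerd's 404 against the registry API.
    import json
    import urllib.error
    import urllib.request

    REPO = "flatcar/calico/apiserver"
    TAG = "v3.30.4"

    # Anonymous pull token for the repository (assumed GHCR token endpoint).
    token_url = f"https://ghcr.io/token?scope=repository:{REPO}:pull"
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]

    # Request the manifest the way a puller would; a missing tag yields
    # HTTP 404, which containerd surfaces as "failed to resolve reference".
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{REPO}/manifests/{TAG}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.oci.image.index.v1+json",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            print("manifest found, HTTP", resp.status)
    except urllib.error.HTTPError as err:
        print("registry returned HTTP", err.code)  # expect 404 here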
Dec 16 12:25:32.393491 sshd[5131]: Accepted publickey for core from 139.178.68.195 port 57622 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:32.394797 sshd-session[5131]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:32.399318 systemd-logind[1637]: New session 9 of user core. Dec 16 12:25:32.409404 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 12:25:33.159409 sshd[5134]: Connection closed by 139.178.68.195 port 57622 Dec 16 12:25:33.159929 sshd-session[5131]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:33.163631 systemd[1]: sshd@8-10.0.23.32:22-139.178.68.195:57622.service: Deactivated successfully. Dec 16 12:25:33.165516 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:25:33.166264 systemd-logind[1637]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:25:33.167442 systemd-logind[1637]: Removed session 9. Dec 16 12:25:33.343431 systemd[1]: Started sshd@9-10.0.23.32:22-139.178.68.195:57634.service - OpenSSH per-connection server daemon (139.178.68.195:57634). Dec 16 12:25:34.380299 sshd[5149]: Accepted publickey for core from 139.178.68.195 port 57634 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:34.381849 sshd-session[5149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:34.389572 systemd-logind[1637]: New session 10 of user core. Dec 16 12:25:34.400446 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 12:25:35.131572 kubelet[2852]: E1216 12:25:35.131521 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:25:35.183155 sshd[5154]: Connection closed by 139.178.68.195 port 57634 Dec 16 12:25:35.184135 sshd-session[5149]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:35.187688 systemd[1]: sshd@9-10.0.23.32:22-139.178.68.195:57634.service: Deactivated successfully. Dec 16 12:25:35.190079 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:25:35.191698 systemd-logind[1637]: Session 10 logged out. Waiting for processes to exit. Dec 16 12:25:35.192879 systemd-logind[1637]: Removed session 10. Dec 16 12:25:35.350187 systemd[1]: Started sshd@10-10.0.23.32:22-139.178.68.195:57636.service - OpenSSH per-connection server daemon (139.178.68.195:57636). Dec 16 12:25:36.336997 sshd[5165]: Accepted publickey for core from 139.178.68.195 port 57636 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:36.338479 sshd-session[5165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:36.342485 systemd-logind[1637]: New session 11 of user core. Dec 16 12:25:36.350508 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 16 12:25:37.085365 sshd[5168]: Connection closed by 139.178.68.195 port 57636 Dec 16 12:25:37.085977 sshd-session[5165]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:37.090174 systemd[1]: sshd@10-10.0.23.32:22-139.178.68.195:57636.service: Deactivated successfully. Dec 16 12:25:37.092362 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 12:25:37.095758 systemd-logind[1637]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:25:37.097300 systemd-logind[1637]: Removed session 11. Dec 16 12:25:37.133994 kubelet[2852]: E1216 12:25:37.133938 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:25:39.132226 kubelet[2852]: E1216 12:25:39.132145 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:25:39.133762 kubelet[2852]: E1216 12:25:39.133724 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:25:40.131893 kubelet[2852]: E1216 12:25:40.131831 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:25:41.132448 kubelet[2852]: E1216 12:25:41.132385 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:25:42.273965 systemd[1]: Started sshd@11-10.0.23.32:22-139.178.68.195:37780.service - OpenSSH per-connection server daemon (139.178.68.195:37780). Dec 16 12:25:43.345652 sshd[5205]: Accepted publickey for core from 139.178.68.195 port 37780 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:43.347087 sshd-session[5205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:43.351257 systemd-logind[1637]: New session 12 of user core. Dec 16 12:25:43.362405 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 16 12:25:44.136725 sshd[5212]: Connection closed by 139.178.68.195 port 37780 Dec 16 12:25:44.137155 sshd-session[5205]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:44.142268 systemd[1]: sshd@11-10.0.23.32:22-139.178.68.195:37780.service: Deactivated successfully. Dec 16 12:25:44.146920 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:25:44.150584 systemd-logind[1637]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:25:44.151708 systemd-logind[1637]: Removed session 12. Dec 16 12:25:48.131363 kubelet[2852]: E1216 12:25:48.131322 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:25:49.309937 systemd[1]: Started sshd@12-10.0.23.32:22-139.178.68.195:37790.service - OpenSSH per-connection server daemon (139.178.68.195:37790). 
Dec 16 12:25:50.132177 kubelet[2852]: E1216 12:25:50.132115 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:25:50.348032 sshd[5227]: Accepted publickey for core from 139.178.68.195 port 37790 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:50.349532 sshd-session[5227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:50.353392 systemd-logind[1637]: New session 13 of user core. Dec 16 12:25:50.359387 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 12:25:51.126332 sshd[5230]: Connection closed by 139.178.68.195 port 37790 Dec 16 12:25:51.126295 sshd-session[5227]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:51.130440 systemd-logind[1637]: Session 13 logged out. Waiting for processes to exit. Dec 16 12:25:51.130691 systemd[1]: sshd@12-10.0.23.32:22-139.178.68.195:37790.service: Deactivated successfully. Dec 16 12:25:51.132253 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:25:51.134235 systemd-logind[1637]: Removed session 13. Dec 16 12:25:51.305238 systemd[1]: Started sshd@13-10.0.23.32:22-139.178.68.195:53070.service - OpenSSH per-connection server daemon (139.178.68.195:53070). 
Dec 16 12:25:52.132833 kubelet[2852]: E1216 12:25:52.132771 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:25:52.343281 sshd[5244]: Accepted publickey for core from 139.178.68.195 port 53070 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:52.346779 sshd-session[5244]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:52.353043 systemd-logind[1637]: New session 14 of user core. Dec 16 12:25:52.357430 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 12:25:53.132417 kubelet[2852]: E1216 12:25:53.132368 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:25:53.132612 kubelet[2852]: E1216 12:25:53.132436 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:25:53.165666 sshd[5247]: Connection closed by 139.178.68.195 port 53070 Dec 16 12:25:53.166376 sshd-session[5244]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:53.170245 systemd-logind[1637]: Session 14 logged out. Waiting for processes to exit. Dec 16 12:25:53.170494 systemd[1]: sshd@13-10.0.23.32:22-139.178.68.195:53070.service: Deactivated successfully. Dec 16 12:25:53.173015 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 12:25:53.176629 systemd-logind[1637]: Removed session 14. Dec 16 12:25:53.352930 systemd[1]: Started sshd@14-10.0.23.32:22-139.178.68.195:53078.service - OpenSSH per-connection server daemon (139.178.68.195:53078). 
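With several images failing across two namespaces, it helps to count the distinct references a dump like this mentions rather than reading each backoff record. A small sketch; the journal.log filename is a placeholder for wherever this output was saved:

    # Sketch: tally the ghcr.io image references appearing in the log.
    import re
    from collections import Counter

    # Matches name:tag references such as ghcr.io/flatcar/calico/csi:v3.30.4;
    # stops at quotes and escape backslashes in the journal text.
    ref = re.compile(r"ghcr\.io/[\w./-]+:[\w.-]+")

    counts = Counter()
    with open("journal.log", encoding="utf-8") as fh:
        for line in fh:
            counts.update(ref.findall(line))

    for image, n in counts.most_common():
        print(f"{n:4d}  {image}")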
Dec 16 12:25:54.133224 kubelet[2852]: E1216 12:25:54.133047 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:25:54.430177 sshd[5259]: Accepted publickey for core from 139.178.68.195 port 53078 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:54.431421 sshd-session[5259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:54.435615 systemd-logind[1637]: New session 15 of user core. Dec 16 12:25:54.444352 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 12:25:55.819608 sshd[5262]: Connection closed by 139.178.68.195 port 53078 Dec 16 12:25:55.819935 sshd-session[5259]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:55.823866 systemd[1]: sshd@14-10.0.23.32:22-139.178.68.195:53078.service: Deactivated successfully. Dec 16 12:25:55.825759 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:25:55.826595 systemd-logind[1637]: Session 15 logged out. Waiting for processes to exit. Dec 16 12:25:55.827940 systemd-logind[1637]: Removed session 15. Dec 16 12:25:55.983281 systemd[1]: Started sshd@15-10.0.23.32:22-139.178.68.195:53084.service - OpenSSH per-connection server daemon (139.178.68.195:53084). Dec 16 12:25:56.985044 sshd[5282]: Accepted publickey for core from 139.178.68.195 port 53084 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:56.986636 sshd-session[5282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:56.992863 systemd-logind[1637]: New session 16 of user core. Dec 16 12:25:56.998685 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 12:25:57.864878 sshd[5285]: Connection closed by 139.178.68.195 port 53084 Dec 16 12:25:57.865451 sshd-session[5282]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:57.870516 systemd[1]: sshd@15-10.0.23.32:22-139.178.68.195:53084.service: Deactivated successfully. Dec 16 12:25:57.872421 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:25:57.873519 systemd-logind[1637]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:25:57.874896 systemd-logind[1637]: Removed session 16. Dec 16 12:25:58.048647 systemd[1]: Started sshd@16-10.0.23.32:22-139.178.68.195:53094.service - OpenSSH per-connection server daemon (139.178.68.195:53094). Dec 16 12:25:59.072262 sshd[5298]: Accepted publickey for core from 139.178.68.195 port 53094 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:25:59.073182 sshd-session[5298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:25:59.076987 systemd-logind[1637]: New session 17 of user core. Dec 16 12:25:59.086482 systemd[1]: Started session-17.scope - Session 17 of User core. 
Dec 16 12:25:59.853190 sshd[5301]: Connection closed by 139.178.68.195 port 53094 Dec 16 12:25:59.853588 sshd-session[5298]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:59.857467 systemd[1]: sshd@16-10.0.23.32:22-139.178.68.195:53094.service: Deactivated successfully. Dec 16 12:25:59.860120 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:25:59.862072 systemd-logind[1637]: Session 17 logged out. Waiting for processes to exit. Dec 16 12:25:59.863778 systemd-logind[1637]: Removed session 17. Dec 16 12:26:02.131495 kubelet[2852]: E1216 12:26:02.131435 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:26:02.131920 kubelet[2852]: E1216 12:26:02.131760 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:26:05.036963 systemd[1]: Started sshd@17-10.0.23.32:22-139.178.68.195:44760.service - OpenSSH per-connection server daemon (139.178.68.195:44760). Dec 16 12:26:06.090573 sshd[5318]: Accepted publickey for core from 139.178.68.195 port 44760 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:26:06.093801 sshd-session[5318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:26:06.101712 systemd-logind[1637]: New session 18 of user core. Dec 16 12:26:06.110187 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 16 12:26:06.902120 sshd[5321]: Connection closed by 139.178.68.195 port 44760 Dec 16 12:26:06.902660 sshd-session[5318]: pam_unix(sshd:session): session closed for user core Dec 16 12:26:06.905764 systemd-logind[1637]: Session 18 logged out. Waiting for processes to exit. Dec 16 12:26:06.905892 systemd[1]: sshd@17-10.0.23.32:22-139.178.68.195:44760.service: Deactivated successfully. Dec 16 12:26:06.907860 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 12:26:06.911106 systemd-logind[1637]: Removed session 18. 
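The same failures surface as pod events, which is usually the quicker way to see them per pod than grepping the journal. A sketch using the official kubernetes Python client, assuming it is installed and a reachable kubeconfig; the namespaces are the ones appearing in the records above:

    # Sketch: list the Warning events behind these ErrImagePull records.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    for ns in ("calico-system", "calico-apiserver"):
        for ev in v1.list_namespaced_event(ns).items:
            # "Failed" covers ErrImagePull, "BackOff" covers ImagePullBackOff.
            if ev.type == "Warning" and ev.reason in ("Failed", "BackOff"):
                print(ns, ev.involved_object.name, ev.reason, ev.message)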
Dec 16 12:26:07.133738 kubelet[2852]: E1216 12:26:07.133683 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:26:07.135226 kubelet[2852]: E1216 12:26:07.135153 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:26:08.132032 kubelet[2852]: E1216 12:26:08.131891 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:26:09.131645 kubelet[2852]: E1216 12:26:09.131478 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:26:12.091160 systemd[1]: Started sshd@18-10.0.23.32:22-139.178.68.195:55992.service - OpenSSH per-connection server daemon (139.178.68.195:55992). 
Dec 16 12:26:13.131435 kubelet[2852]: E1216 12:26:13.131360 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:26:13.132391 kubelet[2852]: E1216 12:26:13.132349 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:26:13.139476 sshd[5358]: Accepted publickey for core from 139.178.68.195 port 55992 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:26:13.140866 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:26:13.144837 systemd-logind[1637]: New session 19 of user core. Dec 16 12:26:13.156394 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 12:26:13.910423 sshd[5361]: Connection closed by 139.178.68.195 port 55992 Dec 16 12:26:13.910781 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Dec 16 12:26:13.914170 systemd-logind[1637]: Session 19 logged out. Waiting for processes to exit. Dec 16 12:26:13.914299 systemd[1]: sshd@18-10.0.23.32:22-139.178.68.195:55992.service: Deactivated successfully. Dec 16 12:26:13.916708 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 12:26:13.918795 systemd-logind[1637]: Removed session 19. Dec 16 12:26:19.077541 systemd[1]: Started sshd@19-10.0.23.32:22-139.178.68.195:55994.service - OpenSSH per-connection server daemon (139.178.68.195:55994). Dec 16 12:26:20.037670 sshd[5375]: Accepted publickey for core from 139.178.68.195 port 55994 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:26:20.039121 sshd-session[5375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:26:20.043669 systemd-logind[1637]: New session 20 of user core. Dec 16 12:26:20.051419 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 12:26:20.131491 kubelet[2852]: E1216 12:26:20.131441 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:26:20.760710 sshd[5378]: Connection closed by 139.178.68.195 port 55994 Dec 16 12:26:20.761094 sshd-session[5375]: pam_unix(sshd:session): session closed for user core Dec 16 12:26:20.766763 systemd[1]: sshd@19-10.0.23.32:22-139.178.68.195:55994.service: Deactivated successfully. Dec 16 12:26:20.768460 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 12:26:20.769642 systemd-logind[1637]: Session 20 logged out. Waiting for processes to exit. Dec 16 12:26:20.771011 systemd-logind[1637]: Removed session 20. Dec 16 12:26:22.134288 kubelet[2852]: E1216 12:26:22.134204 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:26:22.134288 kubelet[2852]: E1216 12:26:22.134257 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:26:22.135363 kubelet[2852]: E1216 12:26:22.135318 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:26:24.131244 kubelet[2852]: 
E1216 12:26:24.131160 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:26:25.926899 systemd[1]: Started sshd@20-10.0.23.32:22-139.178.68.195:56258.service - OpenSSH per-connection server daemon (139.178.68.195:56258). Dec 16 12:26:26.132209 kubelet[2852]: E1216 12:26:26.132129 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:26:26.895091 sshd[5397]: Accepted publickey for core from 139.178.68.195 port 56258 ssh2: RSA SHA256:GTgWIRL4td+QmBoArcwmAKMG8pzuy3h63+cX62+xwEw Dec 16 12:26:26.896819 sshd-session[5397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:26:26.903304 systemd-logind[1637]: New session 21 of user core. Dec 16 12:26:26.911406 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 16 12:26:27.617147 sshd[5400]: Connection closed by 139.178.68.195 port 56258 Dec 16 12:26:27.617782 sshd-session[5397]: pam_unix(sshd:session): session closed for user core Dec 16 12:26:27.622444 systemd[1]: sshd@20-10.0.23.32:22-139.178.68.195:56258.service: Deactivated successfully. Dec 16 12:26:27.624483 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 12:26:27.626220 systemd-logind[1637]: Session 21 logged out. Waiting for processes to exit. Dec 16 12:26:27.629153 systemd-logind[1637]: Removed session 21. 
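From here on the log is dominated by ImagePullBackOff rather than fresh ErrImagePull records: after a failed pull, kubelet does not retry immediately but backs off, doubling the delay per attempt up to a cap, which is why new PullImage attempts (such as the whisker pull at 12:26:35 below) arrive minutes apart. A sketch of that schedule, using kubelet's default base and cap (10s doubling, capped at 5 minutes) as an assumption rather than values read from this log:

    # Sketch: kubelet-style image-pull backoff schedule.
    def backoff_schedule(base=10.0, cap=300.0, attempts=8):
        delay = base
        for _ in range(attempts):
            yield min(delay, cap)  # clamp at the cap
            delay *= 2             # double on each failed attempt

    print(list(backoff_schedule()))
    # -> [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0, 300.0]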
Dec 16 12:26:33.134371 kubelet[2852]: E1216 12:26:33.134304 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:26:35.133902 containerd[1660]: time="2025-12-16T12:26:35.133862338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:26:35.506868 containerd[1660]: time="2025-12-16T12:26:35.506581879Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:35.508412 containerd[1660]: time="2025-12-16T12:26:35.508248447Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:26:35.508412 containerd[1660]: time="2025-12-16T12:26:35.508346528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 12:26:35.508550 kubelet[2852]: E1216 12:26:35.508459 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:26:35.508550 kubelet[2852]: E1216 12:26:35.508511 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:26:35.508825 kubelet[2852]: E1216 12:26:35.508621 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c718beb57b24e69b5023047bed65bf6,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:35.510795 containerd[1660]: time="2025-12-16T12:26:35.510767980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:26:35.870659 containerd[1660]: time="2025-12-16T12:26:35.870602735Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:35.871880 containerd[1660]: time="2025-12-16T12:26:35.871839902Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:26:35.871957 containerd[1660]: time="2025-12-16T12:26:35.871921982Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 12:26:35.872122 kubelet[2852]: E1216 12:26:35.872083 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:26:35.872175 kubelet[2852]: E1216 12:26:35.872135 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:26:35.872305 kubelet[2852]: E1216 12:26:35.872265 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ftv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68755c694b-r2nr2_calico-system(3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:35.873620 kubelet[2852]: E1216 12:26:35.873578 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:26:36.132154 containerd[1660]: time="2025-12-16T12:26:36.132048788Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:26:36.492471 containerd[1660]: time="2025-12-16T12:26:36.492209665Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:26:36.494039 containerd[1660]: time="2025-12-16T12:26:36.493979074Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:26:36.494220 containerd[1660]: time="2025-12-16T12:26:36.494086795Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 12:26:36.494381 kubelet[2852]: E1216 12:26:36.494343 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:26:36.494601 kubelet[2852]: E1216 12:26:36.494481 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:26:36.494922 kubelet[2852]: E1216 12:26:36.494710 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jv8k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-vhc9k_calico-system(9ac6813a-66c7-4844-b378-32ad1d5e7849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:36.495289 containerd[1660]: time="2025-12-16T12:26:36.495224520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:26:36.496452 kubelet[2852]: E1216 12:26:36.496386 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:26:36.837478 containerd[1660]: time="2025-12-16T12:26:36.837432425Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:36.839346 containerd[1660]: time="2025-12-16T12:26:36.839271115Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:26:36.839443 containerd[1660]: time="2025-12-16T12:26:36.839290995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:26:36.840086 kubelet[2852]: E1216 12:26:36.840046 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:36.840514 kubelet[2852]: E1216 12:26:36.840097 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:36.840514 kubelet[2852]: E1216 12:26:36.840237 2852 kuberuntime_manager.go:1358] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwpv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-gngqq_calico-apiserver(93e150fa-0b6c-4bf6-aa66-e7e32f1f3161): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:36.841539 kubelet[2852]: E1216 12:26:36.841482 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:26:37.132607 kubelet[2852]: E1216 12:26:37.132497 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" 
podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:26:39.135359 kubelet[2852]: E1216 12:26:39.135171 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:26:47.132646 containerd[1660]: time="2025-12-16T12:26:47.132562084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:26:47.465919 containerd[1660]: time="2025-12-16T12:26:47.465780104Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:47.467359 containerd[1660]: time="2025-12-16T12:26:47.467304991Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:26:47.467443 containerd[1660]: time="2025-12-16T12:26:47.467384472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 12:26:47.467597 kubelet[2852]: E1216 12:26:47.467549 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:26:47.467899 kubelet[2852]: E1216 12:26:47.467608 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:26:47.467899 kubelet[2852]: E1216 12:26:47.467728 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6czrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6957f646c-stmvl_calico-system(8d05b967-cc40-4074-a4a8-7c06f72a502c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:47.468979 kubelet[2852]: E1216 12:26:47.468932 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:26:48.133213 kubelet[2852]: E1216 12:26:48.133149 2852 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:26:49.131556 kubelet[2852]: E1216 12:26:49.131505 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:26:50.132237 containerd[1660]: time="2025-12-16T12:26:50.132021500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:26:50.445054 containerd[1660]: time="2025-12-16T12:26:50.444709975Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:50.446883 containerd[1660]: time="2025-12-16T12:26:50.446840826Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:26:50.446948 containerd[1660]: time="2025-12-16T12:26:50.446926906Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 12:26:50.447117 kubelet[2852]: E1216 12:26:50.447060 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:50.447117 kubelet[2852]: E1216 12:26:50.447114 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:50.447499 kubelet[2852]: E1216 12:26:50.447264 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9sxqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-84f977f475-ns4mb_calico-apiserver(533a88a9-9b7b-4454-afaf-2a43ad5efc0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:50.448464 kubelet[2852]: E1216 12:26:50.448404 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:26:52.131799 kubelet[2852]: E1216 12:26:52.131713 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:26:52.965230 systemd[1]: 
cri-containerd-b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3.scope: Deactivated successfully. Dec 16 12:26:52.965693 systemd[1]: cri-containerd-b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3.scope: Consumed 5.138s CPU time, 59.3M memory peak. Dec 16 12:26:52.966490 containerd[1660]: time="2025-12-16T12:26:52.966429594Z" level=info msg="received container exit event container_id:\"b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3\" id:\"b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3\" pid:2715 exit_status:1 exited_at:{seconds:1765888012 nanos:966092912}" Dec 16 12:26:52.986913 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3-rootfs.mount: Deactivated successfully. Dec 16 12:26:53.187121 kubelet[2852]: E1216 12:26:53.187078 2852 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.23.32:38308->10.0.23.46:2379: read: connection timed out" Dec 16 12:26:53.191142 systemd[1]: cri-containerd-abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1.scope: Deactivated successfully. Dec 16 12:26:53.191613 systemd[1]: cri-containerd-abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1.scope: Consumed 3.774s CPU time, 24M memory peak. Dec 16 12:26:53.193178 containerd[1660]: time="2025-12-16T12:26:53.193138710Z" level=info msg="received container exit event container_id:\"abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1\" id:\"abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1\" pid:2670 exit_status:1 exited_at:{seconds:1765888013 nanos:192682468}" Dec 16 12:26:53.213392 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1-rootfs.mount: Deactivated successfully. Dec 16 12:26:53.263411 systemd[1]: cri-containerd-6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d.scope: Deactivated successfully. Dec 16 12:26:53.263836 systemd[1]: cri-containerd-6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d.scope: Consumed 35.179s CPU time, 98.4M memory peak. Dec 16 12:26:53.265048 containerd[1660]: time="2025-12-16T12:26:53.264991117Z" level=info msg="received container exit event container_id:\"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\" id:\"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\" pid:3199 exit_status:1 exited_at:{seconds:1765888013 nanos:264629475}" Dec 16 12:26:53.283532 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d-rootfs.mount: Deactivated successfully. 
Dec 16 12:26:53.678236 kubelet[2852]: I1216 12:26:53.677968 2852 scope.go:117] "RemoveContainer" containerID="b437413d9d93521f6fd772aa8d8485640c5e2ba69aac806f214d25d86e2c5cd3" Dec 16 12:26:53.680219 containerd[1660]: time="2025-12-16T12:26:53.679969753Z" level=info msg="CreateContainer within sandbox \"5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 12:26:53.680704 kubelet[2852]: I1216 12:26:53.680668 2852 scope.go:117] "RemoveContainer" containerID="6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d" Dec 16 12:26:53.683855 containerd[1660]: time="2025-12-16T12:26:53.683277970Z" level=info msg="CreateContainer within sandbox \"a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 12:26:53.683937 kubelet[2852]: I1216 12:26:53.683662 2852 scope.go:117] "RemoveContainer" containerID="abd580e237a116090ecd564b4b636524d87daed0f0126a4d19331618ba2b3fd1" Dec 16 12:26:53.685774 containerd[1660]: time="2025-12-16T12:26:53.685744582Z" level=info msg="CreateContainer within sandbox \"b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 16 12:26:53.691619 containerd[1660]: time="2025-12-16T12:26:53.691579772Z" level=info msg="Container 3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:26:53.697140 containerd[1660]: time="2025-12-16T12:26:53.697090080Z" level=info msg="Container 19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:26:53.701640 containerd[1660]: time="2025-12-16T12:26:53.701596623Z" level=info msg="Container d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:26:53.706188 containerd[1660]: time="2025-12-16T12:26:53.706081526Z" level=info msg="CreateContainer within sandbox \"5138b173fde874723a4cb3a20ec2375dd0c6cc5381d3571afe4d052cd82c7cb6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c\"" Dec 16 12:26:53.706690 containerd[1660]: time="2025-12-16T12:26:53.706648409Z" level=info msg="StartContainer for \"3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c\"" Dec 16 12:26:53.707977 containerd[1660]: time="2025-12-16T12:26:53.707927455Z" level=info msg="connecting to shim 3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c" address="unix:///run/containerd/s/70289c6275ec55f2a760333f19ef7bf0e8948ba3662c861b223b67a494da611e" protocol=ttrpc version=3 Dec 16 12:26:53.708861 containerd[1660]: time="2025-12-16T12:26:53.708816820Z" level=info msg="CreateContainer within sandbox \"b84e13ebb363f53e334fbd0183b91f8217f283a78b2d70fa8762192895b955f3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1\"" Dec 16 12:26:53.709231 containerd[1660]: time="2025-12-16T12:26:53.709188142Z" level=info msg="StartContainer for \"d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1\"" Dec 16 12:26:53.710342 containerd[1660]: time="2025-12-16T12:26:53.710272227Z" level=info msg="connecting to shim d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1" 
address="unix:///run/containerd/s/ebcda05243c9a03aabe641d3989fb38b99656500884c4bff4df978158021eab0" protocol=ttrpc version=3 Dec 16 12:26:53.712117 containerd[1660]: time="2025-12-16T12:26:53.712075036Z" level=info msg="CreateContainer within sandbox \"a25034ebee23c47516f3ead7644c59c45091b26a0cc1d841c7deeb695fc6ad61\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49\"" Dec 16 12:26:53.714220 containerd[1660]: time="2025-12-16T12:26:53.713716285Z" level=info msg="StartContainer for \"19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49\"" Dec 16 12:26:53.715011 containerd[1660]: time="2025-12-16T12:26:53.714974651Z" level=info msg="connecting to shim 19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49" address="unix:///run/containerd/s/f7d6197acf7bb0c99701516aa082bef04efda8773f8114603ec7002d9252426f" protocol=ttrpc version=3 Dec 16 12:26:53.732425 systemd[1]: Started cri-containerd-d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1.scope - libcontainer container d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1. Dec 16 12:26:53.737825 systemd[1]: Started cri-containerd-19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49.scope - libcontainer container 19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49. Dec 16 12:26:53.738745 systemd[1]: Started cri-containerd-3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c.scope - libcontainer container 3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c. Dec 16 12:26:53.784436 containerd[1660]: time="2025-12-16T12:26:53.784395245Z" level=info msg="StartContainer for \"d9fc2c42a68b27bbab1fcfb17805070936d9a834cfaf1f2ca83f69bbcd1087a1\" returns successfully" Dec 16 12:26:53.795780 containerd[1660]: time="2025-12-16T12:26:53.795732823Z" level=info msg="StartContainer for \"19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49\" returns successfully" Dec 16 12:26:53.798132 containerd[1660]: time="2025-12-16T12:26:53.798016355Z" level=info msg="StartContainer for \"3e01b4eb000355d075e89a736923b0baa93edb8934435283658ccbdf7f1a792c\" returns successfully" Dec 16 12:26:53.990757 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount237838563.mount: Deactivated successfully. 
Dec 16 12:26:54.132346 containerd[1660]: time="2025-12-16T12:26:54.132263259Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:26:54.273678 kubelet[2852]: I1216 12:26:54.273560 2852 status_manager.go:895] "Failed to get status for pod" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" pod="calico-system/whisker-68755c694b-r2nr2" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.23.32:38228->10.0.23.46:2379: read: connection timed out" Dec 16 12:26:54.273995 kubelet[2852]: E1216 12:26:54.273600 2852 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.23.32:38140->10.0.23.46:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-2-6-119dd6897d.1881b1c9129e26a0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-2-6-119dd6897d,UID:4d42e56df62129c18da75f0f7e0999ca,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-6-119dd6897d,},FirstTimestamp:2025-12-16 12:26:46.0648424 +0000 UTC m=+229.380447127,LastTimestamp:2025-12-16 12:26:46.0648424 +0000 UTC m=+229.380447127,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-6-119dd6897d,}" Dec 16 12:26:54.472183 containerd[1660]: time="2025-12-16T12:26:54.472130472Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:54.473602 containerd[1660]: time="2025-12-16T12:26:54.473551639Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:26:54.473661 containerd[1660]: time="2025-12-16T12:26:54.473643240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 12:26:54.473840 kubelet[2852]: E1216 12:26:54.473793 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:26:54.473897 kubelet[2852]: E1216 12:26:54.473853 2852 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:26:54.474023 kubelet[2852]: E1216 12:26:54.473982 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:54.476168 containerd[1660]: time="2025-12-16T12:26:54.476145773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:26:55.022304 containerd[1660]: time="2025-12-16T12:26:55.022152917Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:55.023923 containerd[1660]: time="2025-12-16T12:26:55.023845206Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:26:55.023923 containerd[1660]: time="2025-12-16T12:26:55.023885686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 12:26:55.024130 kubelet[2852]: E1216 12:26:55.024076 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:26:55.024178 kubelet[2852]: E1216 12:26:55.024132 2852 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:26:55.024308 kubelet[2852]: E1216 12:26:55.024258 2852 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmhdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-w78fj_calico-system(426e2a8a-5d6e-4966-b145-015ce9ecbfa0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:55.025474 kubelet[2852]: E1216 12:26:55.025436 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:26:59.131779 kubelet[2852]: E1216 12:26:59.131615 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:27:03.132214 kubelet[2852]: E1216 12:27:03.132087 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68755c694b-r2nr2" podUID="3e5d8b5a-70d9-4d0b-a6fa-4d88b20890b8" Dec 16 12:27:03.188653 kubelet[2852]: E1216 12:27:03.188570 2852 controller.go:195] "Failed to update lease" err="Put \"https://10.0.23.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-6-119dd6897d?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 12:27:04.131410 kubelet[2852]: E1216 12:27:04.131341 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-vhc9k" podUID="9ac6813a-66c7-4844-b378-32ad1d5e7849" Dec 16 12:27:04.133301 kubelet[2852]: E1216 12:27:04.131578 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-gngqq" podUID="93e150fa-0b6c-4bf6-aa66-e7e32f1f3161" Dec 16 12:27:05.014010 systemd[1]: cri-containerd-19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49.scope: Deactivated successfully. 
Dec 16 12:27:05.014822 containerd[1660]: time="2025-12-16T12:27:05.014783953Z" level=info msg="received container exit event container_id:\"19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49\" id:\"19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49\" pid:5540 exit_status:1 exited_at:{seconds:1765888025 nanos:14308631}" Dec 16 12:27:05.033729 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49-rootfs.mount: Deactivated successfully. Dec 16 12:27:05.131786 kubelet[2852]: E1216 12:27:05.131732 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-84f977f475-ns4mb" podUID="533a88a9-9b7b-4454-afaf-2a43ad5efc0e" Dec 16 12:27:05.715951 kubelet[2852]: I1216 12:27:05.715919 2852 scope.go:117] "RemoveContainer" containerID="6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d" Dec 16 12:27:05.716329 kubelet[2852]: I1216 12:27:05.716248 2852 scope.go:117] "RemoveContainer" containerID="19ef4d17e0b96c2db3191742c32d45a91a2ac02ba5dcb1b9a52a9c82a3b6af49" Dec 16 12:27:05.716426 kubelet[2852]: E1216 12:27:05.716394 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-ttqrb_tigera-operator(e14e5b12-4666-48e8-9acb-a2b529d7ed68)\"" pod="tigera-operator/tigera-operator-7dcd859c48-ttqrb" podUID="e14e5b12-4666-48e8-9acb-a2b529d7ed68" Dec 16 12:27:05.717701 containerd[1660]: time="2025-12-16T12:27:05.717666578Z" level=info msg="RemoveContainer for \"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\"" Dec 16 12:27:05.722421 containerd[1660]: time="2025-12-16T12:27:05.722373202Z" level=info msg="RemoveContainer for \"6861faabb795c92e99091783a59b20a8389de0a22200f12533153cfddb97652d\" returns successfully" Dec 16 12:27:07.132175 kubelet[2852]: E1216 12:27:07.132122 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-w78fj" podUID="426e2a8a-5d6e-4966-b145-015ce9ecbfa0" Dec 16 12:27:10.132091 kubelet[2852]: E1216 12:27:10.132026 2852 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6957f646c-stmvl" podUID="8d05b967-cc40-4074-a4a8-7c06f72a502c" Dec 16 12:27:13.189819 kubelet[2852]: E1216 12:27:13.189755 2852 controller.go:195] "Failed to update lease" err="Put \"https://10.0.23.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-6-119dd6897d?timeout=10s\": context deadline exceeded" Dec 16 12:27:15.825248 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec